{"id":32725,"date":"2018-09-27T20:42:01","date_gmt":"2018-09-27T20:42:01","guid":{"rendered":"http:\/\/content.centerforfinancialinclusion.org\/?p=32725"},"modified":"2023-12-13T14:13:29","modified_gmt":"2023-12-13T18:13:29","slug":"getting-data-right","status":"publish","type":"post","link":"https:\/\/content.centerforfinancialinclusion.org\/getting-data-right\/","title":{"rendered":"Getting Data Right"},"content":{"rendered":"
\u201cHow is that possible? They can\u2019t have it,\u201d Parameshwari shakes her head and waves a bangled hand. \u201cWhat is ours will belong to us. We give someone information because they need it. How can they give that to everyone? Why should everyone have my information?\u201d<\/p>\n
Parameshwari sits next to a sewing machine in a sparsely furnished room in Chennai, India, where she earns less than $10 a day, according to a 2017 research collaboration<\/a>,\u00a0Privacy on the Line<\/em>, between Dalberg, Dvara Research and CGAP. She speaks for many consumers around the globe, equally baffled by claims made in the name of \u201cbig data\u201d (watch the short film version of Privacy on the Line<\/em><\/a> for more consumer voices in India). How is data about individuals collected and shared? Do they have any real say in the matter? Are their governments protecting them?<\/p>\n The current passion for data \u2013 its promises and perils \u2013 is everywhere evident in the press and policy making. There is little new about the practice of companies and governments collecting personal information about their customers and citizens. What is new is the enormous scale on which this information is collected and stored (largely in ways which are invisible to the individual concerned); the speed and accuracy of the information which may emerge from tracking individuals in real-time rather than questioning them later; and the extra insights which may be gained by combining information from numerous sources, both public and private. These are the features which make \u201cbig data\u201d big.<\/p>\n Data issues concern everyone, and not only in their financial lives. This essay focuses on how the collection and use of data for the purpose of providing financial services affects consumers\u2019 financial lives and other parts of their lives, with a particular focus on the most vulnerable customers \u2013 the base of the pyramid in frontier and emerging markets.<\/p>\n","protected":false},"excerpt":{"rendered":" \u201cHow is that possible? They can\u2019t have it,\u201d Parameshwari shakes her head and waves a bangled hand. \u201cWhat is ours will belong to us. We give someone information because they need it. How can they give that to everyone? 
Why should everyone have my information?\u201d Parameshwari sits next to a sewing machine in a sparsely […]<\/p>\n","protected":false},"author":75,"featured_media":32740,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"regions":[],"series":[],"types":[3123],"client":[],"topics":[],"personas":[],"institutional_partnerships":[],"clients":[],"program_teams":[],"acf":{"types":{"term_id":3123,"name":"Blog Post","slug":"blog-post","term_group":0,"term_taxonomy_id":3123,"taxonomy":"types","description":"","parent":0,"count":2202,"filter":"raw"},"header":{"header_type":"post_aligned","post_cover":{"description":""},"post_aligned":{"description":"Opportunities to better deliver services using \"big data\" have grown exponentially the last 10 years, but merely hoping it will not be misused to the detriment of vulnerable customers is naive at best."},"post_default":{"description":"Opportunities to better deliver services using \"big data\" have grown exponentially the last 10 years, but merely hoping big data will not be misused to the detriment of vulnerable customers is naive. 
"}},"authors":[{"ID":32735,"post_author":"75","post_date":"2018-09-27 17:51:20","post_date_gmt":"2018-09-27 17:51:20","post_content":"","post_title":"Katharine Kemp","post_excerpt":"","post_status":"publish","comment_status":"closed","ping_status":"closed","post_password":"","post_name":"katharine-kemp","to_ping":"","pinged":"","post_modified":"2018-10-18 19:31:08","post_modified_gmt":"2018-10-18 19:31:08","post_content_filtered":"","post_parent":0,"guid":"http:\/\/content.centerforfinancialinclusion.org\/?post_type=people&p=32735","menu_order":0,"post_type":"people","post_mime_type":"","comment_count":"0","filter":"raw","featured_image":"https:\/\/content.centerforfinancialinclusion.org\/wp-content\/uploads\/sites\/2\/2018\/09\/KKemp-Profile.jpg","acf":{"title":"Lecturer, Faculty of Law, UNSW Sydney","position":"staff","social_media_links":{"email":"","linkedin":"","twitter":""},"body":" Dr. Katharine Kemp is a Lecturer at the Faculty of Law, UNSW Sydney. Her research focuses on competition law (particularly misuse of market power), consumer protection and data privacy in digital financial services. She has published widely in these fields, including two books and numerous peer-reviewed journal articles. 
She has also practiced as a commercial lawyer in major law firms.<\/p>\n","header":{"header_type":"people_aligned","people_aligned":{"description":""}},"blocks":false,"page_settings":{"":null,"email_sign_up":true,"show_related_content":true,"show_contextual_menu":false,"contextual_menu_cta":null,"replace_global":false,"hide_sticky_share":false,"hide_date_when_featured":false,"is_list_view":false,"premium":false,"preview_image":false,"description":""},"is_author":true},"url":"katharine-kemp"}],"meta_cta":{"download":true,"cta_button_text":"Download","cta_media":"https:\/\/content.centerforfinancialinclusion.org\/wp-content\/uploads\/sites\/2\/2018\/09\/Getting-Data-Right-V2.pdf","cta_url":"","additional_links":false},"blocks":[{"acf_fc_layout":"page_anchor","name":"Enthusiasts vs. Skeptics"},{"acf_fc_layout":"text_block","heading":"Enthusiasts vs. Skeptics","quick_links":false,"heading_label":"","subheader":"","body":" Big data enthusiasts warn that many opportunities to improve lives are wasted when we fail to collect, store and analyse data, or keep information \u201csiloed\u201d, refusing to disclose it to other entities. Big data can reveal a picture of the world previously invisible to human faculties. With the addition of machine learning, computers can decode patterns in this ocean of information, providing answers sometimes even before we know the questions.<\/p>\n These advances may deliver important benefits for financial inclusion. As a first step, data can help identify areas where consumers are not being served by traditional financial services and potentially determine why. Are women not opening or accessing accounts? Are they required to transact through their husbands? Are people from a particular ethnic background disproportionately excluded from insurance?<\/p>\n New ways of using data can help to close identified gaps. 
While traditional financial institutions may see little value in opening their branches in poorly-served rural areas or lending to small businesses with scant credit history, new players have shown a willingness to serve these consumers and bring rivalry to the market, using alternative sources of information.<\/p>\n A lender might, for instance, use customer location data collected by mobile network operators to offer credit to those with no formal credit history. If, for instance, location data reveals a woman travelled to the same marketplace every day from six in the morning until four in the afternoon for the last three years, that can support her claim that she has operated a food stall in that market for this period.<\/p>\n Lenders have used a broad range of mobile phone data for the purposes of alternative credit scoring, including contacts, geolocation data, apps installed, SMS messages and call logs. In Kenya, for example, Branch has used such data to provide uncollateralised credit to a range of underserved customers. Tala, also operating in Kenya, has found that people who make regular calls to their family are 4 percent more likely to repay their loans. This information is based on Tala\u2019s analysis of customers\u2019 call logs and the content of their text messages, for example, the use of the word \u201cmama\u201d.<\/p>\n Credit providers also use bill payment data, social media data (such as the size of the customer\u2019s network), psychometric testing and e-commerce transactions to predict the likelihood that a borrower will make their repayments on time. The CEO of Zest Finance has gone so far as to say that \u201cAll data is credit data\u201d.<\/p>\n Seeking more data may allow providers to better meet the particular needs of a group of consumers who struggle to use conventional products. 
Understanding and tracking the seasonal fluctuations in the income of smallholder farmers, for instance, can allow a lender to tailor repayment terms to those cycles.<\/p>\n Both traditional banks and new players have also analysed large-scale data sets to discern patterns that identify and predict fraud. Acting on this information to reject fraudulent transactions reduces costs to providers and customers alike. In a number of countries, alternative data sources permit financial institutions and fintechs to identify customers previously excluded for lack of formal identity documents.<\/p>\n Many argue that we should also collect data beyond that which is necessary to meet our immediate purposes. While the information may not be needed now, in large enough volumes data itself may reveal trends we had not thought to seek. Some lenders require access to all data from a customer\u2019s mobile phone to make an assessment of their creditworthiness. Kabbage indicates that it collects customer data from social media, checking accounts and e-commerce platforms not just at the point of loan origination, but \u201cthroughout the relationship with the borrower\u201d. The new injunction is, \u201cCollect everything.\u201d<\/p>\n Privacy and data protection advocates have expressed serious concerns about these developments. This should come as little surprise. The underlying approaches of \u201cbig data\u201d and data protection are, in a sense, fundamentally at odds with each other. At the heart of data protection is the limitation principle. Data protection says: Collect and use only that information which is necessary to serve the immediate purpose. 
Big data says: Give us everything you have and we\u2019ll see if we can find a purpose for it.<\/p>\n Is it safe to permit the unrestrained collection and use of data?<\/p>\n"},{"acf_fc_layout":"page_anchor","name":"Harms"},{"acf_fc_layout":"text_block","heading":"Harms","quick_links":false,"heading_label":"","subheader":"","body":" These concerns are not only hypothetical. The collection and use of consumer data by financial services providers has already caused harm in both well-known and relatively obscure instances. Opportunistic providers have exploited customers\u2019 personal information for their own ends with devastating effect. In Kenya, digital lenders published the details of defaulters on Facebook, using public humiliation as a debt collection tactic. In China, lenders have used information about students\u2019 financial hardship to offer loans which are easy to access but include high interest rates and severe penalties for default. There was a spate of suicides among the students when they later defaulted on those loans.<\/p>\n Data changes power in relationships. If a customer surrenders large amounts of personal information to a company, they become vulnerable to numerous forms of intimidation and exploitation which they cannot anticipate or control. An inherently unequal power relationship has become more unequal.<\/p>\n These opportunities for abuse are greater in jurisdictions where there is no general data protection law or responsible lending law, or little effective enforcement of these laws, as is the case in many developing countries. But even with such laws in place the use of personal information often does consumers no favors.<\/p>\n Ryan Calo, Assistant Professor at the University of Washington School of Law, has described the practice of vulnerability-based marketing, which uses personal data to target consumers based on their particular weaknesses. 
But as Calo points out, in the online environment, companies can also engineer moments of vulnerability by designing the timing, context and interface of an online transaction in a way that creates frailty in that particular individual, influencing the consumer to act against their own best interests.<\/p>\n Providers who engage in such exploitative conduct exist, even on the frontiers of financial inclusion. I have heard the representatives of providers on stage at a conference or summit recite the mantra, \u201cWe would rather ask forgiveness than permission.\u201d<\/p>\n I have heard the representatives of providers on stage at a conference or summit recite the mantra, \u201cWe would rather ask forgiveness than permission.\u201d<\/p><\/blockquote>\n But many providers and organizations are genuinely committed to the pursuit of financial inclusion and customer-centric business models, including the fair treatment of customer information.<\/p>\n Unfortunately, even responsible, well-intentioned players can expose their customers to risks. Here it is necessary to consider the risks of unintended harms and risks that arise directly from the collection of personal information.<\/p>\n The consequences of new types of collection and analysis of information enabled by rapid advances in technology are still being discovered. Many champion the use of algorithmic processing and particularly machine learning to gain insights from big data, and a number of the advances outlined earlier were achieved through such processing. However, researchers have also revealed that these algorithms may discriminate, exclude and produce otherwise inaccurate conclusions to the detriment of consumers.<\/p>\n Sometimes these tendencies are built into the program itself as a result of human bias. Algorithms are used to identify \u201chigh value\u201d and \u201clow value\u201d consumers, presenting greatest risk for those who are already vulnerable and disadvantaged. 
Algorithms may also produce results which are plainly wrong when the data being processed is unreliable. This is a major issue in some developing countries where studies have shown that large percentages of the data held are inaccurate, incomplete or out of date.<\/p>\n In other cases, machine learning produces its own discriminatory tendencies. Such flawed analysis cannot always be detected, particularly given increasing reliance on \u201cblack box\u201d algorithms which produce results based on their own form of reasoning, not evident to their creators.<\/p>\n These data practices may bring the comfort of scientific terminology, quantitative analysis, and sharp-edged graphs and tables, but this does not make them immune from embedded bias, error and unjust outcomes.<\/p>\n Acknowledging these risks and harms, some argue that we need only be concerned with the misuse of personal information. Collection alone is innocuous; misuse can be identified and addressed. This approach would permit businesses and governments to harvest personal data at will, unconfined by regulation, then determine at a later date how they might use the information and whether the proposed use is lawful and appropriate. This approach is flawed.<\/p>\n The simple act of collecting and storing an individual\u2019s personal information significantly increases the risk of harm to that person. As soon as we collect and store personal information, we increase the \u201cattack surface\u201d – that is, we increase the opportunities for that information to be hacked, stolen or used without authorization. The more data is stored and the longer it is stored, the greater that risk becomes.<\/p>\n These breaches can cause severe harm. Identity theft can cause a lifetime of expense and exclusion for the individual concerned. And harm is not limited to the individual. 
These events can have drastic consequences for consumer confidence, trust in the company holding their information and trust in financial services more broadly, working in direct opposition to the goals of financial inclusion.<\/p>\n Major breaches of Equifax, Facebook and the US Office of Personnel Management illustrate the reputational harm and losses to corporations and government from data misuse. Bruce Schneier, a security technology expert at the Berkman Center for Internet and Society, Harvard University, has long pointed out that, for the firm holding the information, data can be a \u201ctoxic asset\u201d.<\/p>\n Even projects launched with the best intentions are subject to these risks. Taylor gives the example of the Harvard Signal Program on Human Security and Technology which aimed, in part, to identify forensic evidence of alleged massacres in Sudan with the advantage of unprecedented detail from satellite imagery analysis. However, researchers on the program discovered that hostile actors appeared to be hacking into the Harvard systems and using the project\u2019s data and communications to target their enemy.<\/p>\n Information that companies store about a consumer may also be accessed by governments, which do not always have due regard for the rule of law. In the East and West alike, the media has revealed numerous occasions where governments have required companies to surrender information about individuals in secret and without due process.<\/p>\n We should not forget that in some countries it is illegal to express dissent or criticize the government, to practice a certain religion or engage in homosexual activity. 
Information that seems harmless viewed in isolation \u2013 a person\u2019s transaction history, social media posts or location data \u2013 can reveal highly sensitive information, especially when combined with data from other sources.<\/p>\n The mere collection and storage of information can be profoundly unsafe for the individual concerned.<\/p>\n"},{"acf_fc_layout":"page_anchor","name":"Consent"},{"acf_fc_layout":"text_block","heading":"Consent","quick_links":false,"heading_label":"","subheader":"","body":" How then should we identify the boundaries of fair collection and use? Should we leave the decision to each individual, making data practices dependent upon their consent? One study asked consumers in Uganda whether they would be willing to trade some privacy for access to a loan or a better interest rate. It reported that many consumers were willing to make such a bargain.<\/p>\n Privacy terms have often been viewed as a bargain or trade of this kind. That is, companies propose certain privacy terms to consumers, usually via a privacy notice, and each consumer makes the decision whether to exchange some of their informational privacy for the benefits offered by the company. This approach is said to respect the individual\u2019s freedom and autonomy in making decisions about their informational privacy on the basis of their own privacy preferences.<\/p>\n However, we should be very cautious about drawing conclusions from the fact that consumers living in poverty say that they would give up some of their privacy for money. This is likely to say more about their straitened circumstances and their lack of alternatives than the legitimacy of the supposed bargain.<\/p>\n The nature of privacy policies themselves also frequently prevents consumers from making informed choices, seemingly designed to hide rather than reveal the most relevant or concerning data practices. 
Policies almost universally begin with reassurances about the company\u2019s concern for privacy, their diligence in protecting the customer\u2019s information, as well as obvious, innocuous uses of customer information. More problematic terms come later, phrased in vague, open-ended language which guards the company against the accusation of unauthorised use without enlightening the consumer.<\/p>\n And as CFI Fellow Patrick Traynor of the University of Florida documented<\/a>, the privacy policies of many traditional and digital financial service providers use language more suited to university graduates than ordinary consumers.<\/p>\n These terms breed incomprehension. Even with the benefit of high literacy rates and education levels, in Australia, around one in five consumers<\/a> believes the existence of a privacy policy means that the company will not share their personal information with another company. Further, these policies are generally presented on a take-it-or-leave-it basis, and use \u201cbundled\u201d consents: that is, they do not provide separate options concerning uses of personal data beyond the immediate purpose of the transaction but require consumers to consent to all specified uses or none.<\/p>\n \u201cSigning terms and conditions is not a matter of choice – it\u2019s something that you have to do because you have no choice,\u201d said interviewee Sanjay as quoted in Privacy on the Line.<\/p>\n \u201cSigning terms and conditions is not a matter of choice – it\u2019s something that you have to do because you have no choice.\u201d<\/p><\/blockquote>\n The informed consent \u2013 or \u201cnotice and choice\u201d \u2013 model of privacy regulation is subject to more fundamental criticisms, which go beyond the deficiencies of privacy policies. 
This model was developed in the United States in the 1970s at a time when data practices were entirely different from those of today: collection of information was generally visible and actively involved the individual; the cost and difficulty of processing, storing and transferring data naturally reduced its exposure to misuse; machine learning, online monitoring and mobile phone location data did not exist.<\/p>\n Even if the clarity and usefulness of privacy policies are greatly improved, the nature of new data practices and their consequences will make it extraordinarily difficult for consumers to understand the privacy terms companies offer, let alone their consequences: consumers will lack the information necessary to make a rational choice. The mere process of consumer \u201cconsent\u201d may be meaningless.<\/p>\n"},{"acf_fc_layout":"page_anchor","name":"Prioritizing"},{"acf_fc_layout":"text_block","heading":"Prioritizing","quick_links":false,"heading_label":"","subheader":"","body":" A recent study by Dalberg, Dvara Research and CGAP revealed that, even among some of the world\u2019s poorest, privacy is highly valued and protected to the extent that it will not be exchanged for financial incentives.<\/p>\n \u201cCertain kinds of data are not tradeable. Even if you give me a 100% discount, I won\u2019t share my browsing history,\u201d notes Sushma, a customer in Delhi, in the report.<\/p>\n The Omidyar Network conducted a survey of customers of alternative credit scoring services in Kenya and Colombia which revealed that 82 percent of respondents in fact regard their mobile phone calls and texts as private information, and even more private than medical and financial data.<\/p>\n But is the need for basic financial services more pressing than these sensibilities? 
Where should our priorities lie when there are people in developing countries who have never had the ability to save using a bank account or transfer money digitally to far-away family or insure themselves against misfortune?<\/p>\n The contention that the right to privacy should be subordinated to the economic needs of the poor was considered by the Supreme Court of India last year in the landmark decision of Justice K S Puttaswamy v Union of India<\/em>, where the Court held for the first time that there is a fundamental right to privacy in India. It would be hard to improve on the response of Justice Chandrachud, who delivered the Plurality Opinion:<\/p>\n There is no simple answer to the question of how data can be used safely and fairly to meet financial inclusion objectives.<\/p>\n While there is compelling evidence of potentially very serious risks to vulnerable individuals and groups, we lack certainty about the actual incidence and impact of the relevant harms. It is therefore tempting to suggest that high standards of data protection should be a second-order consideration, a \u201cnice-to-have\u201d which can be addressed once businesses have been persuaded to serve the underserved, free from the burden of extra regulation. Unfortunately, this is a situation where justice delayed would be justice foregone.<\/p>\n Once personal data is collected, disclosed and distributed to numerous parties, there is no retrieving it. The exposure of that data cannot be undone. With current technology, it can be stored and aggregated in perpetuity, throughout the lifetime of the individual concerned. When harms occur, they will undermine customer trust in the very services which might otherwise have improved their lives.<\/p>\n These factors weigh in favour of an approach that deliberately errs on the side of caution and restraint. 
Such an approach requires that we forego some uses and disclosures of personal data until those uses and disclosures can be safely made, as explained below.<\/p>\n"},{"acf_fc_layout":"page_anchor","name":"Building Trust and Fairness"},{"acf_fc_layout":"text_block","heading":"Building Trust and Fairness","quick_links":false,"heading_label":"","subheader":"","body":" There have recently been a number of positive global developments in data protection which should interest providers concerned to adopt innovative and responsible data practices. In this part, I outline the unfolding events as well as some proposed best practices we can glean from these developments.<\/p>\n In May 2018, the General Data Protection Regulation (GDPR)<\/strong> came into effect in the European Union (EU). The GDPR is intended to create certainty for business and enhance consumer trust, placing additional obligations on organisations processing data in the EU or about individuals in the EU, including improved standards of consent, a right to erasure and very substantial penalties for infringement.<\/p>\n The GDPR has had ripple effects beyond its actual legal application. For instance, some organisations operating across numerous jurisdictions have seen economies of scale in adopting the same systems of data governance in all jurisdictions and adopted the GDPR requirements as the highest applicable standard.<\/p>\n The implementation of the GDPR has also driven an increase in the development and availability of privacy enhancing technologies<\/strong> (PETs) which allow organisations and their customers to use IT to manage privacy more securely and conveniently. 
The GDPR has also led to a fresh focus on privacy by design<\/strong>, which makes privacy part of the foundational design requirements for systems, rather than a \u201cbolt on\u201d down the track.<\/p>\n Encouragingly, a number of organisations have recognised the importance of privacy to their customers and compete on privacy quality<\/strong>, including by the use of PETs and privacy by design. Apple, for example, has emphasised just-in-time privacy notifications which give users a choice about providing their data at the moment when an app is attempting to collect that data. Other organizations are using PETs such as transparent, user-friendly, easy-to-navigate online privacy policies and consent interfaces to earn their customers\u2019 trust.<\/p>\n Regulatory developments in the EU have also led to growing consensus on data protection<\/strong> regulation outside the EU. This has been driven in part by the fact that a number of countries wish to ensure their standard of data protection regulation is sufficient to obtain an \u201cadequacy assessment\u201d under the GDPR, allowing organizations in those countries to process data about individuals in the EU. Professor Graham Greenleaf, a global privacy expert at the University of New South Wales, believes that a new global standard is actually emerging as a result of the widespread adoption of the standards in line with the Council of Europe data protection Convention 108, a convention which he explains includes many, but not all, of the GDPR requirements \u2013 a \u201cGDPR-lite\u201d.<\/p>\n Drawing on these developments, there are certain data protection principles which providers should be implementing now in the interests of consumers, financial inclusion and their own reputations, whether or not they are currently subject to data protection regulation:<\/p>\n These principles are not only fair to customers and likely to engender greater customer trust and confidence in the provider and financial services. 
They can also help us to be more rigorous in our assessment of the benefits and limitations of big data and to impose a sensible discipline on our use of data, reducing the risks of liability and reputational harm for data collectors. They may even remind us that there are still other ways of understanding our world. As Greenleaf has argued, \u201cWe must avoid the assumption that only a datafied world is understandable and valuable, and that more data is preferable to better data.\u201d<\/p>\n"}],"page_settings":{"":null,"email_sign_up":true,"show_related_content":true,"show_contextual_menu":true,"contextual_menu_cta":"","replace_global":false,"hide_sticky_share":false,"hide_date_when_featured":false,"is_list_view":false,"premium":false,"preview_image":false,"description":""},"related_content":{"cards":[{"ID":46595,"post_author":"87","post_date":"2024-02-21 18:35:21","post_date_gmt":"2024-02-21 22:35:21","post_content":"Big Data Enthusiasts: \u201cThe number one most important thing for any business \u2026 is data.\u201d<\/h2>\n
Big Data Skeptics: \u201cUnless we look to change course in this sector, the risks and dangers to privacy loom large\u201d<\/h2>\n
Harms from the Use of Personal Information<\/h2>\n
Harms from Collection Alone<\/h2>\n
Is consent the answer to responsible use of data?<\/h2>\n
Prioritizing Data Protection in Emerging Markets and Developing Countries<\/h2>\n
\u201cThe refrain that the poor need no civil and political rights and are concerned only with economic well-being has been utilised through history to wreak the most egregious violations of human rights. \u2026The pursuit of happiness is founded upon autonomy and dignity. Both are essential attributes of privacy which makes no distinction between the birth marks of individuals.\u201d<\/em><\/h6>\n
Global Developments and Lessons for Data Protection<\/h2>\n
\n
Within inclusive financial services, data collection and processing occur under the radar of most consumers, and research suggests that achieving truly informed consent <\/span>is often untenable<\/span>. <\/span><\/blockquote>\r\nWe know that for the 1.4 billion people currently outside the formal financial system, <\/span>lack of trust<\/span><\/a> is a major reason why unbanked adults do not use formal financial services. As part of CFI\u2019s <\/span>workstream on responsible data practices<\/span><\/a>,<\/span> we are tackling a particularly thorny area in building trust with consumers: privacy. Within inclusive financial services, data collection and processing occur under the radar of most consumers, and research suggests that achieving truly informed consent <\/span>is often untenable<\/span><\/a>. Unfortunately, privacy-related harms have become all-too-common, with examples of mishandled data and outright sexual harassment, including <\/span>fake graphics used to intimidate female digital borrowers <\/span><\/a>. These types of incidents only further deplete trust in digital finance.\u00a0<\/span>\u00a0<\/span>\r\n\r\nAt CFI, we are focusing on a \u201cby-design\u201d philosophy across our thematic areas of research. Privacy by Design (PbD) specifically aims to enhance how privacy is integrated into systems and provides an alternative to the prevalent compliance-centric approaches to managing privacy issues. 
By emphasizing privacy in the initial design of digital products, privacy is transformed into a core aspect of a system rather than serving as a checkbox for regulatory compliance, as is often the case with legislation such as the European Union\u2019s General Data Protection Regulation (GDPR).\u00a0<\/span>\u00a0<\/span>\r\n\r\nCFI conducted <\/span>a literature review in 2022<\/span><\/a> that found support for Privacy by Design across disciplines including computer science, engineering, and UX\/UI, as well as in academia, but turned up few concrete, successful case studies. To date, PbD has occupied a liminal space \u2014 theoretical and conceptual support is strong, but there are few examples of private sector companies putting PbD into practice. And there are not yet any examples of PbD being used in the inclusive finance sector.\u00a0<\/span>\u00a0<\/span>\r\n\r\nLeveraging by-design thinking, we created the <\/span>Privacy as Product Playbook<\/span><\/a>, a first-of-its-kind knowledge product for the inclusive finance sector that outlines how to integrate responsible data practices and privacy into the design of digital financial products geared for low-income users. The principle is simple: when privacy needs are considered from the beginning of the design process, both the business and the consumer benefit \u2014 it can streamline design processes and eliminate the costs of retooling, and it takes consumer needs and protection into account while ultimately building consumer loyalty. However, in practice, implementing a PbD approach takes time and effort and requires the support of both product managers and leadership teams. <\/span>\u00a0<\/span>\r\n\r\n\r\n\r\nOur hope is that the Privacy as Product Playbook will spur inclusive finance product designers and leaders to adopt PbD thinking in their work. The playbook offers a clear guide for product management teams at digital finance companies. 
As mentioned earlier, a new generation of technologists is crafting the customer experience \u2014 for better or worse \u2014 and product managers have a pivotal influence on these technologists. Product managers must strike a balance between creating value for consumers and for the business, which entails not only meeting users' needs but also ensuring a viable path to commercialization.\u00a0<\/span>\u00a0<\/span>\r\n\r\nTo make the introduction of PbD as approachable as possible, the playbook includes a step-by-step guide on how to include PbD throughout the product development lifecycle, interactive worksheets for teams to implement PbD in their work, practical tips on championing PbD within a company, and common <\/span>privacy traps<\/span> to avoid. <\/span>It emphasizes the importance of crafting a responsible data strategy specifically for the development of new digital financial products and of prioritizing the privacy needs of end users in every important decision. <\/span>The playbook aims to show that privacy can provide crucial benefits to companies and should be seen as a value-add rather than as a compliance cost.\u00a0<\/span>\u00a0<\/span>\r\n\r\nThis playbook represents what we believe is an important thread for the future of responsible finance: in the throes of digitalization, good practices increasingly hinge on good design. Beyond the PbD playbook, our consumer protection workstream <\/span>has explored deceptive design tactics<\/span><\/a> that impair consumer autonomy. 
We are currently conducting an experiment on the impacts of positive friction in digital lending, which involves introducing friction thoughtfully at key decision points in the loan application process to enable suitable and intentional loan selection and usage by consumers.\u00a0<\/span>\u00a0<\/span>\r\n\r\nSo this year, as we celebrate Data Privacy Day, we encourage the inclusive finance community to embrace the importance of responsible design and how it can lead to significant improvements in consumer trust in digital financial services.\u00a0<\/span>\u00a0<\/span>\r\n\r\nThe PbD <\/span>playbook and accompanying brief<\/span><\/a> were a result of CFI\u2019s partnership with PayPal\u2019s Global Privacy Team.<\/span>","post_title":"Privacy by Design for Inclusive Finance: Moving from Liminal Space to Concrete Practice","post_excerpt":"","post_status":"publish","comment_status":"open","ping_status":"open","post_password":"","post_name":"privacy-by-design-for-inclusive-finance-moving-from-liminal-space-to-concrete-practice","to_ping":"","pinged":"","post_modified":"2024-01-30 15:31:20","post_modified_gmt":"2024-01-30 19:31:20","post_content_filtered":"","post_parent":0,"guid":"https:\/\/content.centerforfinancialinclusion.org\/?p=46574","menu_order":0,"post_type":"post","post_mime_type":"","comment_count":"0","filter":"raw","featured_image":"https:\/\/content.centerforfinancialinclusion.org\/wp-content\/uploads\/sites\/2\/2024\/01\/iStock-1324502163.jpg","acf":{"types":{"term_id":3123,"name":"Blog Post","slug":"blog-post","term_group":0,"term_taxonomy_id":3123,"taxonomy":"types","description":"","parent":0,"count":2202,"filter":"raw"},"header":{"header_type":"post_aligned","post_cover":{"description":""},"post_aligned":{"description":"This Data Privacy Day, CFI promotes responsible data practices and privacy through the Privacy as Product Playbook, emphasizing the importance of embedding data privacy at the beginning of the design process to better protect 
consumers."},"post_default":{"description":""}},"authors":[{"ID":26330,"post_author":"1","post_date":"2018-08-20 13:50:31","post_date_gmt":"2018-08-20 13:50:31","post_content":"","post_title":"Alexandra (Alex) Rizzi","post_excerpt":"","post_status":"publish","comment_status":"closed","ping_status":"closed","post_password":"","post_name":"alexandra-alex-rizzi","to_ping":"","pinged":"","post_modified":"2023-05-08 10:13:32","post_modified_gmt":"2023-05-08 14:13:32","post_content_filtered":"","post_parent":0,"guid":"http:\/\/cfi.accion.flywheelsites.com\/people\/alexandra-rizzi\/","menu_order":0,"post_type":"people","post_mime_type":"","comment_count":"0","filter":"raw"}],"meta_cta":{"download":false,"cta_button_text":"","cta_media":false,"cta_url":"","additional_links":false},"blocks":false,"page_settings":{"":null,"email_sign_up":true,"show_related_content":true,"show_contextual_menu":false,"contextual_menu_cta":null,"replace_global":false,"hide_sticky_share":false,"hide_date_when_featured":false,"is_list_view":false,"premium":false,"preview_image":"https:\/\/content.centerforfinancialinclusion.org\/wp-content\/uploads\/sites\/2\/2024\/01\/PbD_Jan-30.png","description":""}},"url":"privacy-by-design-for-inclusive-finance-moving-from-liminal-space-to-concrete-practice"},{"ID":46548,"post_author":"87","post_date":"2024-01-11 13:10:32","post_date_gmt":"2024-01-11 17:10:32","post_content":"Once relegated to the realms of IT departments, cybersecurity is increasingly becoming an integral <\/span>component of responsible finance<\/span><\/a>. This transformation marks a shift in the inclusive finance sector, recognizing that cybersecurity is no longer just about technology, but a critical pillar of consumer trust and a key enabler of safe, equitable financial inclusion for consumers and small businesses.<\/span>\r\n
While the digitalization of finance expands access and opportunities, it also exposes people to heightened financial risks, including fraud and predatory practices.<\/span><\/blockquote>\r\nIn large part due to the widespread adoption of digital financial services, the <\/span>Global Findex 2021<\/span><\/a> highlighted a significant milestone: 76 percent of adults worldwide now have an account at a financial institution. While the digitalization of finance expands access and opportunities, it also exposes people to heightened financial risks, including fraud and predatory practices. <\/span>Research<\/span><\/a> shows an escalation in consumer protection risks accompanying the rise in digitalization, including cyber threats. It is crucial for businesses and policymakers committed to inclusive finance to acknowledge the immediate need for robust cybersecurity measures.<\/span>\r\n\r\nCFI\u2019s approach to consumer protection focuses on reducing consumer vulnerabilities whenever they arise. These vulnerabilities can be classified using a <\/span>vulnerability framework<\/span><\/a>, which helps identify who is least equipped to deal with a shock. Cyber threats, due to their nascent and dynamic nature, can create a complex intersection of vulnerabilities, affecting those who are most at risk. In this context, cybersecurity measures implemented by institutions and awareness-raising among users are equally important steps to safeguard individuals and institutions and prevent systemic risks.<\/span>\r\n
Risks of <\/span>Cyber Threat<\/span>s in Inclusive Finance<\/span><\/span><\/h1>\r\nThe landscape of cyber threats has grown substantially more complex. We\u2019re now facing a wide array of dangers, including <\/span>ransomware<\/span><\/a>, <\/span>phishing<\/span><\/a> and its derivatives like <\/span>smishing<\/span><\/a>, <\/span>vishing<\/span><\/a>, and <\/span>qshing<\/span><\/a>, <\/span>Distributed Denial of Service (DDoS)<\/span><\/a> attacks, and supply chain infiltrations. These threats expose critical vulnerabilities that can disrupt operations, undermine the integrity of financial institutions, compromise sensitive data, and significantly erode consumer trust. Such risks not only threaten security but also contribute to driving consumers away from the digital economy.<\/span>\r\n\r\nThe increasing prevalence and sophistication of cyber threats are highlighted by a series of high-profile incidents across the globe. <\/span>In 2016,<\/span> in an <\/span>attack on the Central Bank of Bangladesh<\/span><\/a>, cyber-threat actors attempted to steal nearly $1 billion from a Federal Reserve Bank of New York account that belonged to the Bangladeshi central bank; however, the attack was thwarted and losses were largely minimized. Years later, in August 2023, <\/span>the Central Bank of Bangladesh received threats<\/span><\/a> that caused it to halt several internal online services to prevent another potential cyberattack. A DDoS attack targeting a Ukrainian investment company<\/a> led to severe disruptions in website connectivity, and the cyber incidents plaguing Uganda's largest mobile money networks, MTN and Airtel, resulted in a crippling four-day halt in service transactions.<\/span>\r\n
Cyber threats expose critical vulnerabilities that can disrupt operations, undermine the integrity of financial institutions, compromise sensitive data, and significantly erode consumer trust.<\/span><\/blockquote>\r\nIn addition to high-profile cases, many cyber threats targeting fintechs and small financial institutions go unnoticed in the media, but their impact is significant. <\/span>For instance, in Africa, the financial sector is increasingly recognizing cybercrime as a <\/span>major risk<\/span><\/a>. In a notable case, the Bluebottle cybercrime group's targeted attacks against financial institutions in <\/span>Francophone African countries<\/span><\/a> have caused financial losses totaling millions over four years, using methods that are accessible and less sophisticated. In Latin America, the situation is equally concerning. The region saw an estimated 137 billion <\/span>cyber attack attempts<\/span><\/a> in just the first half of 2022, with ransomware being a prevalent threat. SMEs often lack comprehensive security measures and have become prime targets as they digitalize.<\/span>\r\n\r\nThe diversity of cybersecurity incidents and their associated risks\u2013 including fraud, data misuse, transparency deficits, and a lack of resilience mechanisms \u2013 have a direct, negative impact on inclusive finance efforts, affecting both financial institutions and consumers. Findings from the <\/span>Global COVID-19 FinTech Market Rapid Assessment Study<\/span><\/a> indicate a rapid escalation in the perception of cyber risks across surveyed financial products which erodes trust, an already <\/span>scarce feature of financial services<\/span><\/a>. Data breaches are taking <\/span>longer to identify and contain<\/span><\/a>, and are only likely to increase with <\/span>AI-powered attacks<\/span><\/a>. Furthermore, low digital literacy among users leads malicious actors to take advantage of consumers. 
<\/span>Rural DFS users<\/span><\/a> have been systematically targeted with deceptive calls and messages that coerce them into transferring funds to cover false overpayments. <\/span>In Kenya, the rise of mobile banking has significantly increased the number of <\/span>fraudulent<\/span><\/a> actors and cyber criminals since 2016.<\/span>\r\n\r\nThese increased attacks on lower socioeconomic groups living in rural areas are largely due to two factors:<\/span>\r\n
\r\n \t
The lack of affordable, secure hardware and software places lower socioeconomic groups at heightened risk; and<\/span><\/h4>\r\n<\/li>\r\n \t
Cybercriminals have capitalized on the confusion that often surrounds regulations related to cybersecurity specifically and digital services more broadly.<\/span><\/h4>\r\n<\/li>\r\n<\/ol>\r\nThis second issue is exemplified by incidents in <\/span>Ghana<\/span><\/a>, where public unawareness of tax collection mechanisms enabled fraudsters to harvest account information. Beyond the immediate financial repercussions, these cyberattacks erode trust in financial institutions and the expanding digital economy. Moreover, attacks targeting vulnerable communities ripple through the interconnected financial system, posing a systemic risk.<\/span>\r\n
What Is Needed to Fight Cyber Threats in DFS<\/span><\/span><\/h1>\r\nThe evolving DFS landscape calls for a proactive and multi-faceted approach to strengthen cybersecurity and protect users from evolving cyber threats. To date, several initiatives and strategies have emerged to address these challenges. In Africa, <\/span>USAID<\/span><\/a> and the Federal Trade Commission are collaborating to bolster an enabling environment for consumer protection in the African digital economy. Though its primary focus lies beyond cybersecurity, the effort aims to reinforce regulations and capacity building for authorities. Additionally, CGAP launched the <\/span>DFS Consumer Protection Laboratory<\/span><\/a> which works on cooperative approaches to combat DFS fraud and champion a more consumer-centric approach.<\/span>\u00a0<\/span>\r\n
Consumer protection by design involves thinking of consumer protection at the time of designing products and services, and not as an afterthought.<\/span><\/blockquote>\r\nHowever, more work is needed to safeguard people from cyber harms that risk excluding them from the benefits of the digital economy. Adopting a human-centric approach to cybersecurity could help address the issues of the most vulnerable. Consumer protection by design, an approach championed by CFI, akin to <\/span>privacy by design<\/span><\/a> and <\/span>secure by design<\/span><\/a>, involves thinking of consumer protection at the time of designing products and services, and not as an afterthought. This means taking proactive measures to safeguard users\u2019 financial well-being in an increasingly digitalized financial landscape, and collaborating with multiple stakeholders \u2013 regulators, product and service designers, and funders \u2013 to develop protection-by-design principles that can be integrated into the design of financial systems.<\/span>\r\n\r\nOne example of this type of collaboration is the <\/span>CyberPeace Builder\u2019s Program<\/span><\/a>, which fosters cooperation and addresses cybersecurity issues in digital finance. This collaborative approach, underscored by the involvement of NGOs like <\/span>Bridges to Development<\/span><\/a>, which aligns charitable investments with cybersecurity goals, signifies a collective stride toward bolstering cyber resilience and fostering a secure digital financial landscape for all users.<\/span>\r\n
Conclusion<\/span><\/span><\/h1>\r\nHuman-centric approaches have long been touted as a path to build greater customer centricity when designing financial services. However, their use in the field of cybersecurity is relatively nascent. The increasing complexity of digital financial services and the pace of digital inclusion demand a collaborative, human-centered approach to growing risks. We need collective action to address the risks that will emerge as we traverse this digital frontier \u2013 the responsibility lies with all of us to ensure responsible outcomes in digital finance.<\/span>","post_title":"Cybersecurity: A Crucial Ingredient for Responsible Finance and Consumer Protection","post_excerpt":"","post_status":"publish","comment_status":"open","ping_status":"open","post_password":"","post_name":"cybersecurity-a-crucial-ingredient-for-responsible-finance-and-consumer-protection","to_ping":"","pinged":"","post_modified":"2024-01-30 11:09:57","post_modified_gmt":"2024-01-30 15:09:57","post_content_filtered":"","post_parent":0,"guid":"https:\/\/content.centerforfinancialinclusion.org\/?p=46548","menu_order":0,"post_type":"post","post_mime_type":"","comment_count":"0","filter":"raw","featured_image":"https:\/\/content.centerforfinancialinclusion.org\/wp-content\/uploads\/sites\/2\/2024\/01\/iStock-1422766384.jpg","acf":{"types":{"term_id":3123,"name":"Blog Post","slug":"blog-post","term_group":0,"term_taxonomy_id":3123,"taxonomy":"types","description":"","parent":0,"count":2202,"filter":"raw"},"header":{"header_type":"post_aligned","post_cover":{"description":""},"post_aligned":{"description":"Cybersecurity has become a critical pillar of consumer trust and a key enabler of equitable financial inclusion. 
In collaboration with CyberPeace Institute, we explore the risks of cyber threats in inclusive finance and what is needed to responsibly address them."},"post_default":{"description":""}},"authors":[{"ID":46549,"post_author":"87","post_date":"2024-01-10 16:15:30","post_date_gmt":"2024-01-10 20:15:30","post_content":"","post_title":"Francesca Bosco","post_excerpt":"","post_status":"publish","comment_status":"closed","ping_status":"closed","post_password":"","post_name":"francesca-bosco","to_ping":"","pinged":"","post_modified":"2024-01-10 16:15:30","post_modified_gmt":"2024-01-10 20:15:30","post_content_filtered":"","post_parent":0,"guid":"https:\/\/content.centerforfinancialinclusion.org\/?post_type=people&p=46549","menu_order":0,"post_type":"people","post_mime_type":"","comment_count":"0","filter":"raw"},{"ID":44414,"post_author":"87","post_date":"2022-05-02 10:12:27","post_date_gmt":"2022-05-02 14:12:27","post_content":"","post_title":"Edoardo Totolo","post_excerpt":"","post_status":"publish","comment_status":"closed","ping_status":"closed","post_password":"","post_name":"edoardo-totolo","to_ping":"","pinged":"","post_modified":"2022-05-02 15:40:32","post_modified_gmt":"2022-05-02 
19:40:32","post_content_filtered":"","post_parent":0,"guid":"https:\/\/content.centerforfinancialinclusion.org\/?post_type=people&p=44414","menu_order":0,"post_type":"people","post_mime_type":"","comment_count":"0","filter":"raw"}],"meta_cta":{"download":false,"cta_button_text":"","cta_media":false,"cta_url":"","additional_links":false},"blocks":false,"page_settings":{"":null,"email_sign_up":true,"show_related_content":true,"show_contextual_menu":false,"contextual_menu_cta":null,"replace_global":false,"hide_sticky_share":false,"hide_date_when_featured":false,"is_list_view":false,"premium":false,"preview_image":"https:\/\/content.centerforfinancialinclusion.org\/wp-content\/uploads\/sites\/2\/2024\/01\/Cybersecurity-Preview.png","description":""}},"url":"cybersecurity-a-crucial-ingredient-for-responsible-finance-and-consumer-protection"},{"ID":46537,"post_author":"87","post_date":"2023-12-19 12:30:54","post_date_gmt":"2023-12-19 16:30:54","post_content":"As the inclusive finance sector evolves with a heightened focus on technology and climate, so too do the concepts and ideas that most captivate our attention. Over the course of 2023, emerging terms such as digital public infrastructure (DPI), green inclusive finance, and generative AI, all unknown to the sector just a couple of years ago, have swiftly moved to the forefront of the most crucial discussions in inclusive finance today.\r\n\r\nYet, as some topics rise to prominence, others, although equally important, are often pushed to the sidelines. 
Unfortunately, gender often falls victim first; as Melinda French Gates points out<\/a>, \u201cWhen the global agenda gets crowded, gender equality is one of the first items to fall off.\u201d <\/em>A distressing recent report from UN Women reveals<\/a> that more than 340 million women and girls could be living in extreme poverty by 2030.\r\n\r\nAnd then there are agendas that have always been right in front of our eyes, but that somehow never seem to get enough attention. The concept of \"stability entrepreneurs,\" as documented in the Small Firm Diaries<\/a>, has long been part of our collective knowledge, but is often overshadowed by the thrilling pursuit of tech disruptions and unicorns. Focusing on \"stability entrepreneurs\" could reignite the important, but recently overlooked, conversation on resilience in micro and small enterprises<\/a>.\r\n\r\nIt is critical that those of us in the development sector balance our chase for groundbreaking advancements with the essential needs and realities of the millions of people whom we aim to serve. We must remember not to allow the pursuit of market-centric glory to sidetrack our real focus:\r\n
Are we truly leveraging finance to fight global poverty, or just repackaging old business models in new, more appealing wrappers?<\/strong><\/h4>\r\nThis reflection is essential as we introduce and prioritize new concepts for next year. Here are three new concepts that I believe warrant the heightened focus as we go into 2024.\r\n
The Digital Public Infrastructure Debate: Frank Conversations Beyond the Hype<\/strong><\/h1>\r\n2023 saw a dramatic shift in the discourse surrounding digital public infrastructure (DPI). Central to the DPI debate is the \u2018India Stack,\u2019 a multifaceted, multi-layered, and fast-growing tech infrastructure. In particular, the Unified Payment Interface (UPI) model in India, has received both accolades and criticism this year.\r\n
Inefficiencies are already emerging, and addressing them promptly and adopting more participatory approaches with both citizens and the private sector will be essential.<\/blockquote>\r\nDPI is a series of building blocks for digital transactions that the state develops for foundational systems (like UPI), much like government-built roads used by both private and public transport. DPIs help to facilitate initiatives like the Open Network for Digital Commerce<\/a> (ONDC) in India for small businesses that are not included in large digital platforms. As DPI becomes front and center in development, it's becoming increasingly likely that we'll encounter many questions and challenges related to the role and incentives associated with state-backed and state-run initiatives. Inefficiencies are already emerging, and addressing them promptly and adopting more participatory approaches with both citizens and the private sector will be essential.\r\n\r\nThis year, CFI will be spearheading global discussions around DPIs through its work with the Responsible Finance Forum (RFF)<\/a>. Mid-year 2024, RFF will have a key advisory group meeting in Brazil alongside the G20 meeting of the GPFI. While there, critical questions we hope to address include: What are models of DPI that are emerging, and what are the pros and cons associated with each? And what is the long-term viability of each model?\r\n
The Data Economy: Rethinking Data Governance Beyond Open Banking<\/strong><\/h1>\r\nAs we delve into the intricacies of DPI, another dimension comes into play: the governance of the data economy, which is closely tied to, but separate from, the conversations around DPI.\r\n\r\n'India Stack', the most cited example in the context of DPI, includes several layers \u2013 the Data Empowerment and Protection Architecture (DEPA)<\/a> layer being the most recent and also the least understood. The DEPA layer encompasses critical elements of data exchange and protection. It is still unknown whether this component of India's DPI will replicate UPI's success, or if alternative models of data governance, such as the European Union\u2019s \u201ccommon data spaces<\/a>\u201d \u2014 where open finance is interlinked with eight other real economy data spaces \u2014 or Australia\u2019s \u201cconsumer data right<\/a>\u201d approach, will emerge as global exemplars.\r\n\r\nFrom a financial inclusion perspective, individuals who lack data trails are left out of open banking. This holds true for account aggregators in India as well; the underbanked or excluded simply don't have data to contribute, meaning that they will continue to be left out of formal banking systems. It's time for a strategic shift towards data exchanges that mirror the economies of the poor. We need to focus on sectors like telecommunications, agriculture, mobility, small trade, and utilities. Today, although these areas offer crucial data points for exchange, they remain conspicuously absent from the scope of current open banking.\r\n\r\nHeading into 2024, it's essential to overcome compartmentalized strategies in the data economy. A key goal should be to connect open banking with the broader data landscape. Emphasizing learning and policy development, we should draw on global experiences to shape inclusive data governance frameworks. 
This involves engaging in policy dialogues and adapting lessons from various models to meet the evolving needs of the digital economy and its diverse participants.\r\n
The Great Reckoning: Fintech's Limits in Addressing the Climate Crisis<\/strong><\/h1>\r\nWe would be remiss to end a blog on 2023 without speaking of climate. COP28 not only emphasized the urgent environmental realities but also agreed on a significant shift away from fossil fuels<\/a>. This decision underscores the immediate challenges we face: the potential displacement of millions due to climate change effects is no longer a distant threat, but a present-day reality.\r\n\r\nDespite the Intergovernmental Panel on Climate Change (IPCC) issuing warnings about environmental migration since the early nineties<\/a>, disentangling the complex web of social, political, economic, and now increasingly, climatic factors driving migration is challenging<\/a>. However, it's evident that climate change is increasingly becoming a significant push factor. The Internal Displacement Monitoring Centre's data is telling: annually, tens of millions of people are displaced by natural disasters, often surpassing those displaced by conflicts. In 2022 alone, tens of millions of disaster displacements were recorded, with a significant impact on the world's poorest regions.\r\n
As we navigate this landscape, it's crucial that we align our financial strategies and innovations with the real and pressing needs of the most vulnerable populations.<\/blockquote>\r\nThis reality amplifies the urgency discussed at COP28, where it became clear that fintech alone is insufficient to tackle the climate crisis. The establishment of the Loss and Damage Fund<\/a>, aimed at aiding climate-vulnerable developing countries and backed by countries like Germany and the UAE, marks progress but also underlines the significant gap between global needs and the actions taken. The focus must now extend beyond traditional financial solutions to address the needs of those most affected by climate change, as underscored in CFI's Green Inclusive Finance framework<\/a>.\r\n\r\nDeveloping and emerging nations are on the frontline of climate impact, grappling not just with the environmental fallout but also with the human cost of displacement and migration. As we navigate this landscape, it's crucial that we align our financial strategies and innovations with the real and pressing needs of the most vulnerable populations.\r\n