HKMA Raises the Bar on Alternative Data Use in Banking
- TrustSphere Network

The increasing use of alternative data is changing the way banks assess risk, onboard customers, detect fraud, and extend financial services. However, as banks become more reliant on non-traditional information sources, regulators are moving quickly to ensure that innovation does not come at the expense of fairness, transparency, or consumer protection.
The latest guidance from the Hong Kong Monetary Authority (HKMA) signals a clear expectation that banks must strengthen governance, improve explainability, and manage privacy risks as they expand their use of alternative data.
What Is Alternative Data?
Alternative data refers to non-traditional information that falls outside the standard datasets typically used by banks and credit reference agencies.
This can include:
Utility and telecom payment history
E-commerce activity
Mobile wallet transactions
Behavioural and digital footprint data
Social media activity
Web browsing patterns
Geolocation information
Device usage patterns
Transaction data from digital platforms
Unlike traditional credit bureau data, alternative data can provide a more real-time and holistic picture of an individual’s financial behaviour and risk profile.
For customers with little or no formal credit history, such as younger consumers, gig economy workers, migrant workers, or small business owners, alternative data may help improve access to loans, credit cards, insurance products, and other financial services.
The HKMA has previously explored alternative credit scoring for SMEs and broader data-sharing initiatives through the Commercial Data Interchange framework, reflecting the regulator’s longer-term commitment to more data-driven banking models.
Why the HKMA Is Acting Now
The HKMA has recognised that the rapid digitalisation of banking, coupled with advances in AI and machine learning, is accelerating the use of alternative data across the sector.
Banks are increasingly using alternative data for:
Credit risk assessment
Customer onboarding and KYC
Fraud detection
AML monitoring
Customer segmentation
Product recommendations
Collections and recovery strategies
However, the wider use of these data sources also creates new risks. Poor quality data, hidden biases, weak consent mechanisms, or opaque AI models can lead to unfair outcomes, discrimination, privacy breaches, and customer complaints.
The HKMA has therefore introduced a principles-based framework requiring authorised institutions to adopt controls proportionate to the risks involved.
The Four Key Areas of Focus
1. Governance and Accountability
The HKMA has made it clear that responsibility for alternative data usage sits with the board and senior management.
Banks are expected to:
Define clear policies on the purpose and acceptable use of alternative data
Identify which data sources are permitted and under what circumstances
Conduct due diligence on third-party data vendors
Maintain clear documentation on data sources, ownership, and quality
Carry out regular reviews and annual compliance audits
Ensure accountability for AI and data-driven decision making
This is particularly important where third-party fintechs, data aggregators, cloud providers, or AI vendors are involved. The regulator expects banks to retain oversight even when activities are outsourced. Recent HKMA guidance on cloud adoption and fintech partnerships has reinforced the need for stronger governance and cybersecurity controls across outsourced ecosystems.
2. Transparency and Consent Management
One of the most important aspects of the guidance is the emphasis on transparency.
Customers must understand:
What data is being collected
Why it is being collected
How it will be used
Whether it will influence lending, pricing, or other decisions
Whether the data will be shared with third parties
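In code, a purpose-limited consent check might look like the following minimal sketch; the purposes and field names are invented for illustration and not drawn from any HKMA specification:

```python
# Illustrative only: a minimal purpose-based consent gate.
# ALLOWED_PURPOSES and the consents structure are hypothetical.

ALLOWED_PURPOSES = {"credit_assessment", "fraud_detection", "marketing"}

def may_use_data(consents: dict, purpose: str) -> bool:
    """Return True only if the customer explicitly opted in for this purpose."""
    if purpose not in ALLOWED_PURPOSES:
        raise ValueError(f"Unknown purpose: {purpose}")
    # Absence of a recorded consent defaults to "no" - never assume opt-in
    return consents.get(purpose, False)

consents = {"credit_assessment": True, "marketing": False}
print(may_use_data(consents, "credit_assessment"))  # True
print(may_use_data(consents, "marketing"))          # False
```

The key design choice is the default: data use is blocked unless an explicit, purpose-specific opt-in is on record.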
The HKMA expects banks to obtain explicit consent before collecting or using alternative data. Banks must also be able to explain how data-driven models influence decisions.
This is particularly relevant where AI or machine learning is used in credit decisioning. Black-box models that cannot be interpreted or explained may create regulatory risk.
The HKMA has consistently stressed the importance of explainability in AI-driven banking models, including in earlier guidance on big data analytics, AI, and generative AI use cases.
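As a toy illustration of what "explainable" can mean in practice, an additive scoring model lets each feature's contribution be read off directly, so the most negative contributors can be surfaced as reason codes for a decline. The weights and feature names below are invented; real models and explanation methods (such as SHAP) are considerably more involved:

```python
# Hedged sketch: explaining a simple additive (logistic) credit score.
# Weights, intercept, and feature names are hypothetical.
import math

WEIGHTS = {
    "on_time_utility_payments": 0.8,
    "months_of_wallet_history": 0.05,
    "missed_payments": -1.2,
}
INTERCEPT = -0.5

def score_with_reasons(features: dict):
    # Each feature's contribution to the score is explicit and auditable
    contributions = {f: WEIGHTS[f] * v for f, v in features.items()}
    logit = INTERCEPT + sum(contributions.values())
    prob = 1 / (1 + math.exp(-logit))
    # The most negative contributions become "reason codes" for a decline
    reasons = [f for f, c in sorted(contributions.items(), key=lambda kv: kv[1]) if c < 0]
    return prob, reasons

prob, reasons = score_with_reasons(
    {"on_time_utility_payments": 0.2, "months_of_wallet_history": 6, "missed_payments": 2}
)
print(round(prob, 2), reasons)  # 0.08 ['missed_payments']
```

A black-box model offers no equivalent decomposition, which is precisely why regulators flag interpretability as a control, not a nicety.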
3. Data Quality and Fairness
The use of alternative data creates a major challenge around accuracy, bias, and fairness.
For example:
A customer living in a lower-income district could be unfairly penalised by location-based risk models
Gig economy workers with irregular income patterns may appear riskier than salaried workers
Younger customers with limited traditional credit histories may be excluded if models over-rely on incomplete data
Social media or behavioural indicators could unintentionally discriminate against certain customer groups
The HKMA expects banks to implement clear data validation processes and actively monitor for unfair outcomes.
This includes:
Testing models for bias and disparate impact
Validating that data remains accurate and relevant
Ensuring customers can challenge or appeal decisions
Periodically reviewing whether models continue to perform as intended
Establishing clear thresholds for model governance and escalation
This is likely to push banks towards more formal model risk management frameworks, particularly where alternative data is used in lending or financial inclusion initiatives.
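As an illustration, disparate-impact testing often starts with a simple approval-rate comparison across customer groups. The "four-fifths" threshold below is a widely used industry rule of thumb, not an HKMA-mandated figure:

```python
# Illustrative fairness check using the "four-fifths" disparate-impact
# heuristic (a common rule of thumb, not a regulatory threshold).

def approval_rate(decisions):
    """decisions: list of booleans (True = approved)."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower group's approval rate to the higher one's."""
    ra, rb = approval_rate(group_a), approval_rate(group_b)
    return min(ra, rb) / max(ra, rb)

ratio = disparate_impact_ratio(
    [True] * 60 + [False] * 40,   # group A: 60% approved
    [True] * 42 + [False] * 58,   # group B: 42% approved
)
print(round(ratio, 2))  # 0.7 -> below the 0.8 rule of thumb, flag for review
```

A ratio below the chosen threshold does not prove discrimination, but it is a concrete trigger for the escalation and review processes the guidance expects.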
4. Data Privacy and Protection
Alternative data often includes highly sensitive personal information, such as location data, mobile phone activity, browsing habits, purchasing behaviour, and social interactions.
This creates heightened privacy and cybersecurity risks.
Banks using alternative data must comply with Hong Kong’s Personal Data (Privacy) Ordinance (PDPO), which governs the collection, use, processing, and sharing of personal data. The PDPO is principles-based and requires organisations to collect only necessary data, use it fairly, keep it secure, and ensure customers understand how their data will be used.
Banks should therefore ensure:
Strong encryption and access controls
Clear data retention policies
Restrictions on unnecessary or excessive data collection
Strong vendor risk management controls
Robust incident response processes
Independent reviews of privacy and cybersecurity controls
The HKMA’s latest focus on privacy aligns closely with ongoing work by the Hong Kong Privacy Commissioner for Personal Data (PCPD), which continues to emphasise stronger protections around customer data and AI-related use cases.
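A retention policy like the one above can be made operational with a periodic sweep that drops expired records and strips fields outside an approved schema. This is a minimal sketch; the field names and the 24-month window are hypothetical, not drawn from the PDPO or HKMA guidance:

```python
# Sketch of a data-retention and minimisation sweep: keep only what is
# needed, only as long as needed. Schema and window are hypothetical.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=730)  # e.g. a 24-month retention window
ALLOWED_FIELDS = {"customer_id", "txn_amount", "collected_at"}

def sweep(records, now=None):
    now = now or datetime.now(timezone.utc)
    kept = []
    for rec in records:
        if now - rec["collected_at"] > RETENTION:
            continue  # past retention: drop the record entirely
        # Minimisation: strip any field outside the approved schema
        kept.append({k: v for k, v in rec.items() if k in ALLOWED_FIELDS})
    return kept
```

Running the sweep on a record that carries an unapproved field (say, raw geolocation) keeps the record but removes the field, while records past the window are deleted outright.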
AI, Alternative Data, and the Future of Banking
The HKMA has highlighted that alternative data is often used alongside AI and machine learning.
This creates significant opportunities for:
Faster onboarding
More accurate fraud detection
Better AML monitoring
Improved customer segmentation
Enhanced lending decisions
Greater financial inclusion
However, it also increases the risk of over-reliance on automated decisions that may be difficult to explain or challenge.
A 2024 HKMA survey found that 75% of retail-focused authorised institutions were already using or planning to use big data analytics and AI. The most common use cases included operational automation, document processing, fraud detection, AML, and credit assessment. The same survey found that 39% of institutions were already adopting or exploring generative AI, primarily for internal use cases such as summarisation, translation, and productivity support.
More recent industry research suggests that adoption continues to accelerate. By 2025, 75% of surveyed financial institutions in Hong Kong had implemented, piloted, or were designing GenAI use cases, with that figure expected to rise significantly over the next few years.
Implications Beyond Hong Kong
The HKMA’s guidance is unlikely to remain unique to Hong Kong.
Around the world, regulators are increasingly scrutinising how financial institutions use AI, data analytics, and alternative data in decision making.
Examples include:
In the European Union, the EU AI Act, whose obligations are being phased in, places stricter requirements on high-risk AI systems used in credit scoring and financial services
Under GDPR, firms must provide meaningful information about automated decisions and ensure customers can challenge them
In the United States, regulators such as the Consumer Financial Protection Bureau have warned that lenders cannot avoid explaining adverse decisions simply because they rely on complex algorithms
In Singapore, the Monetary Authority of Singapore has already introduced FEAT principles covering fairness, ethics, accountability, and transparency in AI and data analytics
In Australia, regulators are increasingly focused on responsible AI, privacy, and the use of non-traditional data in consumer lending
In Malaysia, financial institutions are likely to face growing expectations around model governance, data privacy, explainability, and responsible AI as Bank Negara Malaysia continues to expand its guidance in these areas
For multinational banks, this means governance frameworks for alternative data and AI will increasingly need to be global, rather than designed market by market.
What Banks Should Do Next
Banks should not treat the HKMA guidance as a narrow compliance exercise.
Instead, it should be viewed as a catalyst to review broader governance frameworks around data, AI, privacy, model risk, and third-party oversight.
Key next steps include:
Review all current uses of alternative data across the organisation
Identify gaps in governance, documentation, and board oversight
Assess whether customer consent mechanisms are sufficiently clear
Review model explainability and fairness controls
Strengthen third-party vendor due diligence
Align privacy practices with PDPO requirements
Conduct regular testing, monitoring, and independent audits
Ensure AI use cases can be explained to customers and regulators
The message from the HKMA is clear: innovation is encouraged, but only where it is supported by strong governance, transparency, and consumer protection.
Banks that get this balance right will not only reduce regulatory risk, but will also be better positioned to build trust, improve customer outcomes, and unlock the full value of alternative data in a responsible way.



