Introducing Data Localization from mParticle
mParticle customers can now choose the country or region in which their data is stored in order to meet policy and compliance requirements.
Data localization is complex: it requires that data about a nation's citizens or residents be collected, processed, and/or stored inside the country, often before being transferred internationally. That transfer usually happens only after local privacy or data protection laws have been satisfied, such as by giving users notice of how their information will be used and obtaining their consent. Data localization becomes even more complicated as guidelines from governments and regulatory bodies evolve, forcing companies to continuously review their internal policies on where data can be stored.
mParticle helps multi-channel consumer brands evolve and innovate faster by providing a centralized, organized stream of customer data that can be used to shape products, solutions, and experiences for their consumers. Brands looking to connect with customers in regions with strict data residency requirements, such as Australia, the UK, and the EU, need to adapt their processes to capture all relevant customer data locally and act on it in real time. With its newly available data localization capabilities, mParticle now gives global brands more control over where their data is stored, helping them comply with local data residency regulations.
Data localization: Your customer data, wherever you need it
As a premium service, mParticle customers now have the option of hosting an instance of mParticle outside the United States in a localized AWS data center of their choice. Once the new mParticle instance is set up, it can be accessed at a dedicated, region-specific URL.
Sending data to an mParticle instance in a new region
Customers can continue to send data to mParticle from their applications using our SDKs and Events API. When you use our SDKs, mParticle automatically routes data to the appropriate instance based on your API keys. If you use our Events API to collect data from your backend systems, you can use region-specific URLs to send that data to mParticle.
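As a rough sketch of the Events API path, the snippet below builds (but does not send) a POST request against a region-specific endpoint. The `s2s.<pod>.mparticle.com` host pattern, the `eu1` pod name, and the minimal event payload are assumptions for illustration; confirm the exact URL and body format for your instance in mParticle's Events API documentation.

```python
import base64
import json
import urllib.request

# Hypothetical pod identifier for an EU-hosted instance (assumption).
POD = "eu1"

def build_events_request(pod, api_key, api_secret, events):
    """Build a POST request for a region-specific mParticle Events API endpoint.

    The host pattern below is illustrative; check your instance's actual URL.
    """
    url = f"https://s2s.{pod}.mparticle.com/v2/events"
    body = json.dumps({
        "environment": "production",
        "events": events,
    }).encode("utf-8")
    # The Events API uses HTTP Basic auth with your API key and secret.
    credentials = base64.b64encode(f"{api_key}:{api_secret}".encode()).decode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {credentials}",
        },
        method="POST",
    )

req = build_events_request(
    POD, "my-key", "my-secret",
    [{"data": {"event_name": "page_view"}, "event_type": "custom_event"}],
)
print(req.full_url)  # → https://s2s.eu1.mparticle.com/v2/events
```

Passing the built request to `urllib.request.urlopen(req)` would perform the actual send; keeping construction separate makes the region routing easy to verify in tests.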
Managing customer data in a new region
Companies can gain granular visibility into, and control over, how their data flows in and out of an mParticle instance in a particular region with Data Master. Using Data Master's data planning, validation, and quality enforcement features, engineering and product teams can audit the data pipeline in a particular region to identify and resolve issues faster while minimizing the manual labor typically involved. Teams can start by creating a region-specific data plan that defines how customer data is collected, managed, and validated. mParticle also gives you a simple, standard way to collect, store, and apply consent and opt-out choices, so you can capture an individual's consent in a region and apply it to your region-specific data flows.
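To make the planning-and-consent idea concrete, here is a minimal sketch of validating an event against a region-specific plan and the user's stored consent choice. The plan format, event names, attributes, and `eu` region tag are all illustrative assumptions, not mParticle's actual Data Master schema.

```python
# Illustrative region-specific plan (assumed format, not mParticle's schema).
REGION_PLAN = {
    "region": "eu",
    "allowed_events": {
        "page_view": {"required_attributes": ["page_name"]},
        "purchase": {"required_attributes": ["order_id", "total"]},
    },
}

def validate_event(event, plan, consent_state):
    """Return a list of violations for one event against a regional plan."""
    violations = []
    spec = plan["allowed_events"].get(event["name"])
    if spec is None:
        violations.append(f"unplanned event: {event['name']}")
        return violations
    # Quality enforcement: every required attribute must be present.
    for attr in spec["required_attributes"]:
        if attr not in event.get("attributes", {}):
            violations.append(f"{event['name']} missing attribute: {attr}")
    # Apply the user's stored consent choice before letting data flow onward.
    if not consent_state.get(plan["region"], False):
        violations.append(f"no consent recorded for region: {plan['region']}")
    return violations

event = {"name": "purchase", "attributes": {"order_id": "A-1"}}
print(validate_event(event, REGION_PLAN, {"eu": True}))
# → ['purchase missing attribute: total']
```

The same check runs per region simply by swapping in that region's plan and consent state.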
Sending data from mParticle
Data flowing out of mParticle to downstream destinations will require data-center-dependent changes. mParticle provides recommendations in its documentation for which AWS pods and S3 buckets to choose for optimal performance.
mParticle has hundreds of integrations with third-party tools that may or may not have a presence in the same region as the mParticle instance. In those cases, customers will need to have a data strategy for each third-party destination and make sure appropriate measures are taken to comply with regional data residency laws.
You can also personalize on-site and in-app experiences for your regional customers in real time with mParticle's Profile API, which delivers customer insights directly to your app, powering customized search results, product recommendations, UI components, and more.
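As a sketch of how profile insights might drive such personalization, the function below maps a profile to a recommendation strategy. The profile shape (audience memberships and user attributes) and all field names are illustrative assumptions; the real Profile API response format is defined in mParticle's documentation.

```python
def recommend(profile):
    """Pick a product-recommendation strategy from assumed profile insights."""
    audiences = set(profile.get("audience_memberships", []))
    if "high_value_customers" in audiences:
        return "premium_picks"
    # Fall back to the shopper's most recent category, if one is recorded.
    last_category = profile.get("user_attributes", {}).get("last_category")
    if last_category:
        return "more_from_" + last_category
    return "bestsellers"

profile = {
    "audience_memberships": ["newsletter_subscribers"],
    "user_attributes": {"last_category": "shoes"},
}
print(recommend(profile))  # → more_from_shoes
```

In practice the profile would be fetched from your region's Profile API endpoint at request time, so the recommendation reflects the user's latest regional data.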