Fashion e-commerce teams don’t typically think of their sizing feature as a biometric data processing system. But if the feature involves photos, that may be exactly what it is under GDPR — and the compliance implications are more significant than most legal teams catch before launch.
What follows is a factual overview of the regulatory situation and of the architectural alternative.
What GDPR Article 9 actually says
GDPR Article 9 prohibits the processing of “special categories of personal data” unless specific conditions are met. The special categories include biometric data “for the purpose of uniquely identifying a natural person.”
Biometric data is defined in Article 4(14) as “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person.”
The question for sizing features is whether photos processed by computer vision systems qualify. The answer depends on the processing purpose, but several EU supervisory authorities have taken the position that photographs processed through AI systems to extract body measurements can constitute biometric data — because the technical processing extracts physical characteristics that could be used to identify the individual.
The EU AI Act’s compliance deadline in August 2026 adds a second layer: AI systems processing biometric data may face classification as high-risk AI systems with additional obligations including conformity assessments and technical documentation.
What special category status means in practice
For any data processor handling special category data, the compliance burden is significantly higher than for standard personal data:
Explicit legal basis. Ordinary Article 6 consent doesn't suffice. You need one of the Article 9(2) exceptions — typically explicit consent (a higher bar than ordinary consent) or substantial public interest (unlikely for commercial sizing).
Data Protection Impact Assessment (DPIA). Article 35 requires a DPIA before beginning any processing that is “likely to result in a high risk.” Photo-based biometric processing of consumers at scale almost certainly meets this threshold.
Data Processing Agreements (DPAs). Any third-party photo-based sizing provider (3DLOOK, Bodygram, Fit3D, etc.) is a data processor. A GDPR-compliant DPA covering biometric data requires more specific provisions than a standard DPA.
Member state divergence. Some EU member states impose additional restrictions on biometric processing beyond the GDPR baseline — Germany, France, and the Netherlands have all taken stricter positions in various contexts.
Fines. The maximum penalty for Article 9 violations (inadequate legal basis for special category processing) is €20 million or 4% of global annual turnover, whichever is higher.
This doesn’t make photo-based sizing illegal — many companies operate it with appropriate compliance architecture. But the overhead is real, and teams that discover it late (post-launch) face retroactive remediation.
The data minimization angle
Even if legal basis is established, GDPR Article 5(1)(c) requires data minimization: “adequate, relevant and limited to what is necessary.” If the same sizing outcome can be achieved without processing photos, storing photos may be disproportionate to the purpose — which creates a legal risk independent of the legal basis question.
This is the principle that makes architectural alternatives attractive from a legal perspective, not just a technical one.
The statistical alternative: what it trades away and what it keeps
Statistical prediction from height and weight (rather than photos) produces a different risk profile:
What it gives up: Per-individual precision. Well-implemented photo-based systems produce measurements accurate to within a few centimeters. Statistical prediction from height and weight carries meaningful uncertainty — roughly ±5% on circumference dimensions at 95% confidence.
What it preserves: The sizing outcome. For most sizing applications — recommending a size M or L, generating a fit score for a garment — you don’t need per-millimeter accuracy. You need measurements that are accurate enough to distinguish between size buckets. Statistical prediction from height and weight typically meets this bar.
What it adds: A fundamentally different data category. A number (body_height: 1750) is personal data, not biometric data, because it doesn’t result from “specific technical processing” of physical characteristics. It’s self-reported. The API receives this number, performs inference, and discards it. Nothing is stored. The output — a predicted chest circumference of 940mm — is statistical, not biometric: it describes a population-level estimate for someone with these inputs, not a biometric signature of a specific individual.
This is the architectural argument: a stateless prediction API that accepts only numerical inputs and returns statistical predictions doesn’t process biometric data under GDPR Article 4(14). No photos, no processing of physical characteristics for the purpose of identification.
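As a concrete illustration, a stateless prediction endpoint of this kind can be sketched in a few lines of Python. Everything here is hypothetical: the coefficients are a toy linear model standing in for a real anthropometric regression, and `predict_chest_mm` is an invented name. The architectural point is what matters: numerical inputs arrive, inference runs, and nothing is persisted.

```python
from dataclasses import dataclass

# Toy linear model. These coefficients are illustrative only,
# not a real anthropometric regression.
INTERCEPT, HEIGHT_COEF, WEIGHT_COEF = 170.0, 0.2, 6.0


@dataclass(frozen=True)
class ChestPrediction:
    point_mm: float               # population-level point estimate
    ci95_mm: tuple[float, float]  # roughly ±5% at 95% confidence


def predict_chest_mm(height_mm: float, weight_kg: float) -> ChestPrediction:
    """Stateless inference: the inputs exist only for this call.

    Nothing is written to disk or to a database. The return value is a
    statistical estimate for the population matching these inputs, not a
    biometric signature of a specific individual.
    """
    point = INTERCEPT + HEIGHT_COEF * height_mm + WEIGHT_COEF * weight_kg
    return ChestPrediction(point_mm=point, ci95_mm=(point * 0.95, point * 1.05))
```

With a height of 1750 mm and a weight of 70 kg, this toy model returns a point estimate of 940 mm, matching the example figures above; the inputs go out of scope as soon as the function returns.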
The user experience argument
Beyond compliance, there’s a conversion argument. In user research and A/B tests across multiple fashion platforms, photo upload requirements for sizing generate significant drop-off — particularly on mobile, from users in shared spaces, and from users who are self-conscious about body image.
A sizing flow that asks for height and weight — questions users encounter on every healthcare form — has lower friction by design. The completion rate difference is meaningful for platforms at scale.
What this looks like in a real onboarding flow
Example scenario: A European fashion retailer wants to add a size recommendation feature to their mobile app without storing biometric data.
The flow:
- User opens the sizing wizard. The only inputs are sex, height (in their preferred unit), and weight (optional).
- The app calls a stateless prediction API server-side; the user’s height and weight pass through the request, are used once for inference, and are never stored.
- The API returns chest, waist, and hip circumferences as statistical predictions with confidence intervals.
- The app maps these to the retailer’s size chart and recommends a size, storing only the recommendation label.
At no point are photos captured, stored, or processed. The numerical measurements are transient — they exist in the API call and are discarded. The retailer’s database contains only: user ID + recommended size + timestamp.
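The last two steps of the flow, mapping the returned circumference onto the retailer's size chart and persisting only the recommendation label, can be sketched as follows. The size boundaries, user ID, and record shape are all hypothetical:

```python
# Hypothetical size chart: half-open chest-circumference ranges in mm.
SIZE_CHART = [
    ("S", 0, 900),
    ("M", 900, 980),
    ("L", 980, 1060),
    ("XL", 1060, float("inf")),
]


def recommend_size(chest_mm: float) -> str:
    """Map a predicted chest circumference onto the retailer's size buckets."""
    for label, lo, hi in SIZE_CHART:
        if lo <= chest_mm < hi:
            return label
    raise ValueError(f"no size bucket for {chest_mm} mm")


# Only the label is persisted; the measurement itself is transient.
stored_record = {
    "user_id": "u-123",  # hypothetical identifier
    "recommended_size": recommend_size(940.0),
    "timestamp": "2026-02-01T12:00:00Z",
}
```

Note that the predicted circumference never reaches the database: the stored record contains exactly the three fields described above (user ID, recommendation, timestamp), which keeps the retained data in the conventional personal-data category.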
From a GDPR standpoint, this is a conventional personal data processing scenario (size recommendation), not a special category scenario. The DPIA requirement likely doesn’t apply. The compliance overhead is standard rather than elevated.
Not a legal opinion
This article describes general regulatory considerations, not legal advice. The correct approach to your specific situation depends on your jurisdiction, existing legal basis, the specific technical architecture of your sizing feature, and your DPA with any third-party providers. Review with your data protection counsel.
What the regulatory landscape makes clear is that the architectural choice — photo-based vs. statistical — has compliance consequences, and those consequences favor statistical approaches for teams that want to minimize compliance overhead.