Controls

AS.AM1: Assess and map data processing systems, products, services, and/or devices.
AS.AM2: Assess and map the data processors and data controllers (e.g. the organization or third parties such as service providers, partners, and developers) along with their roles and responsibilities.
AS.AM3: Assess and map data owners and individuals (e.g. children, the elderly, and persons with disabilities) whose data are being processed.
AS.AM4: Assess and map activities of systems, products, services, and/or devices processing data.
AS.AM5: Assess and define the purposes for data processing activities.
AS.AM6: Categorize the types of data involved in data processing activities.
AS.AM7: Assess and identify the data processing environment (e.g. geographic location, cloud).
AS.AM8: Data protection impact assessment: assess and map data processing activities to illustrate the data actions and associated data elements for systems, products, services, devices, and third parties.
AS.AM9: Assess the statutory and regulatory climate(s) affecting the individuals and the organization.
AS.RA1: Assess and identify human and societal risks that potentially impact individual privacy, including user-generated content (UGC) (e.g. content moderation for political use of data, misinformation, deepfakes, cyberbullying, hate speech, malicious bots, sex trafficking, terrorism, and other nefarious online behavior).
AS.RA2: Assess and identify informational and financial risks that potentially impact individual privacy (e.g. social engineering, tracking of payments and expenditures, as well as other privacy-related risks).
AS.RA3: Assess and identify contextual factors and data activities (e.g. individuals' demographics and privacy interests or perceptions, data sensitivity and/or types, visibility of data processing to individuals and third parties).
AS.RA4: Identify and evaluate datasets and input/output mechanisms for biases against humans, specifically vulnerable populations and underrepresented communities.
AS.RA5: Assess, determine, and prioritize risks related to problematic data collection and processing activities by establishing likelihood and impact to individual privacy (see the sketch after this list).
AS.RA6: Assess and map risks related to the entire data lifecycle, from collection to destruction of data.
AS.RA7: Risk responses are identified, prioritized, and implemented.
AS.RA8: Establish an interdisciplinary, cross-functional internal privacy task force or working group, including members of legal, government relations, IT/IS, public relations/marketing communications, and other relevant groups, to assess and communicate privacy risks within the organization.
AS.RA9: Assess physical and mental health risks and other adverse effects caused by excessive use of XR and spatial computing.
AS.RA10: Assess physical safety and other risks that could result in irreversible harm to humans (e.g. photo-sensitivity warning, altered vision).
IN.CX1: Communicate transparently the policies, processes, and procedures for data processing purposes, practices, and associated risks.
IN.CX2: Establish roles and responsibilities (e.g. public relations) for communicating data processing purposes, practices, and associated privacy risks.
IN.CX3: Establish additional mechanisms (e.g. notices, internal or public reports) for communicating data processing purposes, practices, and associated privacy risks beyond just the privacy policy.
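
As an illustration of the mapping controls (AS.AM1-AM8) and the likelihood-and-impact prioritization in AS.RA5, the following is a minimal Python sketch of a data processing inventory with a simple risk score. The DataActivity structure, its field names, and the 1-to-5 scales are assumptions made for illustration; they are not defined by these controls.

    # Minimal sketch: an inventory of data processing activities (AS.AM1-AM8) with a
    # likelihood x impact prioritization score (AS.RA5). All names and scales are
    # illustrative assumptions, not requirements of these controls.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class DataActivity:
        system: str                # AS.AM1: system, product, service, or device
        controller: str            # AS.AM2: data controller or processor
        data_subjects: List[str]   # AS.AM3: whose data is processed
        purpose: str               # AS.AM5: defined purpose for processing
        data_types: List[str]      # AS.AM6: categories of data involved
        environment: str           # AS.AM7: geographic location, cloud, on-device
        likelihood: int            # AS.RA5: assumed scale, 1 (rare) to 5 (almost certain)
        impact: int                # AS.RA5: assumed scale, 1 (negligible) to 5 (severe)

        @property
        def risk_score(self) -> int:
            return self.likelihood * self.impact

    def prioritize(activities: List[DataActivity]) -> List[DataActivity]:
        """Order activities so the highest privacy risk is reviewed first."""
        return sorted(activities, key=lambda a: a.risk_score, reverse=True)

    inventory = [
        DataActivity("headset-telemetry", "ExampleXR Inc.", ["adults", "minors"],
                     "performance analytics", ["gaze", "pose"], "cloud (EU)", 4, 5),
        DataActivity("avatar-store", "ExampleXR Inc.", ["adults"],
                     "purchases", ["payment"], "cloud (US)", 2, 3),
    ]
    for activity in prioritize(inventory):
        print(activity.system, activity.risk_score)
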
IN.CX4: Establish mechanisms to provide accurate and necessary information to individuals and organizations anytime the XR or spatial computing environment has been altered from its original state (e.g. changes in the city map).
IN.CX5: Communicate clearly labeled policies, processes, and practices with regard to spatial/XR data and creations (e.g. human-readable contracts).
IN.CX6: Establish mechanisms for obtaining feedback from individuals (e.g. surveys or focus groups) about data processing and associated privacy risks.
IN.CX7: Establish and communicate clearly labeled "community guidelines" and a "code of conduct".
IN.CX8: Inform users and organizations of the physical and mental health risks and other adverse effects caused by excessive use of XR and spatial computing.
IN.CX9: Individuals and organizations must be made aware of risks that could result in irreversible harm (e.g. photo-sensitivity warning, altered vision).
IN.CX10: Establish the scope of physical safety considerations as well as the shared responsibility (e.g. distraction while driving, physical access to the XR device).
IN.CX11: Clearly inform individuals and organizations of legal risks resulting from the collection of data (e.g. data that becomes evidence of a crime such as murder).
IN.CX12: Inform individuals and organizations whenever data is processed for automated decision-making and profiling (e.g. targeted ads, profiling of marginalized communities).
IN.CX13: Inform the individual whenever records of sensitive data are disclosed and shared with additional third parties or vendors.
IN.CX14: Inform individuals or organizations of any data corrections or deletions made in the data processing ecosystem (e.g. data sources).
IN.CX15: Notify and inform impacted individuals and organizations following any privacy breach or event.
IN.CX16: All policies must be made accessible and understandable in a variety of languages, formats, and mediums (e.g. multi-sensory, multilingual, closed-captioned).
IN.CX17: On request, service providers must inform users how, where, and why data is collected, stored, and shared.
IN.CX18: Establish data protection controls to ensure that users are aware of how long each data processor will retain the data (e.g. data retention settings).
IN.CX19: Establish clear guidelines to indicate public, private, and restricted areas so that individuals and organizations can make an informed decision on whether they are allowed to augment or scan a specific area (Universal Standard DNC/DNA rule book).
IN.CX20: Clearly inform individuals and organizations of legal risks resulting from the collection of data (e.g. data that becomes evidence of a crime such as murder).
IN.CX21: Establish mechanisms to detect and communicate to individuals and organizations anytime the XR or spatial computing environment has been altered from its original state (e.g. changes in the city map).
IN.CH1: Facilitate options (e.g. notices, internal or public reports) for individuals and organizations to effectively make risk-based decisions around data processing activities, beyond just the privacy policy.
IN.CH2: Provide individuals and organizations indicators to ensure others' privacy rights are not violated while making use of XR devices (e.g. a blinking LED light while recording is taking place).
IN.CH3: Provide additional options for enabling individuals' data processing preferences and requests beyond just a privacy policy.
IN.CH4: Facilitate meaningful and informed consent mechanisms before collecting user data in any form (see the sketch after this list).
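
A minimal sketch of the informed-consent gate described in IN.CH4: collection for a given purpose is allowed only while an active, purpose-specific consent record exists, and consent can be withdrawn. The ConsentLedger and ConsentRecord names and their fields are illustrative assumptions, not part of the control text.

    # Minimal sketch of an informed-consent gate (IN.CH4): collection for a purpose
    # is allowed only while an active consent record exists, and consent can be
    # withdrawn. All names and fields are illustrative assumptions.
    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import Dict, Optional, Tuple

    @dataclass
    class ConsentRecord:
        user_id: str
        purpose: str                       # purpose shown to the user in plain language
        granted_at: datetime
        withdrawn_at: Optional[datetime] = None

    class ConsentLedger:
        def __init__(self) -> None:
            self._records: Dict[Tuple[str, str], ConsentRecord] = {}

        def grant(self, user_id: str, purpose: str) -> None:
            self._records[(user_id, purpose)] = ConsentRecord(
                user_id, purpose, granted_at=datetime.now(timezone.utc))

        def withdraw(self, user_id: str, purpose: str) -> None:
            record = self._records.get((user_id, purpose))
            if record is not None:
                record.withdrawn_at = datetime.now(timezone.utc)

        def may_collect(self, user_id: str, purpose: str) -> bool:
            record = self._records.get((user_id, purpose))
            return record is not None and record.withdrawn_at is None

    ledger = ConsentLedger()
    ledger.grant("user-42", "hand tracking for gesture input")
    assert ledger.may_collect("user-42", "hand tracking for gesture input")
    ledger.withdraw("user-42", "hand tracking for gesture input")
    assert not ledger.may_collect("user-42", "hand tracking for gesture input")
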
IN.CH5: When possible, build a mechanism to gather consent from bystanders whose data is being collected.
IN.CH6: Facilitate privacy risk mitigation mechanisms (e.g. credit monitoring, consent withdrawal, data alteration, or deletion) to address impacts of problematic data actions on individuals and organizations.
IN.CH7: Facilitate mechanisms that provide options to individuals and organizations anytime the XR or spatial computing environment is altered from its original state (e.g. changes in the city map).
IN.CH8: Provide clear opt-out mechanisms to allow individuals and organizations to quickly and easily opt out of the potential use of their property in digital objects and virtual overlays.
IN.CH9: Establish additional mechanisms (e.g. notices, internal or public reports) for communicating data collection purposes, practices, and associated privacy risks, beyond just the privacy policy and consent mechanism.
IN.CT1: Inform individuals and organizations how long the data will be retained by each data processor (e.g. share data retention policies).
IN.CT2: Establish reasonable retention policies for all collected data such that once the data has served its purpose for collection, it is deleted.
IN.CT3: Establish mechanisms so that data collected on bystanders is anonymized/de-identified where a consent mechanism is unavailable (e.g. deletion requirements, scrubbing/blurring sensitive data like faces from datasets, and creating ways for users/bystanders to express their preferences); see the sketch after this list.
IN.CT4: Communicate the universal standards clearly identifying the permissions to augment, scan, and/or record (Universal DNC/DNA governing bodies to indicate public, private, and restricted areas).
IN.CT5: All policies must be made accessible and understandable in a variety of languages, formats, and mediums (e.g. multi-sensory, multilingual).
IN.CT6: Maintain records of data disclosures and sharing with any vendors or third parties for later review.
IN.CT7: Establish mechanisms to provide control over actions individuals and organizations can take anytime the XR or spatial computing environment is altered from its original state (e.g. changes in the city map).
IN.CT8: Establish mechanisms to ensure others' privacy rights are not violated while making use of XR devices (e.g. a blinking LED light while recording is taking place).
IN.CT9: Inform individuals, using available mechanisms, of physical safety considerations as well as the shared responsibility (e.g. a pop-up for assuming responsibility for distraction while driving, physical access to the XR device).
IN.CT10: Establish mechanisms to indicate and ensure others' privacy rights are not violated while making use of XR devices (e.g. a blinking LED light while recording is taking place).
IN.CT11: Establish additional mechanisms to empower individuals and organizations to exercise their data processing preferences (e.g. preferences over sharing a photo album).
IN.CT12: Maintain data provenance and lineage for review to ensure data integrity and ownership.
IN.CS1: Establish mechanisms to create awareness and inform parents/guardians regarding privacy risks to minors.
IN.CS2: Establish mechanisms to create and increase awareness among minors around privacy risks (e.g. cyber stranger danger).
IN.CS3: Establish and communicate easy-to-understand policies, processes, and procedures that impact minors' privacy.
IN.CS4: Establish and communicate classifications, ratings, and labels for age-appropriate content (e.g. MPAA-style ratings).
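
IN.CT3 mentions scrubbing or blurring faces from datasets when bystander consent is unavailable. The sketch below shows one possible way to do that on a captured frame, assuming the third-party opencv-python package; the bundled Haar cascade detector and the file names are stand-ins chosen for illustration, not a recommended production detector.

    # Minimal sketch of bystander de-identification (IN.CT3): detect faces in a
    # captured frame and blur them before the frame is stored. Assumes the
    # third-party opencv-python package; the Haar cascade bundled with OpenCV is a
    # simple stand-in for a production-grade detector.
    import cv2

    def blur_faces(frame):
        """Return a copy of the frame with detected face regions blurred."""
        detector = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        redacted = frame.copy()
        for (x, y, w, h) in faces:
            roi = redacted[y:y + h, x:x + w]
            # A heavy Gaussian blur makes the face unrecognizable in the stored copy.
            redacted[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 30)
        return redacted

    if __name__ == "__main__":
        frame = cv2.imread("captured_frame.jpg")          # hypothetical input file
        if frame is not None:
            cv2.imwrite("captured_frame_blurred.jpg", blur_faces(frame))

A Haar cascade will miss occluded or profile faces, so a deployment would pair such blurring with the deletion requirements and bystander preference mechanisms the control also lists.
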
IN.CS5: Inform and emphasize that minors' exposure to XR and spatial computing may have unintended consequences on human brain development due to the lack of research and insufficient data to establish facts.
IN.CS6: Inform individuals of parental control mechanisms to mitigate online harm and help maintain minors' privacy.
IN.CS7: Establish and maintain special provisions to inform minors anytime the XR or spatial computing environment has been altered from its original state.
MN.AT1: The workforce is informed and trained on its roles and responsibilities and how they impact individual privacy.
MN.AT2: Senior executives understand their roles and responsibilities in managing individual privacy.
MN.AT3: Any interdisciplinary, cross-functional internal privacy task force or working group, including members of legal, government relations, IT/IS, public relations/marketing communications, and other relevant privacy personnel, understands its roles and responsibilities as well as its impact on individual privacy.
MN.AT4: Third parties (e.g. service providers, customers, partners) understand their roles and responsibilities.
MN.MR1: Privacy risk is re-evaluated on an ongoing basis and as key factors change, including the organization's business environment (e.g. introduction of new technologies), governance (e.g. legal obligations, risk tolerance), data processing, and systems/products/services.
MN.MR2: Review and update privacy values, policies, and training to manage individual privacy risks.
MN.MR3: Establish policies, processes, and procedures for assessing compliance with legal requirements to appropriately mitigate legal and compliance risks.
MN.MR4: Privacy risk management processes, procedures, and policies are established, and progress is monitored, tracked, and communicated to organizational stakeholders.
MN.MR5: Establish policies, processes, and procedures to receive, analyze, and respond to problematic data processing activities gathered from internal and external sources (e.g. internal discovery, privacy researchers, professional events).
MN.MR6: Establish policies, processes, and procedures that incorporate lessons learned from privacy incidents and unintended data disclosures.
MN.MR7: Establish policies, processes, and procedures for receiving, tracking, and responding to privacy-related complaints or concerns from individuals and external entities.
MN.DP1: Identify, establish, assess, and manage policies, processes, and procedures enabling data review, transfer, sharing or disclosure, alteration, and deletion for data processing ecosystem risk management.
MN.DP2: Identify, establish, assess, and manage data processing ecosystem parties (e.g. service providers, customers, partners, product manufacturers, application developers) using a privacy risk assessment process.
MN.DP3: Implement appropriate and legally admissible measures to mitigate privacy-related risks (e.g. written contracts and agreements) with data processing ecosystem parties.
MN.DP4: Establish data retention policies, procedures, and processes to minimize privacy-related risks and enable individuals' data processing preferences and requests (see the sketch after this list).
MN.DP5: Build and maintain parental control mechanisms to mitigate online harm and help maintain minors' privacy.
MN.DP6: Establish mechanisms to gather appropriate levels of consent from parents/guardians to mitigate privacy risks to minors.
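
MN.DP4 (and IN.CT2 above) call for retention policies under which data is deleted once it has served its purpose. Below is a minimal sketch of a periodic retention sweep; the record layout, the purpose names, and the retention periods are assumptions made for illustration.

    # Minimal sketch of retention enforcement (MN.DP4, IN.CT2): a periodic sweep
    # keeps only records still within their retention period. The record layout,
    # purposes, and periods are illustrative assumptions.
    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone
    from typing import Dict, List, Optional

    RETENTION: Dict[str, timedelta] = {
        "session-telemetry": timedelta(days=30),    # assumed retention period
        "support-tickets": timedelta(days=365),     # assumed retention period
    }

    @dataclass
    class StoredRecord:
        record_id: str
        purpose: str
        collected_at: datetime

    def sweep(records: List[StoredRecord], now: Optional[datetime] = None) -> List[StoredRecord]:
        """Return the records still within their retention period; the rest expire."""
        now = now or datetime.now(timezone.utc)
        kept = []
        for record in records:
            limit = RETENTION.get(record.purpose, timedelta(0))  # unknown purpose: retain nothing
            if now - record.collected_at <= limit:
                kept.append(record)
            # else: in a real system, delete or anonymize the record and log the action
        return kept

    old = StoredRecord("r1", "session-telemetry",
                       datetime.now(timezone.utc) - timedelta(days=45))
    fresh = StoredRecord("r2", "session-telemetry",
                         datetime.now(timezone.utc) - timedelta(days=5))
    assert [r.record_id for r in sweep([old, fresh])] == ["r2"]
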
MN.DP7: Mechanisms for authorizing data processing (e.g. organizational decisions, individual consent), revoking authorizations, and maintaining authorizations are established and in place, along with measures to permanently delete the data (i.e. the right to be forgotten).
MN.DP8: Establish and maintain the universal standards clearly identifying the permissions to augment, scan, and/or record (Universal DNC/DNA governing bodies to indicate public, private, and restricted areas).
MN.DP9: Provide individuals and organizations opt-in mechanisms before data collection and processing takes place.
MN.DP10: Establish mechanisms to facilitate individuals' rights (right to be forgotten, right to portability, etc.).
MN.SC1: Classify and define the data types that may be considered sensitive due to elevated privacy risks when collected, processed, and/or stored.
MN.SC2: Establish and maintain policies and processes to mitigate the privacy risks of collecting, storing, and processing biometrically inferred data (e.g. data inferred from eye tracking, facial recognition, gait, gaze, pose, body movement, thought patterns).
MN.SC3: When using biometrically inferred data for automated profiling, consider risks to vulnerable populations, minors, and underrepresented communities, and establish mechanisms to mitigate privacy risks by limiting the identification of individuals (e.g. on-device processing, de-identification techniques, tokenization, obfuscation, aggregation for storage or processing).
MN.SC4: Establish and maintain privacy-by-design and privacy-by-default principles while collecting, processing, or storing biometrically inferred data (e.g. consider decentralized or distributed architectures).
MN.SC5: Implement defense in depth and utilize data protection mechanisms (e.g. encryption, pseudonymization, data actions performed on local devices, privacy-preserving cryptography) while collecting, processing, and/or storing biometrically inferred data (see the sketch after this list).
MN.SC6: Establish measures (e.g. takedown request processing) to address privacy and safety risks related to user-generated content (UGC), such as content moderation for political use of data, misinformation, deepfakes, cyberbullying, hate speech, malicious bots, sex trafficking, terrorism, and other nefarious online behavior.
PR.DP1: Create and maintain a baseline configuration of information technology incorporating security principles (e.g. the concept of least functionality).
PR.DP2: Establish and implement change control and management processes.
PR.DP3: Conduct, maintain, and test backups of information.
PR.DP4: Policy, regulatory, compliance, and contractual requirements are taken into consideration when putting data protection in place.
PR.DP5: Regularly test the effectiveness of data protection technologies and controls.
PR.DP6: Response, recovery, and vulnerability management plans (e.g. incident response, incident recovery, disaster recovery, and business continuity plans) are established, managed, and exercised routinely.
PR.DP7: Establish a Data Protection Officer (DPO) and coordinate with data protection agencies and other privacy watchdog entities, as required by law.
PR.AC1: Identities and credentials are issued, managed, verified, revoked, and audited for authorized individuals, processes, and devices.
PR.AC2: Physical safety considerations are taken into account and individuals are made aware of the shared responsibilities.
PR.AC3: Remote access is limited and managed to adhere to the least-privilege principle.
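
MN.SC5 lists pseudonymization among the mechanisms for protecting biometrically inferred data. The sketch below pseudonymizes the subject identifier with a keyed HMAC before a gaze sample is stored, so stored records cannot be linked back to the person without the secret key. The key handling (an environment variable) and all names are simplifying assumptions; a real deployment would manage the key in a secrets manager.

    # Minimal sketch of pseudonymization for biometrically inferred data (MN.SC5):
    # a keyed HMAC replaces the direct identifier so stored gaze samples cannot be
    # linked to the person without the secret key. Key handling via an environment
    # variable is a simplifying assumption for illustration only.
    import hashlib
    import hmac
    import os

    PSEUDONYM_KEY = os.environ.get("PSEUDONYM_KEY", "change-me").encode()  # assumed key source

    def pseudonymize(user_id: str) -> str:
        """Derive a stable, non-reversible pseudonym for a user identifier."""
        return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()

    def store_gaze_sample(user_id: str, gaze_vector: tuple) -> dict:
        """Attach the pseudonym, not the raw identifier, to the stored sample."""
        return {"subject": pseudonymize(user_id), "gaze": gaze_vector}

    sample = store_gaze_sample("user-42", (0.12, -0.30, 0.95))
    print(sample["subject"][:16], sample["gaze"])
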
PR.AC4: Access permissions and authorizations are managed, incorporating the principles of least privilege and separation of duties.
PR.AC5: Access permissions and authorizations are managed, incorporating mechanisms to legally validate consent and data usage of the deceased.
PR.AC6: Network integrity is protected (e.g. network segregation, network segmentation).
PR.AC7: Privacy risks related to minors' age verification and consent are mitigated.
PR.AC8: Integrity-checking mechanisms are in place for digital goods, such as ownership of avatars and validation of true identity, to prevent theft of digital assets and identity.
PR.DS1: Data at rest is encrypted using standard, state-of-the-art, strong, reliable, trusted, and effective encryption schemes (see the sketch after this list).
PR.DS2: Data in transit is encrypted using standard, state-of-the-art, strong, reliable, trusted, and effective encryption schemes.
PR.DS3: Systems, products, services, devices, and associated data are formally managed throughout removal, transfer, and disposition.
PR.DS4: Availability and capacity requirements are established and maintained internally and with third parties via legal contracts and SLAs.
PR.DS5: Implement defense-in-depth data protection mechanisms (e.g. encryption, pseudonymization, local device data processing, privacy-preserving cryptography) and data minimization controls.
PR.DS6: Integrity-checking mechanisms are used to verify software, hardware, firmware, and information integrity.
PR.DS7: The development and testing environment(s) are separate from the production environment.
PR.PH1: Organizations should adopt policies that detail how inappropriate, unlawful, or harassment-related content is handled.
PR.PH2: Organizations should maintain a complaint management system that processes reports, notifies impacted individuals within a reasonable time frame, and allows users to appeal decisions.
PR.PH3: Organizations should consider automated tools that can filter/block clearly identified content and permit geofences where XR content can be privately/closely managed.
PR.PH4: Establish and maintain mechanisms to inform or indicate to individuals anytime the XR or spatial computing environment has been altered from its original state.
PR.PH5: Establish trust and safety mechanisms to mitigate psychological or physical harm (e.g. safety bubble; kick, block, or isolate violators; isolate self to ensure safety).
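
PR.DS1 requires trusted encryption for data at rest. A minimal sketch follows using the Fernet recipe (authenticated symmetric encryption) from the third-party cryptography package; the file and key handling here are simplifying assumptions, and in practice the key would live in a KMS or HSM rather than in process memory.

    # Minimal sketch of data-at-rest encryption (PR.DS1) using the Fernet recipe
    # (authenticated symmetric encryption) from the third-party "cryptography"
    # package. File names are illustrative; in practice the key would come from a
    # KMS or HSM rather than being generated in process memory.
    from cryptography.fernet import Fernet

    def write_encrypted(path: str, plaintext: bytes, key: bytes) -> None:
        with open(path, "wb") as fh:
            fh.write(Fernet(key).encrypt(plaintext))

    def read_encrypted(path: str, key: bytes) -> bytes:
        with open(path, "rb") as fh:
            return Fernet(key).decrypt(fh.read())

    key = Fernet.generate_key()                         # assumed key source for the sketch
    payload = b'{"gaze_calibration": [0.12, -0.30]}'
    write_encrypted("profile.bin", payload, key)
    assert read_encrypted("profile.bin", key) == payload

Data in transit (PR.DS2) is typically covered separately by TLS on the transport channel rather than by this recipe.
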