The THEIA-XR project is dedicated to creating technologies that help workers gather more of the information that matters and navigate challenges such as bad weather conditions or difficult operating landscapes.

However, while implementing these technologies, we need to ensure that we are not compromising the privacy of workers. Moreover, we need to take into account the ethical and value-sensitive aspects of work:

What if, by giving the operator new information to rely on, we make them overdependent on that information?

What if the data stored for the operator to reflect on their performance is also used for performance evaluation without the operator's consent?

To tackle these and other questions that have arisen during our work with end-users and other stakeholders, we have started to create guidelines for ethical and privacy-preserving XR implementation in the Off-Highway Machinery scenario.

We are constantly updating these guidelines, and we need your input on the current version. You do not need to be an expert in XR implementations for this specific industry; we are gathering diverse feedback from across the XR domain to find the right way to proceed towards more detailed and focused recommendations.

Feedback on the guidelines below is more than welcome and can be provided through the following survey.

Explanation:

User agency and control over their actions (including privacy-related decisions) in XR are keystones to ensuring ethical soundness and an appropriate level of privacy protection within the systems (Abraham et al., 2024). In the case of XR enhancements in off-highway machinery, the end user is usually already an expert in work-related machine operation and has a well-developed mental model of the process. Adding new elements for the experienced operator to interact with, with no possibility of opting out, may disrupt their workflow and create reluctance and resistance to using the innovations. We therefore advocate for a high level of customization of the proposed XR features. We also stress that if the development team identifies features that cannot be switched off for whatever reason, they should discuss these features in detail with the end users to ensure they will not cause disruption across different modes of use. Specifically for privacy-related features, we suggest applying tailored privacy-settings controls so that the user can understand the consequences of switching certain features on and off and make an informed decision about their preferred level of data sharing.

Our Recommendations:

  • Implement granular controls for feature activation and data collection and sharing (from the user side);
  • In the early stages of the project, decide which features (elements) will have the option to opt-in or opt-out;
  • Provide an explanation for the functionalities from which one cannot opt-out;
  • Ensure that the end-user can easily turn features on or off, including those that collect personal data;
  • Ensure that the end-user is properly informed about the positive and negative effects of activating or deactivating features;
  • Incorporate visual and audio cues to alert users when data collection is active, ensuring they are always aware of their data privacy status.
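
To make the idea of granular, user-side controls more concrete, below is a minimal sketch of per-feature opt-in/opt-out settings with explanations attached to features that cannot be disabled. All names (FeatureSetting, PrivacySettings, gaze_logging) are hypothetical and only illustrate the structure, not a concrete THEIA-XR interface.

```python
from dataclasses import dataclass, field

# Hypothetical sketch only: class and field names are illustrative,
# not part of any THEIA-XR codebase.

@dataclass
class FeatureSetting:
    name: str                    # e.g. "gaze_logging"
    enabled: bool                # current operator choice
    can_opt_out: bool            # decided early in the project for each feature
    collects_personal_data: bool
    explanation: str             # shown to the operator before toggling

@dataclass
class PrivacySettings:
    features: dict[str, FeatureSetting] = field(default_factory=dict)

    def set_enabled(self, name: str, enabled: bool) -> str:
        f = self.features[name]
        if not enabled and not f.can_opt_out:
            # Mandatory features cannot be switched off, but the reason is explained.
            return f"'{f.name}' cannot be disabled: {f.explanation}"
        f.enabled = enabled
        status = "on" if enabled else "off"
        note = ""
        if f.collects_personal_data:
            note = f" Personal data collection is now {'active' if enabled else 'inactive'}."
        return f"'{f.name}' is now {status}.{note}"

# Example: an opt-in feature that records gaze data for self-reflection only.
settings = PrivacySettings(features={
    "gaze_logging": FeatureSetting(
        name="gaze_logging", enabled=False, can_opt_out=True,
        collects_personal_data=True,
        explanation="Stores gaze data locally so you can review your own performance."),
})
print(settings.set_enabled("gaze_logging", True))
```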

Explanation:

The expert and internal research-project stakeholders insisted that, in the current state of the art, the machines hardly collect any significant amount of personal data and that this data is usually stored inside the machines. However, with the further development of technologies such as teleoperation, vehicle fleet management, and performance optimization, the need to collect and store more data will rise. This could include data about the operator’s actions and the surroundings (including the possibility of collecting data on bystanders). This means that an important part of developing XR enhancements for the future of off-highway machinery will be understanding the requirements for more extensive data collection and addressing them following the privacy-by-design paradigm (Art. 25 GDPR) to ensure that the collection is privacy-preserving and ethical.

Our Recommendations:

  • Determine high-level purposes for the data collection (e.g., increase efficiency, develop a more fulfilling working experience) and verify them with relevant stakeholders and end-users;
  • Specify where the collected data will be stored (locally on the machine or in the cloud), how long it will be stored, and for which purposes it will be used now and in the future;
  • Minimize data collection to only what is necessary for XR functionality and safety (prioritize data minimization);
  • If the data collection for a system's training includes personal data, the end-user (operator) should be informed about it.
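
As an illustration of the recommendations above, the following sketch shows one possible way to describe collected data in a machine-readable manifest (purpose, storage location, retention period, personal-data flag) and to apply data minimization against the currently active features. All item names, purposes, and retention periods are hypothetical placeholders that would need to be agreed with stakeholders.

```python
from dataclasses import dataclass
from enum import Enum

class Storage(Enum):
    LOCAL = "on-machine"
    CLOUD = "cloud"

@dataclass(frozen=True)
class DataItem:
    name: str                # e.g. "joystick_inputs"
    purpose: str             # high-level purpose verified with stakeholders
    storage: Storage         # where the data is kept
    retention_days: int      # how long it is kept
    personal: bool           # True -> operator must be informed
    required_for: frozenset  # XR features / safety functions that need this item

def minimized(manifest: list[DataItem], active_features: set[str]) -> list[DataItem]:
    """Data minimization: keep only items needed by currently active features."""
    return [d for d in manifest if d.required_for & active_features]

manifest = [
    DataItem("joystick_inputs", "performance reflection", Storage.LOCAL, 30,
             personal=True, required_for=frozenset({"performance_replay"})),
    DataItem("proximity_scan", "collision avoidance", Storage.LOCAL, 7,
             personal=False, required_for=frozenset({"obstacle_highlighting"})),
]
print([d.name for d in minimized(manifest, {"obstacle_highlighting"})])
```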

Explanation:

Auditability supports the transparency and accountability of XR systems and, therefore, improves the ethical and privacy dimensions of the development. Effective auditability supports stakeholders' understanding, scrutiny, and review, which is important given the integration of XR technologies with their physical (operational) environments and existing workflows.

Auditability is extremely important in multi-stakeholder development and integration, as in the case of off-highway machinery XR implementations. It helps create mechanisms that enable the reconstruction or unpacking of a system's data outputs and inputs, detailing how they were used and their operational context. In the off-highway machinery domain, we mostly focus on auditability in the context of incident investigation and analysis. The stakeholders agree that it is necessary to have a procedure that can handle extreme cases of operation (e.g., operation in difficult weather conditions) and/or incidents. When the vehicle or operator identifies the situation as extreme, the vehicle should be able to collect more data than during the standard operation mode and save it for further analysis. At the same time, this type of data collection and handling should be covered by the developed privacy-handling procedure and clearly communicated to the end-user.

Our Recommendations:

  • Implement mechanisms for stakeholders and auditors to verify data collection practices and usage of XR implementations;
  • Specify scenarios and use cases in which an audit happens (e.g., systematic underperformance of the vehicle, incidents, or predefined operation conditions such as operation in zero visibility);
  • Designate specific data points that are critical for auditing purposes (e.g., the switching of XR features on and off during the operation cycle);
  • Develop a protocol for the internal audit of data collection practices and define the role of the internal auditor who can determine which specific data are needed for the audit;
  • Engage a range of stakeholders/developers in the repeated audit process (at the design and deployment stages of the XR implementation) to ensure that the audit data serves the interests of all parties involved (e.g., the operating company, vehicle developers, and external investigators in case of incidents);
  • End-users should have clear and accessible information about what data is being audited and how it is used (e.g., in case of incident handling);
  • End-users should have mechanisms to review or request deletion of their audit data after the audit, if applicable (e.g., if the system collects personal data).
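
The sketch below illustrates one possible way to implement the extended, incident-oriented data collection described above: extra signals are only recorded after the situation has been flagged as extreme, and every record keeps its operational context so auditors can later reconstruct what was collected and why. The signal names and the API are assumptions for illustration only.

```python
import datetime
import json

# Hypothetical sketch: audit-oriented extended logging that is only active when
# the vehicle or operator flags an extreme situation (e.g., zero visibility).

class AuditLogger:
    def __init__(self, standard_signals: set, extended_signals: set):
        self.standard = standard_signals    # data points collected in normal operation
        self.extended = extended_signals    # extra points reserved for flagged situations
        self.extreme_mode = False
        self.records = []

    def flag_extreme(self, reason: str) -> None:
        self.extreme_mode = True
        self._record("extreme_mode_entered", reason)

    def log(self, signal: str, value) -> None:
        allowed = self.standard | (self.extended if self.extreme_mode else set())
        if signal in allowed:
            self._record(signal, value)

    def _record(self, signal: str, value) -> None:
        self.records.append({
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "signal": signal,
            "value": value,
            "extreme_mode": self.extreme_mode,  # operational context for auditors
        })

    def export_for_audit(self) -> str:
        # Auditors receive a self-describing record of what was collected and when.
        return json.dumps(self.records, indent=2)

logger = AuditLogger({"speed", "xr_feature_toggles"}, {"cabin_audio", "full_video"})
logger.log("full_video", "frame_0001")      # ignored: not in extreme mode
logger.flag_extreme("operation in zero visibility")
logger.log("full_video", "frame_0002")      # now recorded for incident analysis
```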

Explanation:

A serious challenge with implementing XR features into the working processes of off-highway machinery is reaching the right level of trust in the system.

On the one hand, experienced operators may find the new features distracting, diverting their attention from the real-world environment (and so they handle these features with suspicion and reluctance). On the other hand, junior operators exposed to these features from the very beginning might develop a habit of relying on them excessively, which can create different types of risks (e.g., mixing real and virtual objects and lowering awareness of real-world objects) (Abraham et al., 2023). To mitigate this problem, we need to ensure that the implemented XR-enhancement system allows the operator to customize its features, explains how the features work, and provides clear cues about the boundaries of trustworthiness to prevent overreliance.

Our Recommendations:

  • Conduct an assessment of each feature to identify any aspects that might lead operators to misinterpret or overly depend on virtual information;
  • Develop an interface that can be adjusted to the operator's experience level and incorporates customizable features, allowing operators to tailor the XR features to their preferences;
  • For features that cannot be turned off or adjusted, the operator should receive a clear explanation of why the feature cannot be disabled;
  • Implement training programs that focus on explaining the boundaries of the technologies, their potential pitfalls, and the optimal way of using virtual information;
  • Ensure that end users remain able to efficiently control the operation process and make decisions independently.
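
As a rough illustration of trust calibration, the sketch below adapts the number of displayed XR cues to the operator's experience level and labels low-confidence cues as unverified instead of hiding the uncertainty. The profile names, fields, and confidence threshold are placeholders, not tested design values.

```python
# Hypothetical sketch: adapt the density of XR cues to the operator's experience
# level and label each cue's trustworthiness so operators can calibrate reliance.

EXPERIENCE_PROFILES = {
    "novice": {"max_cues": 8, "show_explanations": True},
    "expert": {"max_cues": 3, "show_explanations": False},
}

def select_cues(detections: list, profile_name: str, min_confidence: float = 0.6) -> list:
    profile = EXPERIENCE_PROFILES[profile_name]
    cues = []
    for d in sorted(detections, key=lambda d: d["confidence"], reverse=True):
        # Low-confidence cues are shown as "unverified" rather than hidden,
        # so the operator knows where not to over-rely on the virtual layer.
        label = "verified" if d["confidence"] >= min_confidence else "unverified"
        cues.append({**d, "trust_label": label,
                     "explanation": d.get("why") if profile["show_explanations"] else None})
        if len(cues) >= profile["max_cues"]:
            break
    return cues

detections = [{"object": "person", "confidence": 0.95, "why": "thermal and lidar agree"},
              {"object": "rock", "confidence": 0.40, "why": "single-sensor detection"}]
print(select_cues(detections, "novice"))
```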

Explanation:

As shown in previous works, capturing bystanders’ information creates a risk to their privacy rights due to the potential for connecting the data with external datasets.

To mitigate these risks, XR implementations should be designed to minimize the storage of bystanders’ data and opt for technologies that collect less personally identifiable information. Additionally, adopting privacy-preserving techniques can help protect bystanders’ identities even if their data is incidentally collected. It is also necessary to inform bystanders about possible data collection in the area.

Our Recommendations:

  • Prioritize methods of information collection that are less privacy-invasive (e.g., using thermal cameras instead of video capture for human recognition in the workplace);
  • Design multiple mechanisms to protect the identity of bystanders, including anonymization techniques and appropriate cryptographic schemes for handling data that may contain potentially private information (e.g., obstacle-detection data);
  • Develop methods to notify bystanders about data collection; we suggest duplicating the warnings to ensure they are not missed (e.g., in the case of off-highway machinery, warnings about the collection of surrounding information can be placed both on the vehicle and near the entrances to the operation area);
  • If, for any reason, data anonymization is not possible, establish a process for bystanders to request the deletion of their data.
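
To illustrate the anonymization recommendation, here is a minimal sketch that blurs detected person regions in a camera frame before anything is stored or transmitted. It assumes OpenCV is available on the device; the person detector itself is not shown and its output format is a placeholder.

```python
import cv2          # OpenCV; assumed to be available on the edge device
import numpy as np

def anonymize_bystanders(frame: np.ndarray, person_boxes) -> np.ndarray:
    """Blur detected person regions before a frame is stored or transmitted.

    `person_boxes` is a list of (x, y, w, h) boxes produced by whatever person
    detector the platform uses (hypothetical here); only the anonymized frame
    should ever leave the machine.
    """
    out = frame.copy()
    for (x, y, w, h) in person_boxes:
        roi = out[y:y + h, x:x + w]
        # A strong Gaussian blur removes identifying detail while keeping the
        # obstacle visible enough for collision-avoidance overlays.
        out[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
    return out
```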

Explanation:

In the domain of XR solutions for off-highway machinery, AI can be used for developing and updating algorithms for collision avoidance, optimizing user control and feedback during teleoperation, and providing real-time suggestions for performance optimization by guiding the use of visual and XR cues.

However, it is necessary to tailor these solutions to the domain of off-highway machinery using data from machine and operator performance. In a broader sense, the development of any AI solutions for XR-related implementations in the off-highway vehicles domain should also follow the general guidelines for ethical AI development, including addressing the questions of transparency, reliability, and justice, among others.

Our Recommendations:

  • Any algorithm implemented in XR solutions should be tested for fairness (e.g., in the case of detecting humans on the operation site, it should recognize people with different skin colours equally well);
  • Prioritize the use of transparent and explainable algorithms whenever possible;
  • Ensure that the data used to fine-tune the algorithms for the domain of off-highway machinery does not include privacy-sensitive information;
  • If the algorithms are fine-tuned on data from real operations, inform the operator of the use of the data for that purpose.
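
As an example of the fairness testing recommended above, the following sketch computes per-group detection recall on a labelled evaluation set and flags groups that fall below a chosen threshold. The group labels and the 0.9 threshold are placeholders for project-specific fairness criteria.

```python
from collections import defaultdict

def recall_by_group(samples: list, min_recall: float = 0.9):
    """Per-group detection recall on a labelled evaluation set.

    Each sample is a dict like {"group": "...", "detected": True/False}; the
    group labels and threshold are placeholders for project-specific criteria.
    """
    hits, totals = defaultdict(int), defaultdict(int)
    for s in samples:
        totals[s["group"]] += 1
        hits[s["group"]] += int(s["detected"])
    report = {g: hits[g] / totals[g] for g in totals}
    failing = [g for g, r in report.items() if r < min_recall]
    return report, failing

report, failing = recall_by_group([
    {"group": "group_a", "detected": True},
    {"group": "group_b", "detected": False},
])
print(report, failing)   # flags any group whose recall falls below the threshold
```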