Should AI Data Be Controlled for IK Preservation?
Data sovereignty refers to the idea that Indigenous peoples should have control over data that is collected about them, their culture, and their knowledge. This is a key ethical consideration in the use of AI for IK preservation. AI systems should be designed in a way that respects this sovereignty and does not exploit or misuse Indigenous data. This includes ensuring that data is stored securely, that it is used for the purposes for which it was collected, and that it is not used to harm Indigenous communities (ICT Inc, 2022).
Data sovereignty is a critical aspect of Indigenous rights and self-determination. It refers to the principle that data about Indigenous communities, including their knowledge and cultural heritage, should be controlled by those communities. This means that Indigenous peoples should have the right to decide what data is collected, how it is used, and who has access to it. AI systems rely on large amounts of data to function effectively. This data, when it pertains to Indigenous communities, can include sensitive cultural, historical, and personal information. Misuse of this data could lead to cultural appropriation, misrepresentation, or other forms of harm. Therefore, it is crucial that AI systems are designed and used in a way that respects Indigenous data sovereignty (Kukutai & Taylor, 2016).
Achieving data sovereignty in AI for IK preservation involves several steps. First, Indigenous communities must be involved in decision-making processes about data collection and use. This includes deciding what data is collected, how it is collected, and who has access to it. Second, data must be stored securely to prevent unauthorized access or misuse. This could involve using secure servers or other forms of encryption. Third, data use must be transparent. Indigenous communities should be able to see how their data is being used and for what purposes. Finally, there must be mechanisms in place for Indigenous communities to revoke access to their data or request its deletion (Kukutai & Taylor, 2016). By respecting Indigenous data sovereignty, we can ensure that AI is used in a way that benefits Indigenous communities, rather than causing harm.
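The four steps above amount to a data-governance mechanism, and can be sketched in code. The following is a minimal illustration, not an implementation of any existing system: all class and method names (`ConsentRegistry`, `collect`, `access`, `revoke_and_delete`) are hypothetical, and a real deployment would add encryption at rest and community-managed authentication.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Consent granted by a community for one dataset, bound to one purpose."""
    community: str
    purpose: str
    granted: bool = True
    access_log: list = field(default_factory=list)

class ConsentRegistry:
    """Hypothetical registry: purpose-bound consent, audit log, revocation."""

    def __init__(self):
        self._records = {}
        self._data = {}

    def collect(self, dataset_id, community, purpose, data):
        # Step 1: data enters only with an explicit, purpose-bound consent record.
        self._records[dataset_id] = ConsentRecord(community, purpose)
        # Step 2 (secure storage) is elided here; a real system would encrypt at rest.
        self._data[dataset_id] = data

    def access(self, dataset_id, requester, stated_purpose):
        rec = self._records.get(dataset_id)
        if rec is None or not rec.granted or stated_purpose != rec.purpose:
            raise PermissionError("no valid consent for this use")
        # Step 3: every use is logged so the community can audit how data is used.
        rec.access_log.append((requester, stated_purpose, datetime.now(timezone.utc)))
        return self._data[dataset_id]

    def revoke_and_delete(self, dataset_id):
        # Step 4: the community can revoke consent and have the data removed.
        if dataset_id in self._records:
            self._records[dataset_id].granted = False
        self._data.pop(dataset_id, None)
```

The key design choice is that consent is bound to a stated purpose at collection time, so a request for any other use is refused by default rather than requiring the community to object after the fact.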
Cultural Sensitivity
AI systems used for IK preservation should be designed with cultural sensitivity: they should not stereotype or misrepresent Indigenous cultures. This requires careful design and testing of AI systems, as well as ongoing consultation with Indigenous communities to ensure that the systems are culturally appropriate and respectful (ICT Inc, 2022).
A study conducted in Australia’s World Heritage-listed Kakadu National Park demonstrated the importance of cultural sensitivity in AI systems. The study involved the development of knowledge coproduction mechanisms that combined Indigenous knowledge, AI, and technical sources to monitor the health of a culturally significant wetland. The mechanisms developed provided a practical and ethical means of empowering different sources of knowledge for adaptive decision making while respecting and protecting differences in how knowledge is generated, interpreted, and applied (Ecology and Society, 2022).
Another study emphasized the importance of Indigenous knowledge stewardship in seed banks. It outlined a theoretical framework for improving strategy and practice in seed bank institutions to strengthen the protection of Indigenous knowledge through stewardship. The framework rests on institutional adoption of cultural wellbeing and risk management in the context of international legal standards for the consent and use of plant resources, and on the transfer of Indigenous knowledge with access and benefit-sharing arrangements in place (International Journal of Rural Law and Policy, 2015).
If AI systems are fed with wrong or misleading information, they can produce outputs that are harmful or offensive to Indigenous communities. For instance, if an AI system is trained on data that includes stereotypes or misrepresentations of Indigenous cultures, it may perpetuate these stereotypes in its outputs. This could lead to harm by reinforcing harmful stereotypes or by misrepresenting the cultures and knowledge of Indigenous communities. Therefore, it is crucial that the data used to train AI systems is accurate, respectful, and representative of the cultures and knowledge it is intended to preserve and disseminate.
Download the full paper here: Balancing Technology and Tradition