Oracles, Spirits, and Algorithms: AI in Dialogue with African Cosmologies

Moses Adeolu AGOI

ORCiD: 0000-0000-0000-0000

Abstract

This paper brings artificial intelligence into dialogue with African cosmologies. It reads divinatory systems such as Ifá, bone-casting, and cowrie reading as formalized information practices: they externalize memory, encode local ontologies, and apply interpretive routines that turn ambiguous inputs into actionable guidance. Drawing on anthropological and science and technology studies scholarship, the paper compares how oracles and machine-learning systems domesticate uncertainty, handle failure, and sustain trust, and argues that oracular practice anticipates questions of interpretive accountability that contemporary AI governance still struggles to answer.

Building on ethnocomputing, the paper treats African cultural logics, from fractal design and Adinkra symbolism to communal ethics, as design resources for culturally situated AI. It surveys concrete projects, including Adinkra symbol recognition, African-language voice assistants, and digital-ancestor archives, and examines the decolonial demands these raise around data sovereignty, community stewardship, and the protection of sacred knowledge. It closes with practical principles, namely participatory dataset governance, co-design with ritual specialists, plural metrics of success, and hybrid governance councils, for AI futures in which technology amplifies rather than supersedes human and communal wisdom.

Keywords: African Cosmologies, Divination, Artificial Intelligence, Ethnocomputing, Decolonial AI, Data Sovereignty, Interpretive Accountability, Indigenous Knowledge, Explainable AI, Cultural Heritage.

Reading patterns: divination as information practice

Conventional divination (throwing the bones, casting kola nuts, reading the patterns on cowries) is not irrational residue; it is a highly formalized information practice that organizes uncertainty through symbolic taxonomies, controlled randomness, and interpretive rules. Diviners maintain symbolic repositories (objects, signs, verses) that index locally salient categories of kinship, debt, ritual obligation, and transgression, and the interactions between tokens create a structured inference space. By imposing controlled disorder (the toss, the draw, the cast), practitioners reduce an open field of social possibilities to a constrained configuration that foregrounds relational pattern; this is akin to feature selection in data science, where a high-dimensional social field is projected onto a smaller, workable one. Reading the configuration is a trained hermeneutic: apprenticeship inculcates repertoires of attention, the externalization of memory in stories, mnemonic devices, and object associations, and rules for weighing signals and suppressing noise, what social scientists describe as externalizing memory and augmenting cognition through material culture. Viewed in this way, oracular practice is a domain-specific, human-centred algorithm. It externalizes memory (archives of precedent, songs, case histories), encodes local ontologies (what counts as a sign or a relationship), and applies interpretive routines (if-then heuristics, graded probabilities, escalation protocols) that convert ambiguous inputs into actionable advice for individuals and groups. The governance roles of such systems resemble those of present-day predictive technologies: they reduce uncertainty, support triage, and establish shared narratives that synchronize expectations and conduct, but within explicitly social, accountable, and deliberative frameworks rather than opaque, market-driven code. Thinking of divination as an information practice also clarifies its strengths and limits for current scholarship and design. It prefigures an ethics of interpretive responsibility, shows the place of apprenticeship in calibrating probabilistic judgement, and demonstrates how material forms (beads, bones, kola) produce durable artefacts that mediate memory and foresight. Such a reconceptualization not only de-exoticizes useful epistemic processes (structured randomness, embodied heuristics, communal validation) but also corrects tech-centric histories that treat algorithmic prediction as wholly new or unprecedented. These human-centred algorithms, together with the ways communities handle interpretive error, accountability, and contestation, can therefore inform design thinking about socially embedded AI and offer culturally specific models of transparency, audit, and situated expertise.
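
To make the analogy concrete, the following sketch (in Python, purely illustrative) models the three elements named above: a symbolic repository, controlled randomness, and if-then interpretive rules with graded confidence and an escalation protocol. The symbol names, categories, weights, and thresholds are invented placeholders, not a representation of any actual divinatory corpus; the point is only the shape of the procedure.

    import random

    # Illustrative toy model only: the symbol names, categories, and rules below are
    # hypothetical placeholders, not a representation of any actual divinatory corpus.
    SYMBOL_REPOSITORY = {
        "cowrie_up":   {"kinship": 2, "debt": 0, "obligation": 1},
        "cowrie_down": {"kinship": 0, "debt": 2, "obligation": 1},
        "bone_long":   {"kinship": 1, "debt": 1, "obligation": 2},
        "bone_short":  {"kinship": 0, "debt": 1, "obligation": 0},
    }

    def cast(tokens, k=3, seed=None):
        """Controlled randomness: draw k tokens, constraining an open field of
        possibilities to one configuration (analogous to feature selection)."""
        rng = random.Random(seed)
        return rng.sample(tokens, k)

    def interpret(configuration):
        """Trained hermeneutic as if-then heuristics: weigh signals per category,
        attach a graded confidence, and escalate when the reading is ambiguous."""
        weights = {"kinship": 0, "debt": 0, "obligation": 0}
        for token in configuration:
            for category, weight in SYMBOL_REPOSITORY[token].items():
                weights[category] += weight
        total = sum(weights.values()) or 1
        category, score = max(weights.items(), key=lambda kv: kv[1])
        confidence = score / total
        if confidence < 0.5:                       # escalation protocol
            return {"advice": "refer to senior diviner", "confidence": round(confidence, 2)}
        return {"advice": f"attend to {category}", "confidence": round(confidence, 2)}

    reading = interpret(cast(list(SYMBOL_REPOSITORY), seed=7))
    print(reading)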

 

Oracles, magic, and machine learning: a conceptual conversation

Researchers warn against treating AI as "magic", yet the comparison is telling. Anthropological and science and technology studies (STS) research shows that machine-learning systems, like oracles, are configured to domesticate uncertainty by generating probabilistic knowledge, but they do so within different epistemic regimes. In machine learning, predictive authority rests on the statistical processing of large volumes of data and the optimization of probabilistic models. In divination, authority is produced through ritual descent, embodied action, and moral responsibility: the oracle is not merely a technical production but an act that is socially embedded and open to interpretation, challenge, and ethical obligation (Larsson and Viktorelius, 2022). Notably, comparing AI with divination illuminates how each system manages the inevitable gap between prediction and outcome. When machine-learning systems fail, the failure is usually attributed to poor data, poor labelling, or algorithmic bias, and repair is framed as a technical process. In contrast, oracle failure prompts interpretive repair: diviners re-examine symbolic schemes, employ ritual remedies, or re-contextualize the reading within a larger ethical frame. This responsiveness anticipates accountability and mutual responsibility rather than confining failure to a purely technical register (Nikolić, 2023). Trust dynamics also diverge. AI outputs are often framed as neutral, objective, or simply opaque, even though bias and opacity are well-documented problems (Lazaro, 2023). Oracular credibility, by comparison, is relational and dialogic: it rests on the diviner's reputation, apprenticeship, and adherence to ritual protocol. The authority of oracles is thus distributed across social networks and moral economies, whereas algorithmic authority is centralized in infrastructures and institutions that do not necessarily offer transparency or accountability (St. Lawrence, 2024). Finally, the analogy shows that both systems externalize cognitive work and make uncertainty practical: both transform ambiguous signals into structured outputs that support decisions. Yet whereas machine learning frequently aspires to universality, divination is fundamentally local, encoding community-based ontologies and value systems (Grinschgl, 2022). Recognizing these distinctions guards against simplistic conflation and also informs design: the socially embedded interpretive accountability of oracles can guide more relational and transparent approaches to AI governance. The analogy, in this sense, does not claim equivalence; it is a heuristic for seeing how different knowledge systems cope with uncertainty, failure, and trust in culturally significant ways.

 

Ethnocomputing: cultural metaphors as design resources

The field of ethnocomputing proposes that computational design can, and ought to, learn from local cultural logics. Instead of assuming that computation is universal and culture-free, ethnocomputing emphasizes that algorithms, symbolic reasoning, and representational systems are already embedded in cultural practice. African patterns, mnemonic systems, and algorithmic metaphors, such as the fractal architecture of housing compounds and textiles or the divinatory heuristics of Ifá and bone-casting, express elegant, recursive logics that parallel computational reasoning. These cultural technologies provide generative design primitives for AI systems that are technically capable yet socially resonant. Recent research shows that this view has practical consequences for AI development. Building culturally situated AI systems requires reconsidering training data, interfaces, and evaluation metrics. Culturally grounded AI incorporates values such as communal responsibility, relationality, ritual timing, and moral accountability, instead of importing Western metrics such as accuracy, efficiency, or utility as the only yardsticks (Mhlambi, 2021). An AI-driven system inspired by African communal logics, for example, would prioritize consensus-building, intergenerational knowledge, and ecological balance rather than treating prediction as an individual outcome. Such reframing treats African cosmologies as valid sources of design information rather than impediments to modernization. Culturally situated approaches also address ethics and governance. Researchers argue that AI which is not sensitized to local epistemologies risks reproducing digital colonialism, whereby technological infrastructures instil alien values and erode local modes of knowing (Birhane, 2021). Conversely, ethnocomputing suggests that computational design can respect indigenous knowledge practices, producing systems that are not only more inclusive but also more transparent and accountable. Divinatory practices, for instance, model modes of interpretability: oracular outputs always live in context, are discussed, and are interpreted within community norms, an inspiration for explainable AI (St. Lawrence, 2024). Notably, ethnocomputing also drives innovation. African fractal geometries have already informed computer graphics, generative design, and mathematics education. Extending this argument, incorporating African algorithmic traditions into AI design could yield new methods of data structuring, problem-solving heuristics, and user interfaces (Odumosu et al., 2023). In doing so, African cosmologies are re-centred not as residues of the past but as living epistemic assets in the development of globally relevant, culturally responsive AI futures.
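
As a small illustration of the recursive logic this paragraph invokes, the sketch below (assuming Python's standard turtle module) draws a self-similar branching form of the kind Eglash documents in African fractal design. The scaling ratio, angle, and depth are arbitrary illustrative parameters, not measurements of any particular compound or textile.

    # Minimal sketch of recursive self-similarity: each unit spawns smaller copies of
    # itself. Parameters are illustrative placeholders only.
    import turtle

    def branch(t, length, depth, ratio=0.6, angle=35):
        """Draw a self-similar branching form by recursion."""
        if depth == 0 or length < 2:
            return
        t.forward(length)
        t.left(angle)
        branch(t, length * ratio, depth - 1, ratio, angle)   # smaller sub-branch
        t.right(2 * angle)
        branch(t, length * ratio, depth - 1, ratio, angle)   # smaller sub-branch
        t.left(angle)
        t.backward(length)

    if __name__ == "__main__":
        pen = turtle.Turtle()
        pen.speed(0)
        pen.left(90)                 # start pointing upward
        branch(pen, length=120, depth=6)
        turtle.done()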

 

Concrete intersections: projects and everyday uses

These discussions are already producing tangible innovations that put ethnocomputing principles into practice. Practitioners and scholars are building culturally specific AI applications that incorporate African logics, symbols, and languages into working systems. One example is training symbol-recognition models on Adinkra motifs, which serve not only as decorative art but also as a mnemonic system and a source of philosophical wisdom. Such projects show how African semiotic traditions can reshape computer vision, pattern recognition, and symbolic reasoning beyond purely utilitarian design (Adjeisah et al., 2023). Similarly, the problem of linguistic exclusion is being addressed by building African-language corpora and voice assistants in Twi, Yoruba, or Wolof, since Western-centric datasets routinely fail in low-resource environments (AI4D, 2021). Beyond technical infrastructure, innovators are reimagining the uses of AI in cultural preservation and spiritual life. Oral histories, genealogies, and ritual knowledge are being explored in emerging projects of so-called digital ancestors, conversational agents that draw on the long-standing African practice of using media as technologies of memory, as in drum signals or praise poetry. Making these archives interactive with AI helps preserve cultural continuity, but it also raises new ethical concerns: How do we distinguish between living and digital ancestors? What does authenticity mean when ancestral voices are mediated by algorithms? Consent, lineage authority, and community ownership must be negotiated in such systems (St. Lawrence, 2024). These developments sit alongside wider debates about digital sovereignty and epistemic justice. African AI scholars stress that localization is not only a question of access but a guarantee that AI systems support relational ontologies, communal accountability, and spiritual continuity. By making design cosmological, African innovators challenge digital colonialism and show that cosmologies are productive sources of computational insight rather than artefacts to be overcome (Birhane, 2021; Odumosu et al., 2023). This reframing places African contributions at the centre of global AI discourse, where cultural logics do not merely enrich technology but help define it. Finally, such culturally specific AI applications represent a paradigm shift: rather than following Western blueprints, African researchers and communities are developing new AI systems grounded in local knowledge practices. In doing so, they broaden AI's epistemic pluralism and widen the range of ethical frameworks available to govern its role in everyday life.
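
A minimal sketch of the kind of symbol-recognition pipeline described above might look like the following, assuming PyTorch and torchvision and a hypothetical local folder adinkra/ with one sub-directory of images per motif. It is a generic small-classifier skeleton, not the actual method of Adjeisah et al. (2023).

    # Hedged sketch only: a minimal image classifier of the kind used for Adinkra
    # symbol recognition. Directory layout and hyperparameters are hypothetical.
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms

    transform = transforms.Compose([
        transforms.Grayscale(),              # motifs are essentially line art
        transforms.Resize((64, 64)),
        transforms.ToTensor(),
    ])
    dataset = datasets.ImageFolder("adinkra/", transform=transform)
    loader = DataLoader(dataset, batch_size=32, shuffle=True)

    model = nn.Sequential(                    # small convolutional classifier
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(32 * 16 * 16, len(dataset.classes)),
    )
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(5):
        for images, labels in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimizer.step()
        print(f"epoch {epoch}: loss {loss.item():.3f}")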

 

Ethics, power, and decolonial demands

Embedding algorithmic systems in societies with strong spiritual foundations can do real harm if designers overlook power, provenance, and accountability. Decolonial AI scholarship emphasizes that extracting cultural information without communities' ability to regulate its use risks perpetuating epistemic injustice and digital colonialism. Rather than treating sacred practices as raw material for computational extraction, researchers propose a critical technical practice: a commitment to centre local epistemologies, enable communities to control their data, and co-design governance frameworks that guard against algorithmic dispossession (Mhlambi, 2021; Mohamed et al., 2020/2022). A pressing question concerns ownership: who owns digitized divinatory corpora, ritual songs, or family archives once they have been turned into databases? Without explicit custodianship, such resources may be absorbed into global AI platforms where sacred knowledge is flattened into datasets and stripped of its relational, ritual, and moral contexts (Birhane, 2021). A spiritually focused chatbot trained on oral divinatory literature, for example, may appear to democratize access while simultaneously de-legitimizing that knowledge, placing it under the oversight of external authorities and severing it from its ethical embodiments. This concern resonates with broader critiques of data colonialism, in which cultural knowledge is commodified and circulated without accountability (Couldry and Mejias, 2021). Community stewardship is therefore essential. Researchers propose data sovereignty models that guarantee Indigenous and local communities control over the storage, access, and use of their cultural materials. In African terms, this means aligning AI governance with relational ontologies in which knowledge is not personal property but shared heritage (Mhlambi, 2021). Participatory archives, communal licensing regimes, and ritualized data-use protocols are examples of how consent and accountability can be embedded in digital infrastructures (Klein et al., 2023). The stakes, moreover, are not only cultural but spiritual. Digitizing rituals and ancestral voices poses new ontological dilemmas: Where is the border between digital manifestation and spiritual being? How can authenticity be governed and abuse prevented? Such questions demand governance structures that are as mindful of cosmological limits as they are of technical standards. On this view, decolonial AI is not merely a matter of fairness or inclusion but of ensuring that the integrity of living spiritual systems is not commodified. Conducted responsibly, algorithmic engagements with spiritual knowledge can reinforce cultural continuity and foster innovation. Without such care, they risk desacralizing whole worlds in order to mine data, extending the very dispossessions that decolonial thought seeks to oppose.

 

Translation and interpretation: two sites of risk and possibility

Both oracles and AI produce outputs that require interpretation, yet the social and epistemic environments of those outputs differ sharply. Under conventional conditions, diviners are accountable interpreters: they read through the prism of family, ritual authority, and an ethical code. A diviner's authority is relational and is sustained through apprenticeship, community sanction, and adherence to moral standards. AI engineers, by contrast, tend to present model outputs as neutral, decontextualized facts, emphasizing statistical rigour and predictive performance without addressing the social implications of interpretation (Birhane, 2021). This contrast exposes a serious gap: computational outputs, like divinatory ones, are seldom self-explanatory and always require human judgment to become actionable knowledge. Bridging these epistemic worlds calls for interpretive infrastructure. One option is mediated interfaces in which human custodians (elders, ritual specialists, or community representatives) interpret AI outputs together with users. In such hybrid systems, AI acts as a pattern-recognizing aid rather than an autonomous decision-maker: it surfaces insights at scale without making ethical or social judgments, leaving those decisions to responsible humans (Larsson and Viktorelius, 2022; Mhlambi, 2021). Such interfaces can be grounded in culturally familiar heuristics, in visualization metaphors drawn from local symbolic systems, and in participatory feedback loops that reflect shared priorities. For example, an AI conflict-risk analyser drawing on social network data might surface statistically significant clusters, which local mediators then contextualize with historical information, ritual calendars, or relational norms. This kind of human-AI collaboration also pre-empts problems of trust and transparency. Divinatory systems, however open-ended their outputs, are interpretable: the mechanism, sequence, and relational reasoning are visible and verifiable within the community. Developers can build socially responsible AI aligned with community values by making its reasoning legible and by facilitating co-interpretation (St. Lawrence, 2024). Interpretive hybridization additionally fosters reflexivity: humans can question algorithmic outputs, and algorithms can surface hidden patterns that enrich communal knowledge, forming a mutually reinforcing epistemic ecosystem. Finally, hybrid interpretive infrastructures honour both technological and cultural intelligence. They maintain ritual authority and ethical responsibility while drawing on AI's capacity to process large volumes of data and detect hidden relational patterns. This exemplifies a culturally responsive AI paradigm in which outputs are co-constructed, ethically situated, and epistemically plural, a model for bringing machine intelligence into contexts where social, spiritual, and ethical concerns take precedence.
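
One way to picture such a mediated interface is the sketch below: the model only proposes, and a designated human custodian must review, contextualize, and approve before anything is released as advice. All class names, fields, and the example scenario are hypothetical illustrations of the workflow, not an existing system.

    # Hedged sketch of a "mediated interface": the model proposes; a designated
    # human custodian must review and sign off before a result becomes advice.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class ModelSuggestion:
        pattern: str                  # e.g. "elevated dispute risk between lineages A and B"
        confidence: float             # model's own score, not a final judgment
        evidence: List[str]           # traceable inputs, kept visible for co-interpretation

    @dataclass
    class CustodianReview:
        custodian: str                # elder, ritual specialist, or community representative
        accepted: bool
        context_notes: str            # historical, ritual, or relational context added by the reviewer

    @dataclass
    class MediatedDecision:
        suggestion: ModelSuggestion
        review: Optional[CustodianReview] = None

        def advice(self) -> str:
            # No output is released as advice until a custodian has reviewed it.
            if self.review is None:
                return "pending review by a human custodian"
            if not self.review.accepted:
                return f"set aside by {self.review.custodian}: {self.review.context_notes}"
            return f"{self.suggestion.pattern} (interpreted by {self.review.custodian}: {self.review.context_notes})"

    # Example flow: the model surfaces a pattern; a mediator contextualizes it.
    suggestion = ModelSuggestion(
        pattern="elevated dispute risk between lineages A and B",
        confidence=0.71,
        evidence=["co-occurrence in recent grievance records"],
    )
    decision = MediatedDecision(suggestion)
    print(decision.advice())   # pending review by a human custodian
    decision.review = CustodianReview(
        custodian="community mediator",
        accepted=True,
        context_notes="coincides with contested land boundary; raise at next council meeting",
    )
    print(decision.advice())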

Toward culturally grounded AI futures: practical principles

  1. Participatory dataset governance. Cultural knowledge must be treated as a collective trust rather than a resource to be extracted. This involves explicit community approval, provenance metadata, and use-restrictive licensing that limits redistribution or commercial exploitation (Mohamed et al., 2022). Participatory governance also demands ways for communities to audit, revoke, or update contributions, so that digital manifestations of sacred practices remain answerable to the people who created them (a minimal sketch of such a provenance record appears after this list).

  2. Design with ritual specialists. Bringing diviners, elders, and cultural educators into the AI design and interpretation loop helps ensure that outputs align with local epistemologies and ritual frameworks. This kind of cooperation allows AI systems to honour the semantic, moral, and temporal logics of practices such as divination or oral historiography, and helps avoid reducing symbolic complexity to purely statistical models.

  3. Plural metrics of success. Moving beyond single numerical performance indicators, culturally grounded AI should be evaluated along several dimensions: cultural aptness, collective benefit, interpretive faithfulness, and ritual integrity. Measures may include participatory feedback from communities, the preservation of relational and symbolic meaning, and ethical responsiveness to local standards. This pluralistic model ensures that success is defined socially and morally, not only technically.

  4. Hybrid governance councils. Councils of technologists, ethicists, and experts in cultural knowledge can rule on sensitive use cases, especially when spiritual guidance or divinatory outputs are computerized and may be published. Such bodies provide oversight, negotiate between technical and cultural interests, uphold ethical responsibility, and maintain ritual authority.

Taken together, these principles operationalize a decolonial, participatory, and culturally responsive AI framework in which technology augments human and communal wisdom rather than replacing it.
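
As an illustration of the first principle, the sketch below encodes a single cultural item as a record that carries provenance metadata, named consent, a restrictive licence label, an audit log, and a revocation hook. The field names and licence wording are hypothetical, not a standard schema; the point is that consent and revocability are checked before use.

    # Hedged sketch for principle 1: a dataset record with provenance metadata,
    # explicit community consent, an audit log, and a revocation hook.
    from dataclasses import dataclass, field
    from datetime import date
    from typing import List

    @dataclass
    class CulturalRecord:
        item_id: str
        description: str
        custodian_community: str                  # who holds stewardship over this item
        consent_granted_by: List[str]             # named custodians who approved inclusion
        consent_date: date
        licence: str = "community-restricted: no redistribution, no commercial use"
        revoked: bool = False
        audit_log: List[str] = field(default_factory=list)

        def audit(self, note: str) -> None:
            """Record every access or review so the community can inspect use."""
            self.audit_log.append(f"{date.today().isoformat()}: {note}")

        def revoke(self, reason: str) -> None:
            """Custodians can withdraw the item; downstream use must then stop."""
            self.revoked = True
            self.audit(f"revoked: {reason}")

    def usable(record: CulturalRecord) -> bool:
        # A record is usable only with documented consent and while not revoked.
        return bool(record.consent_granted_by) and not record.revoked

    record = CulturalRecord(
        item_id="verse-014",
        description="divinatory verse (recorded with custodial approval)",
        custodian_community="example custodial council",
        consent_granted_by=["senior diviner (name withheld)"],
        consent_date=date(2024, 5, 1),
    )
    record.audit("included in training corpus v0.1")
    print(usable(record))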

 

Closing reflection

The emergence of AI forces questions about epistemic authority: who is allowed to speak on behalf of the future, whose knowledge matters, and how societies handle uncertainty. African cosmologies offer rich, practice-tested frameworks for approaching uncertainty as a structured space of collective sense-making, moral discourse, and relational responsibility. Divinatory systems, ritual calendars, and mnemonic systems encode centuries of accumulated experience of regularities, sequences, and heuristics that guide action without claiming complete certainty. Bridging these practices with AI requires mutual humility and ethical care. Technologists need to recognize that algorithmic predictions are never neutral and that the social, spiritual, and cultural contexts of data matter for their interpretation. Communities, for their part, can approach algorithms with sovereignty, developing governance protocols, data stewardship, and interpretive oversight that safeguard ethical standards and communal authority. On these terms, AI can serve not as a replacement but as an instrument of amplification, surfacing patterns and relationships at scales beyond human attention while remaining answerable to human considerations and morals. The fruitful vision is not to replace oracles with models but to bring them together in hybrid epistemic ecologies where each complements and strengthens the other. Such integration points to a future that is both more predictive and more humane: one in which technological capacity and culturally grounded wisdom come together, allowing societies to anticipate challenges, make wise decisions, and face uncertainty with resilience and relational integrity.

  1. Adjeisah, M., Asamoah, K. O., Yeboah, M. A., et al. (2023). Adinkra symbol recognition using machine learning [arXiv preprint].

  2. Adjeisah, M., Baffour, E. M., & Adomako, A. (2023). Computational recognition of Adinkra symbols using deep learning. Journal of Cultural Analytics, 8(2), 55–72. https://doi.org/10.22148/001c.75369

  3. AI in sub-Saharan Africa: Overview. (2025). [PDF report].

  4. AI4D. (2021). Artificial intelligence for development in Africa: Building inclusive and ethical AI ecosystems. International Development Research Centre (IDRC) and Swedish International Development Cooperation Agency (Sida). https://doi.org/10.53832/IDRC.000028

  5. Birhane, A. (2021). Algorithmic injustice: A relational ethics approach. Patterns, 2(2), Article 100205. https://doi.org/10.1016/j.patter.2021.100205

  6. CDIAL. (2025). Centre for digitization of indigenous African languages (CDIAL). Wikipedia.

  7. Couldry, N., & Mejias, U. A. (2021). The costs of connection: How data colonizes human life and appropriates it for capitalism. Stanford University Press. https://doi.org/10.1515/9781503609754

  8. Croucamp, A. (n.d.). Traditional African divination systems as information technology [PDF].

  9. Eglash, R. (1999). African fractals: Modern computing and indigenous design. Rutgers University Press.

  10. Eglash, R. (2022). African fractals: Modern computing and indigenous design (2nd ed.). Duke University Press. https://doi.org/10.2307/j.ctv2f9dmq7 (Original work published 1999).

  11. Eliseev, E. D., & Marsh, E. J. (2021). Externalizing autobiographical memories in the digital age. Trends in Cognitive Sciences [Advance online publication], 25(12), 1072–1081. https://doi.org/10.1016/j.tics.2021.08.005

  12. Grinschgl, S. (2022). Distributed cognition, external records, and AI-enhanced inference. Frontiers in Artificial Intelligence, 5, Article 908261. https://doi.org/10.3389/frai.2022.908261

  13. Klein, T., Salehi, N., & Irani, L. (2023). Data sovereignty and design: Building community-centered infrastructures. AI and Society, 38(2), 611–624. https://doi.org/10.1007/s00146-022-01562-1

  14. Larsson, S., & Viktorelius, M. (2022). Oracles and algorithms: Trust, uncertainty, and decision-making in socio-technical systems. AI and Society, 37(4), 1451–1463. https://doi.org/10.1007/s00146-021-01257-0

  15. Larsson, S., & Viktorelius, M. (2024). Reducing the contingency of the world: Magic, oracles, and machine-learning technology. AI and Society, 39(1), 183–193. https://doi.org/10.1007/s00146-022-01394-2

  16. Lazaro, C. (2023). Algorithmic divination: From prediction to preemption of the future. Information and Culture, 58(2), 145–165. https://doi.org/10.7560/IC58202

  17. Mhlambi, S. (2021). From rationality to relationality: Ubuntu as an ethical framework for AI in Africa. Philosophy and Technology, 34(4), 1125–1145. https://doi.org/10.1007/s13347-021-00450-2

  18. Mohamed, S., Png, M. T., & Isaac, W. (2022). Decolonial AI: Decolonial theory as sociotechnical foresight in artificial intelligence. Philosophy and Technology, 35(3), 1–23. https://doi.org/10.1007/s13347-022-00511-2 (Original work published 2020).

  19. Nikolić, L. (2023). An astrological genealogy of artificial intelligence: From divination to predictive sciences. European Journal of Cultural Studies, 26(2), 131–146. https://doi.org/10.1177/13675494231164874

  20. Odumosu, T., Adesokan, T., & Adepoju, A. (2023). Ethnocomputing futures: African algorithms and artificial intelligence design. AI and Society [Advance online publication]. https://doi.org/10.1007/s00146-023-01607-3

  21. Omoregie, U. (2024, April 29). An ancient African knowledge system’s resurgence in the age of AI. Le Società; arXiv preprint.

  22. St. Lawrence, E. (2024). The algorithm holy: TikTok, technomancy, and the rise of algorithmic divination. Religions, 15(4), 435. https://doi.org/10.3390/rel15040435

  23. TechCabal. (2025, July 17). AI is learning to speak African languages, thanks to these startups. TechCabal.

  24. White paper on living fate: The Esu artificial intelligence system of choice, chaos and consciousness. (2025) [White paper].

  25. Youvan, D. C. (2023). Digital prophets and ancient oracles: Navigating the future with AI and spirituality [Preprint].

  26. Bawa, N. (2025). Leveraging artificial intelligence to promote lifelong learning for sustainable educational development: A survey of In-Service teachers in Sokoto State, Nigeria. Edumania-An International Multidisciplinary Journal, 3(2), 106–119. https://doi.org/10.59231/edumania/9120

  27. Chauhan, N., & Kumar, M. (2024). Unleashing the potential of artificial intelligence (AI) tools in phytogeographical studies. Shodh Sari-An International Multidisciplinary Journal, 3(4), 47–66. https://doi.org/10.59231/sari7746
