When binary systems of classification collide with the public expression of fluid identities, misrepresentation becomes both a risk and a reality. An alarming consequence of this collision is that segments of the population may be rendered invisible, or ambiguous, in the eyes of AI-Assemblages.

This erasure from view has far-reaching consequences for access to housing, employment, education, and healthcare. Moreover, persons in these invisible population segments can face extreme hardship in advocating for their rights and liberties.

Erasure is the act of minimizing the other. In “In the Castle of My Skin”, George Lamming presents the plantocracy as a psychic and social apparatus. Power in plantation societies implies a process of social negotiation. And because AI is made by humans, the risks of negation remain real long after the plantocracy.

And because power is unequally distributed within social groups, some parties to that negotiation are more present than others. This is where erasure is loosely coupled to identity in the history and politics of colonization and decolonization in Latin America and the Caribbean (LAC).

This negation calls for regulators and designers of AI-Assemblages to approach these machines with a hermeneutics of suspicion. Identity is a compilation of triumphs and tribulations, shaped by the categories available in a given historical and cultural period.

Identities are the product of the forces that animate them. When faced with an existential conflict with another that may end in erasure, moral commitment to rituals of resistance becomes most consequential.

A central theme in the painting and poetry of LAC is the unrelenting search for identity. Through a range of important paintings and poems from Europe and LAC, one can discern the antagonism between the cultural heritage of places where the vessels are whole and timeworn, and societies made from the painful gathering of ill-fitted fragments stolen from ancestral shrines.

According to the Cuban painter Wifredo Lam, Africa has been dispossessed not only of many of its peoples but also of its historical consciousness. Lam’s art sought to relocate Black cultural objects within their own world and landscape. Painting became an act of decolonization: not in a physical sense, but in a mental one.

Just as AI propagates the differences ascribed within the heterosexual matrix, reproducing stereotypical relations of power, it also breeds intricate troubles for people who do not identify with any of those gendered relations.

The social contours of the LAC mindscape cannot be fixed by artificial intelligence. AI-Assemblages mirror an unfair society and risk cementing and worsening prejudice, stereotypes, inherited inequality, and intergenerational immobility beneath a veil of machine objectivity.

Algorithms do not, by themselves, make societies unequal or unfair. AI is also both genderless and sexless. Yet it was once the case that the standard settings of artificially intelligent voices were female by default. Discrimination by default is now in hasty retreat.
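The mechanics of such a default are mundane. The following minimal sketch assumes a hypothetical assistant configuration object; the field names and values are invented for illustration and are not drawn from any real SDK. It shows how a gendered stereotype can ship silently whenever integrators accept the defaults:

```python
# Hypothetical illustration of "discrimination by default".
# The parameters (voice, persona) are invented for this sketch;
# they do not refer to any real assistant API.

from dataclasses import dataclass

@dataclass
class AssistantConfig:
    language: str = "en-US"
    voice: str = "female"       # gender baked in as a silent default
    persona: str = "assistant"  # a subservient social role, also by default

# Most integrators never override defaults, so the stereotype ships at scale.
config = AssistantConfig()
print(config.voice)  # -> "female", chosen by no one in particular
```

Because most deployments never override their defaults, a choice made once by a design team is repeated at the scale of every device shipped.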

These voice agents were once programmed with humanlike traits of gender and personality, and designed to enact a specific social role – that of an assistant – in the functions they perform and in the messages they transmit.

The pervasive use of female voices prompted a clerical discourse around these agents. Such gendered constructions sustain the view that women should be available on call and be of service.

This design defect drew on age-old stereotypes of typists and housewives. Its architects overlooked the lives of Ho Ching, Joan of Arc, Harriet Tubman, Ruth Bader Ginsburg, and Vera Wang.

Gender-related design choices channel users’ perception and imagination toward stereotypes and antiquated notions of the gendered and sexual division of work by marking certain tasks as feminine.

LGBTQ+ people are often minimized by traditional systems of understanding and organization. This makes it important to monitor critically the role AI systems may play in sustaining or entrenching marginalization in this area.

Another area where AI-Assemblages pose problems for people with queer identities is automated gender recognition (AGR), whereby gender is inferred from individuals’ textual and visual communication and from legal names. This may involve facial recognition systems that gauge the shape of jawlines and cheekbones, or detect whether a person is wearing makeup; where body scanners are used in airports, it can extend to the entire physical figure.
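The structural flaw is easy to make concrete. The sketch below is hypothetical: the feature names and threshold are invented, and real systems use learned models rather than hand-set weights. What it shares with deployed AGR is the decisive property that the output space contains exactly two labels:

```python
# Hypothetical sketch of a binary AGR classifier. Features and weights are
# invented for illustration; the structural point is that the model can only
# ever answer "male" or "female", so anyone outside those labels is erased.

def infer_gender(jawline_width: float, cheekbone_ratio: float,
                 wearing_makeup: bool) -> str:
    score = 0.6 * jawline_width - 0.3 * cheekbone_ratio
    if wearing_makeup:
        score -= 0.5
    # There is no output for non-binary, intersex, or simply atypical faces,
    # and the model cannot abstain: misrecognition is guaranteed by design.
    return "male" if score > 0.0 else "female"

print(infer_gender(jawline_width=1.2, cheekbone_ratio=0.9, wearing_makeup=False))
```

However sophisticated the model behind the score, a two-label output space makes erasure a property of the design rather than an occasional error.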

Transgender, non-binary, and intersex applicants for social services may encounter minimization owing to biased training data. These population segments may also experience problems during employment screening and are at risk when financial institutions conduct credit assessments using automated systems.
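How biased training data produces minimization can be shown in a few lines. The records and codes below are invented for illustration; the mechanism is that a pipeline built on historical data coded only “M”/“F” has no valid path for anyone outside those codes:

```python
# Hypothetical sketch of minimization through biased training data.
# A screening pipeline trained only on two historical gender codes treats
# every other marker as invalid input rather than as a person.

TRAINING_CODES = {"M", "F"}  # the only values the historical records contain

def screen_applicant(record: dict) -> str:
    code = record.get("gender_marker")
    if code not in TRAINING_CODES:
        # Out-of-vocabulary identities are routed to rejection or delay:
        # erasure disguised as an ordinary data-validation step.
        return "REJECTED: unrecognized gender marker"
    return "SCORED: proceeds to the automated eligibility model"

print(screen_applicant({"name": "A. Doe", "gender_marker": "X"}))
# -> "REJECTED: unrecognized gender marker"
```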

In some European countries, the number of people holding a gender recognition certificate is low relative to the total number of those whose gender identity differs from the sex recorded when their birth was registered.

This contradiction between the lifeworlds and lived experiences of transgender, non-binary, and intersex individuals and their recorded data creates a risk of minimization in interactions with automated tools in banking, talent attraction, athletics, and other sectors.
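The mechanism of that risk can be sketched as a naive record-consistency check. The field names below are invented; the pattern, however, is common wherever identity verification compares a legal marker against a self-reported profile:

```python
# Hypothetical sketch of the document/profile mismatch mechanism.
# When few people hold an updated certificate, the legal marker and the
# lived identity frequently disagree, and a naive consistency rule reads
# that disagreement as fraud risk rather than as an ordinary fact of life.

def identity_check(legal_sex_marker: str, self_reported_gender: str) -> str:
    if legal_sex_marker != self_reported_gender:
        # KYC-style rule: any discrepancy raises a flag, so the burden of
        # proof falls on the transgender or intersex customer.
        return "FLAGGED: document/profile mismatch, manual verification required"
    return "CLEARED"

print(identity_check(legal_sex_marker="M", self_reported_gender="F"))
```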

These binary-predisposed decision-making models are compounded by the “explainability” problem, whereby we sometimes struggle to determine how AI-Assemblages arrive at a specific outcome.
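A toy example makes the problem tangible. The weights below are arbitrary and the network is far smaller than anything deployed, but the point carries: the decision is fully determined, yet nothing in the numbers reads as a human-auditable rule that a flagged person could contest:

```python
# Hypothetical sketch of the explainability problem: a tiny two-layer
# network with fixed, arbitrary weights still decides, but "why?" has no
# answer beyond the weights themselves.

import math

W1 = [[0.8, -1.3, 0.4], [-0.6, 0.9, 1.1]]   # input -> hidden weights
W2 = [1.5, -2.0]                             # hidden -> output weights

def decide(features):
    hidden = [math.tanh(sum(w * x for w, x in zip(row, features))) for row in W1]
    score = sum(w * h for w, h in zip(W2, hidden))
    return "approve" if score > 0.0 else "deny"

# The outcome exists; a human-readable justification does not.
print(decide([0.2, 1.0, -0.5]))  # -> "deny"
```

If even this toy model resists plain-language justification, models with millions of parameters leave the person minimized by a binary decision with little ground on which to appeal.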