Artificial intelligence is often presented as a neutral and forward-looking force that improves efficiency and removes human bias from decision-making. In practice, however, many women working in Indonesia’s gig economy experience these systems very differently. Rather than easing workloads, AI-driven platforms are intensifying existing pressures.
Recent research examining female gig workers introduces the concept of “AI colonialism.” This idea describes how older patterns of domination continue through digital systems. In this framework, powerful technology actors, largely based in wealthier regions, extract labour, data, and economic value from workers in developing countries, reinforcing unequal global relationships. The structure resembles historical colonial systems, but operates through algorithms and platforms instead of direct political control.
In Indonesia, platforms such as Gojek, Grab, Maxim, and Shopee rely heavily on informal workers. These companies have not transformed the nature of employment. Instead, they have digitised an already informal labour market. Workers are labelled as independent “partners,” which excludes them from basic protections such as minimum wages, paid sick leave, and maternity benefits. Earnings depend entirely on the number of completed tasks and algorithm-based performance scores.
For women, this structure intersects with what is often described as the “double burden,” where paid work must be balanced alongside unpaid domestic responsibilities. One delivery worker, Lia, begins her day before sunrise by preparing meals and organising her children’s routines. Only after completing these responsibilities can she log into the platform. As she explains, the system recognises only whether she is online, not the constraints shaping her availability.
Platform algorithms prioritise continuous, uninterrupted activity. Incentive systems often require completing a fixed number of orders within strict time windows. For workers managing caregiving roles, this creates structural disadvantages. Logging off to attend to family responsibilities can result in lost bonuses, while reducing work hours due to fatigue or health issues leads to declining performance metrics.
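To make the mechanism concrete, the sketch below models a quota-and-window bonus alongside a rolling performance average. It is a hypothetical illustration only: Gojek, Grab, Maxim, and Shopee do not publish their scoring logic, so every field name, threshold, and weight here is an assumption, not documented platform behaviour.

```python
# A minimal sketch of how a quota-and-window incentive can penalise
# interrupted availability. Hypothetical model: the platforms' real
# algorithms are proprietary, so all thresholds below are assumptions.
from dataclasses import dataclass

@dataclass
class Shift:
    hours_online: float   # hours the worker stayed logged in
    orders_done: int      # orders completed during the shift

QUOTA = 12                # assumed orders required to unlock the bonus
WINDOW_HOURS = 8          # assumed length of the bonus window

def bonus_earned(shift: Shift) -> bool:
    """All-or-nothing: logging off mid-window to handle caregiving
    forfeits the bonus even if the per-hour rate was high."""
    return shift.hours_online >= WINDOW_HOURS and shift.orders_done >= QUOTA

def performance_score(history: list[Shift]) -> float:
    """A rolling average of completed orders. A sick day or a shortened
    shift enters the average as low output, so the score declines
    regardless of the reason for the interruption."""
    return sum(s.orders_done for s in history) / len(history) if history else 0.0

# Two workers over one week; the caregiver is slightly MORE productive
# per hour online (1.6 vs 1.5 orders/hour) but scores far lower.
full_time = [Shift(8, 12)] * 7                      # uninterrupted week
caregiver = [Shift(5, 8)] * 5 + [Shift(0, 0)] * 2   # short days + sick days

print(performance_score(full_time))                 # 12.0, bonus every day
print(performance_score(caregiver))                 # ~5.7, no bonuses
print(sum(bonus_earned(s) for s in full_time))      # 7
print(sum(bonus_earned(s) for s in caregiver))      # 0
```

Under these assumed rules, the penalty attaches to the pattern of availability rather than to productivity itself, which is precisely the structural disadvantage workers describe.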
This reflects a broader economic reality in which unpaid domestic labour underpins the formal economy without recognition or compensation. Instead of addressing this imbalance, AI systems can intensify it. Another worker, Cinthia, observed a noticeable drop in job assignments after taking time off due to illness. The experience left her with the sense that the system penalises any interruption, making workers reluctant to pause even when necessary.
Although algorithms do not explicitly target women, they are designed around an ideal worker who is always available and unconstrained by caregiving duties. This assumption produces indirect but consistent disadvantage. The claim that digital platforms operate neutrally is further challenged by everyday experiences. For example, a driver named Yanti often informs passengers in advance that she is female, leading to frequent cancellations. While the system records these cancellations, it does not capture the gender bias behind them.
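Yanti's experience points to a data-design problem: a cancellation log typically records the event but not its motive. The schema below is a hypothetical illustration of that gap; it does not describe any platform's actual data model.

```python
# Hypothetical cancellation record: the event is logged, but nothing in
# the schema captures why the passenger cancelled, so a gender-motivated
# cancellation is indistinguishable from any other and counts against
# the driver's metrics all the same.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Cancellation:
    ride_id: str
    driver_id: str
    cancelled_by: str     # "passenger" or "driver"
    timestamp: datetime
    # no "reason" field: the bias behind the cancellation leaves no trace

def cancellation_rate(events: list[Cancellation],
                      driver_id: str, total_requests: int) -> float:
    """Aggregated per driver regardless of cause; a driver who is
    cancelled on for being female simply shows a worse rate."""
    hits = sum(1 for e in events if e.driver_id == driver_id)
    return hits / total_requests if total_requests else 0.0
```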
Safety concerns also shape participation. Many women avoid working late hours because of safety risks, which cuts them off from peak-demand periods and the higher earnings those hours bring. The system interprets this reduced availability as lower productivity. Scholars such as Virginia Eubanks have argued that automated systems frequently replicate and amplify existing social inequalities rather than eliminate them.
Similar patterns have been observed in other countries. In India, women working in ride-hailing services report lower average earnings, partly because safety considerations influence when and where they work. Algorithms, however, measure output without accounting for these risks.
