Francesca Fajardo
My Data Isn’t Even Mine

I’m eighteen, I’ve had a phone since I was eleven, and computers have been ordinary objects to own for my whole lifetime. My generation was the first to think nothing of a smartphone as a first phone. We carry phones run by companies that know more about us in our teenage years than our own parents do. This is Generation Z.

The generation prior was lucky to have had a brick phone and a home PC in their late teens.

Generation X (mainly the parents of Generation Z, branded the tech generation) grew up pre-Apple and pre-Google, on the advent of new, time-saving devices and applications. They jumped full-body into the technological river, soon integrating newer technology into old institutions such as schools and hospitals, and into their personal and professional lives.

The more reliant we became on unchecked systems, the harder it became to hold them accountable.

We are held in a form of Stockholm syndrome by our apps and their manufacturers. Ordinary people have no real option to refuse to hand over their data. If you’re not on the work WhatsApp group, you won’t be privy to changes in the schedule. If you won’t input your data when searching for a job at the job centre, you will be sanctioned.

Once, our medical records told the story of our health; now it’s our search engines that hold on to our every symptom, deepening our paranoia by signposting us to more symptoms of related diseases. In most cases, we are not diseased. The system, however, is! When our concern means “clicks” and our clicks mean profit, ethics quickly disappear. Legislation is not keeping up with the speed of technology and, as with any unregulated revenue source, poor ethics are trumping decency.

After researching my mother’s arthritis, I was signposted towards links for medical CBD, vitamin supplements, arm supports and menopause advice.

Inducing and then abetting paranoia must be seen for what it is, and not heralded by an entire industry as enterprising.

Our own NHS is falling victim to data savagery: data is now the world’s most valuable resource, ahead even of oil, according to a 2017 article in The Economist. The oil industry is a good indicator of the power data companies hold. Oil and its associated riches have been the cause of war, carnage and misery since the industry began. The very fact that we have this parallel with which to compare the data industry should make us more cautious in how we approach it. On a personal note, I fear that individual actions are meaningless when companies update their policies and data usage to keep the law two steps behind.

I am not ashamed to say that techno-jargon washes a foot above my head, but no one wants to admit that they do not read the terms and conditions before clicking “ok”. We need these services: we rely on them to organise protests against governments, and governments themselves rely on them for their group chats.

Ads are targeted at us based on our political preferences, not just our choice of takeaway. Cambridge Analytica’s role in the Trump and Brexit campaigns was to comb through data and target those most susceptible to influence, in the interests of the campaigns funding it.

Just as we are ashamed to admit we cannot check every “cookie” notice, we are ashamed to admit that our opinions and thoughts can be influenced. This induced shame is keeping us silent. We feel personally accountable; we have been told to feel personally accountable, despite most of us being separate from the class of people designing our digital space. As participants in the virtual climate, we deserve to understand, in layman’s terms, what the bloody hell is going on.

Compiling data on a demographic with common interests, and then predicting their future actions based upon the past actions of others within the same group, could be harmful to minority groups that are already stigmatised.

Almost everyone I know belongs to a stigmatised section of society, whether because they are poor, disabled, LGBT or BAME, and none of them wish to be judged by the interests of others from their group. Data stereotypes us according to the demographic the algorithm assigns to us. Though in many cases the algorithm gets it right, it can hurt to see your interests placed in front of you as a stereotyped version of what the algorithm dictates them to be.

In the United States, Congresswoman Alexandria Ocasio-Cortez spoke of facial recognition systems deployed by criminal justice agencies to locate criminals and immigrants perceived as illegal. Her findings illustrated the algorithms’ inability to distinguish between non-white faces. Most employees of data companies are white, male and heterosexual, and it is not wild to assume that this shapes the consensus formed by those working in the industry. If data companies do not have a diverse workforce, how can their conclusions apply to diverse populations? The assumptions of those sharing a common consensus are not necessarily applicable to those excluded from the mainstream (who are vastly under-represented), and in the case of data, this can mean being left out of systems entirely.

Google relies on the power of suggestion. We subscribe to clickbait because it releases dopamine in our brains. Our biochemistry is being used against us like a drug, and the pharmacist is getting ever richer at our expense.

There is legislation on alcohol, drugs, gambling and the like, but money-making clickbait websites, which trigger the same dopamine releases and hook us just as gambling does, remain free to entrap us for hours on end as we desperately chase our fix of feel-good chemicals.

Companies are playing with our senses: making us feel good, then pushing us to spend more and more time on their websites and give up more and more data. This is absolutely not to our benefit; it lines the pockets of big data while we take in adverts and consider a new pair of shoes.

They take our data and, while they extract it, they suggest we spend some money. They are doubly quids in while we are doubly broke: robbed first of our privacy and then of our funds. Data is capital, and they are stealing it.

Keeping us on such a narrow track of interests (or rather, the algorithm’s perception of our interests) will create ghettoised online communities. There is already an enormous problem amongst young men who term themselves ‘incels’. Surely there must be a burden of responsibility to show people not just like-minded discussions, but things outside their immediate interests.

Our news feeds are targeted based on our interests, which means we see only some news, not all of it. We feel informed, but we are denied the chance to grow because data profits by hemming us in. It’s tried and tested: if we like it, we click, and profit ensues. We remain in echo chambers, and vulnerable people remain ghettoised by the algorithm. The algorithm makes us feel as though our opinion is the dominant one, that everyone agrees with us, lulling us into a false sense of security. This leaves us unprepared for encountering those with whom we disagree.

We obtain most, if not all, of our information from the digital landscape, yet there is less legislation governing digital companies and how they obtain information than there is for conventional media. They must be held accountable. Deregulation and laissez-faire policies mean ethics are a thing of fantasy: they are left to each company’s own interpretation.

Our data is being peddled away by conmen, and that is not OK.

Francesca Fajardo is a young person who took part in 5Rights Foundation’s Data Literacy workshops last year. Francesca is self-admittedly as reliant upon tech as is possible to be!