Creating Technology as a Tool

Technology is completely ingrained in our lives.
What benefit does this offer us?

As anyone who regularly uses technology intuitively understands, there are advantages and disadvantages to being constantly connected.

I really appreciate that I can communicate with my friends and family living on the other side of the world, and I am delighted any time my phone delivers a notification from one of them. I love that I can use apps built to help me improve myself, and lean on language apps to help me learn (or, more often, translate) German and better navigate living in Zurich. However, I also receive emails from strangers trying to scam me, and notifications from social media applications designed to be addictive, asking for my attention.

The dichotomy between these kinds of notifications and tech-based interactions raises the question: why are technology’s impacts so vast and varied? I think that question only addresses the surface-level symptom, however. The causal question is instead: who determines how a user interacts with technology? Is it the user or the company behind the tech? Mainstream audiences are beginning to point the finger towards the companies. In what ways can our technology subconsciously manipulate us? In what ways is this manipulation consciously applied? What are the potential economic (read: selfish) intentions? What are the ramifications for the user’s quality of life? These are questions users and producers alike must consider.

“Who determines how a user interacts with technology? Is it the user or the company behind the tech? I believe that the creators of each new technology, be it hardware, software, or a front-end app, must take responsibility for what they are producing.”

I believe that the creators of each new technology, be it hardware, software, or a front-end app, must take responsibility for what they are producing. Engineers, designers, and developers must consider the potential impact of what they create, and this needs to be a discussion from the very beginning of the production stage. Those producing tech must consider the potential societal ramifications of their actions and must stop and ask: how can this be both used and misused? How can the design or use case be adjusted to increase the probability of a positive sociological effect?

The responsibility of managing technology, however, is not limited to the producer or distributor: it also lies with the user. Those who use technology regularly must take responsibility for their own exposure. Learning about media literacy and the manipulation techniques technology employs is a good place to start, so that you understand the motivations behind every notification.

Whether technological innovation has historically had a positive or negative effect has always been a hot topic. With every major innovation, there are positives, negatives, and passionate people behind each argument. There are communities which benefit from each technological development, and communities which suffer its consequences. Think of the industrial revolution spurred by the combustion engine, or the number of lives saved by the discovery of insulin.

These are not inherently good or evil technologies, but rather subjectively beneficial or destructive applications. With every major jump forward, technological developments lead to potential positive and negative applications. Manipulating the atom led to both remarkable energy production methods and extremely dangerous weaponry. The question therefore is not ‘should we keep going?’ but ‘how do we ensure this has a beneficial destination?’

“With every major jump forward, technological developments lead to potential positive and negative applications. The question therefore is not ‘should we keep going?’ but ‘how do we ensure this has a beneficial destination?’”

New technologies should be thoughtfully designed for positive large-scale impact, shifting the likely applications towards the betterment of society instead of a potential Orwellian downfall.

Here, I speak not just of technology yet to be made or currently in progress. We must extend this idea to the technology we are already using as well. I am genuinely concerned about us entering a Brave New World, with technology supplying the utopian sedative which lulls us into inaction and distraction. Are the technologies and social media platforms I regularly interact with improving my mood and relationships, or are they distracting me from new experiences and more genuine conversations? How can I change my habits and dependencies to use technology as a tool to better myself? How can I take advantage of all the wonderful technologies currently out there to improve my life? What can I do to ensure that I am not simply a data production tool that technology companies use for economic gain?

These are questions and discussions we must consistently have at IDUN Technologies. We are developing technology to quantify and predict emotional states. The potential beneficial applications in emotion recognition, communication, and management are countless. Unfortunately, the same could be said for the ways this kind of technology could be manipulated for “the greater good.” By constantly considering the questions I have asked here, and by approaching product growth and scalability as both designers and users, we hope to steer the emergence of this technology in the right direction. We actively work to avoid becoming the senior social media developers interviewed in The Social Dilemma, who express distress about the harm their contributions have done to society. Most importantly, we consider how problematic technology impacts society, discuss which decisions led that impressive technology to this specific application, and try to identify other potential paths to success.

“Remember the common adage in tech: if you are not paying for a product, you are the product.”

Investing time and effort as a designer or a user into discussing these topics is the first step towards ensuring that you personally benefit from the technology you use. To design for positive applications, or to select tech which positively affects you, you must first identify its impact and recognize its manipulation strategies. I am not against technology, but against the misuse of technology. Humans using technology should be its beneficiaries, not consumer profiles whose attention data is harvested by AI. Suicide rates are up, and time spent away from a screen is down. It is time to take this seriously. If you recognize that some of your technology is harmful, misleading, or manipulative, turn off the notifications or delete the app. When you are buying products or investing in new technology, make sure that the service it provides is beneficial for your mental state. Lastly, remember the common adage in tech: if you are not paying for a product, you are the product.

Hello everyone!

Thank you for reading my blog post. My name is Abby Holland. I graduated in biomedical engineering from Queen’s University in Canada, where I led the NeuroTechX student chapter for three years. I also held an internship at Muse, working on research and design for the Muse S. Currently, I am the Test Engineer at IDUN Technologies. I am also the lead for the NeuroTechX Diversity Initiative and am helping organize the newly formed NeuroTechX Zurich Chapter.

Do you want to get in contact and discuss the topic? Connect with me on LinkedIn or write me an email!

Stay tuned and follow us on our socials! LinkedIn | Instagram

Where to find us

We are located in the outskirts of Zurich, near the airport. We always welcome drop-in visits!
