{"id":38375,"date":"2023-10-20T14:56:05","date_gmt":"2023-10-20T14:56:05","guid":{"rendered":"http:\/\/startupsmart.test\/2023\/10\/20\/algorithms-are-everywhere-how-will-they-shape-you-startupsmart\/"},"modified":"2023-10-20T14:56:05","modified_gmt":"2023-10-20T14:56:05","slug":"algorithms-are-everywhere-how-will-they-shape-you-startupsmart","status":"publish","type":"post","link":"https:\/\/www.startupsmart.com.au\/uncategorized\/algorithms-are-everywhere-how-will-they-shape-you-startupsmart\/","title":{"rendered":"Algorithms are everywhere, how will they shape you? – StartupSmart"},"content":{"rendered":"
\"\"<\/div>\n

As algorithms become entrenched in society, the debate about their effects rages on.<\/p>\n

In essence, algorithms are sequences of instructions used to solve problems and perform functions in computer programming.<\/p>\n

As mathematical expressions, algorithms existed long before modern computers.<\/p>\n

While they vary in application, all algorithms have three things in common: a clearly defined beginning and ending point, a discrete set of \u201csteps,\u201d and a design meant to address a specific type of problem.<\/p>\n
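A minimal illustrative Python sketch of one of the oldest recorded algorithms, Euclid\u2019s method for finding the greatest common divisor, shows all three properties at once:<\/p>\n

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a clearly defined start and end, discrete steps,
    and one specific problem (the greatest common divisor of two integers)."""
    while b != 0:           # a discrete step, repeated until the ending condition
        a, b = b, a % b     # replace the pair (a, b) with (b, a mod b)
    return a                # the clearly defined ending point

print(gcd(48, 36))  # 12
```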

And problems we have.<\/p>\n

On the one hand, algorithms play the role of prime suspect \u2014 responsible for the UK pound\u2019s recent Brexit-induced flash crash<\/a>, used for political and informational manipulation<\/a> on social networks, and part of what Harvard Professor Shoshana Zuboff calls \u201csurveillance capitalism\u201d<\/a>.<\/p>\n

On the other hand, algorithms make modern life easier: they help us find information, detect disease, connect us to friends and family, show us products we\u2019re likely to be interested in, recommend personalised experiences, and direct us around traffic delays, saving us valuable time and money.<\/p>\n

Algorithms are everywhere<\/h3>\n

Much has been written on what algorithms do and how they affect us.<\/p>\n

This includes how algorithms secretly control us<\/a>, what types of information they filter in or out<\/a> of our social media feeds, and the thousands of calculated outcomes they force on us daily<\/a>.<\/p>\n

This piece isn\u2019t about these issues, or about breaking down the complex nature of how algorithms work.<\/p>\n

The relevance of algorithms at the moment isn\u2019t that they are used in Google\u2019s search, maps, autocomplete, photos, and translation services; Facebook\u2019s news feed and Trends; Twitter\u2019s trending topics; Netflix\u2019s movie recommendations; Amazon\u2019s prices and product reviews; or for predicting hurricanes, creditworthiness, and home and car insurance liabilities.<\/p>\n

Nor is it that most computer software and mobile applications are essentially bundled packages of algorithms.<\/p>\n

To return to my very first point, algorithms are important because they are the key process in artificial intelligence: decision-making<\/em>.<\/p>\n

AI_gorithms<\/h3>\n

Algorithms, in a sense, are the \u201cnervous system\u201d of AI.<\/p>\n

They are the models that underpin machine learning, prediction, and problem solving.<\/p>\n

Yet, as many researchers argue, because they are designed by humans, algorithms can never be neutral<\/a>.<\/p>\n

\n

algorithms are not neutral by definition
algorithms are not neutral by definition
algorithms are not neutral by defi
https:\/\/t.co\/dckOayKybG<\/a><\/p>\n

\u2014 Casey Johnston (@caseyjohnston) May 17, 2016<\/a><\/p>\n<\/blockquote>\n

As Vint Cerf, co-inventor of the Internet Protocol, Turing Award winner, and Google VP, pointed out in a recent speech at Elon University<\/a>:<\/p>\n

\n

\u201cWe need to remember that [AI systems] are made out of software. And we don\u2019t know how to write perfect software \u2026 the consequence is that however much we might benefit from these devices \u2026, they may not work exactly the way they were intended to work or the way we expect them to. And the more we rely on [AI systems], the more surprised we may be when they don\u2019t work the way we expect.\u201d <\/p>\n<\/blockquote>\n

\u201cThe way we expect\u201d is key here, because algorithms are a computer-simulated reflection of encoded human expectations.<\/p>\n
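To see what \u201cencoded human expectations\u201d can look like in practice, consider a purely hypothetical toy scoring rule; the weights, the threshold and the variable names below are invented for illustration and do not describe any real system:<\/p>\n

```python
# Hypothetical illustration only: a toy "creditworthiness" rule.  The weights
# and the threshold are human choices encoded into code, not objective facts.
def approve_loan(income: float, years_at_address: float) -> bool:
    # A developer decided income is worth 0.8 points per $1,000 and
    # address stability 0.5 points per year; another developer might not.
    score = 0.8 * income / 1000 + 0.5 * years_at_address
    return score > 50          # the cut-off is also someone's expectation

# Two applicants with the same income get different answers purely because
# the rule's author decided address stability should matter this much.
print(approve_loan(60_000, 10))  # True
print(approve_loan(60_000, 1))   # False
```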

Engineering memories<\/h3>\n

Facebook\u2019s famous \u201cOn This Day\u201d prompt involves \u201cengineering for nostalgia<\/a>\u201d.<\/p>\n

Likewise, Instagram algorithmically sorts its timeline so you \u201csee the moments you care about first<\/a>\u201d.<\/p>\n

The more we, as humans, rely on algorithms, the more our reality becomes encoded with other people\u2019s flawed expectations.<\/p>\n

As more AI-powered systems come online, this type of calculated bias will permeate every level of our lives \u2014 even our memories and past experiences.<\/p>\n

Take, for instance, Google Photos, which uses AI-powered \u201cdeep learning\u201d to organise people\u2019s photos beyond normal metadata (GPS, time, date, lens, etc.).<\/p>\n

It uses advanced machine learning algorithms to classify material objects, facial expressions, and emotional relevance.<\/p>\n
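As a rough sketch of the kind of building block such services presumably rely on (not Google Photos\u2019 actual pipeline), a pretrained image classifier can attach labels to a photo in a few lines of Python; the file name and model choice here are assumptions made for illustration:<\/p>\n

```python
# A hedged sketch: label a photo with a pretrained ImageNet classifier,
# the kind of component photo services are assumed to combine at scale.
import torch
from PIL import Image
from torchvision import models, transforms

# Load a stock ResNet-50 with its published ImageNet weights.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.eval()

# Standard ImageNet preprocessing: resize, crop, convert, normalise.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("holiday_photo.jpg")   # hypothetical file name
batch = preprocess(image).unsqueeze(0)    # shape: (1, 3, 224, 224)

with torch.no_grad():
    probabilities = torch.softmax(model(batch)[0], dim=0)

top_prob, top_class = probabilities.max(dim=0)
print(f"Predicted ImageNet class {top_class.item()} "
      f"with confidence {top_prob.item():.2f}")
```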

The robotic \u201cassistant\u201d can even touch up images, suggest creative filters and create photo albums automatically<\/a>.<\/p>\n

Biased learning, troubled future?<\/h3>\n

As algorithms \u201clearn\u201d more about us through our financial data, location history, biometric features, voice patterns<\/a>, social networks, stored memories, and \u201csmart home\u201d devices, we move towards a reality constructed by imperfect machine learning<\/a> systems which try to understand us through other people\u2019s expectations and sets of \u201crules\u201d.<\/p>\n

Algorithms are the literal manifestation of \u201cplaying by someone else\u2019s rules”.<\/p>\n

With dating app Tinder\u2019s algorithmic \u201cSmart Photos\u201d matching, the rules of successful engagement on Tinder<\/a>\u00a0are made clear and enforced on users.<\/p>\n

Does this mean that we live inside a\u00a0computer simulation<\/a>?<\/p>\n

I\u2019ll defer that\u00a0question to Elon Musk,<\/a> who has said, \u201cthere\u2019s a billion to one chance we\u2019re living in base reality\u201d.<\/p>\n

Cerf, however, warns that it\u2019s a mistake to \u201cimbue artificial intelligences with a breadth of knowledge that they don\u2019t actually have, and also with social intelligence that they don\u2019t have\u201d.<\/p>\n

The algorithmic end game, AI, will get better with time, but it will always be flawed.<\/p>\n

Even in straightforward applications like a game of chess, algorithms can leave people clueless as to how they arrived at a certain outcome.<\/p>\n

Great expectations<\/h3>\n

Cerf talked about a scenario in which IBM\u2019s \u201cDeep Blue\u201d supercomputer, playing world chess champion Garry Kasparov<\/a>, made a move that Kasparov could not understand.<\/p>\n

\n

I mean, it made no sense whatsoever. And he was clearly concerned about it, because he thought for quite a long time and had to play the endgame much faster \u2026 in the end it turned out it was a bug<\/a>.<\/p>\n

It was just a mistake. The computer didn\u2019t know what it was doing. But Kasparov assumed that it did, and lost the game as a result. <\/p>\n<\/blockquote>\n

Today, the implications of bias might mean poor neighbourhoods experience\u00a0more police brutality<\/a> because of predictive data modelling.<\/p>\n

Tomorrow, it will mean people die when the algorithms controlling self-driving cars are programmed to save the occupants\u2019 lives\u00a0instead of pedestrians\u2019<\/a>.<\/p>\n

Bad or good?<\/h3>\n

Is the social use of algorithms inherently \u201cbad\u201d, given that they form the basis of \u201cintelligence\u201d in AI?<\/p>\n

David Lazer, a computer scientist at Northeastern University, is sceptical.<\/p>\n

In a recent Science<\/em> article<\/a> he said:<\/p>\n

\n

The fact that human lives are regulated by code is hardly a new phenomenon. Organizations run on their own algorithms, called standard operating procedures. And anyone who has been told that \u201cit\u2019s a rule\u201d knows that social rules can be as automatic and thoughtless as any algorithm. <\/p>\n<\/blockquote>\n

It does mean that companies, governments, and institutions that employ algorithms (and, soon, AI-powered deep learning \u201cneural networks\u201d) need to be more transparent in showing us how the algorithms they use might affect our reality<\/a>.<\/p>\n

\n

A Google project called Magenta is aimed at making more sophisticated kinds of creative software. #EmTechMIT<\/a> https:\/\/t.co\/f6rOmL2yhR<\/a><\/p>\n

\u2014 MIT Tech Review (@techreview) October 18, 2016<\/a> <\/p>\n<\/blockquote>\n

Given that proprietary algorithms are the\u00a0new business model<\/a>, this is doubtful, despite current laws preventing algorithms from being patentable<\/a>.<\/p>\n

A recent SSRN piece<\/a> argues for a \u201cFood and Drug Administration for algorithms\u201d.<\/p>\n

Some scholars go so far as to argue that algorithms need\u00a0managers too<\/a>.<\/p>\n

According to Cerf:<\/p>\n

\n

It\u2019s a little unnerving to think that we\u2019re building machines that we don\u2019t understand \u2026 Not only in the technical sense, like what\u2019s it going to do or how is it going to behave, but also in the social sense, how is it going to impact our society?
<\/p>\n<\/blockquote>\n

Just like us<\/h3>\n

So algorithms, the underlying process of decision-making in artificial intelligence systems, are imperfect, prone to bias, and make unpredictable decisions that impact the future.<\/p>\n

\"ERROR\"<\/p>\n

Sound familiar?<\/p>\n

This article was originally published on The Conversation<\/a>.\u00a0<\/em><\/p>\n

Follow StartupSmart on<\/em>\u00a0Facebook<\/a>,<\/em>\u00a0Twitter<\/a>,\u00a0LinkedIn<\/a>\u00a0and iTunes<\/a>.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"

As algorithms become entrenched into society, the debate about their effects rages on. In essence, algorithms are sequences of instructions<\/p>\n","protected":false},"author":2,"featured_media":61690,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[7,1],"tags":[],"_links":{"self":[{"href":"https:\/\/www.startupsmart.com.au\/wp-json\/wp\/v2\/posts\/38375"}],"collection":[{"href":"https:\/\/www.startupsmart.com.au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.startupsmart.com.au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.startupsmart.com.au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.startupsmart.com.au\/wp-json\/wp\/v2\/comments?post=38375"}],"version-history":[{"count":0,"href":"https:\/\/www.startupsmart.com.au\/wp-json\/wp\/v2\/posts\/38375\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.startupsmart.com.au\/wp-json\/wp\/v2\/media\/61690"}],"wp:attachment":[{"href":"https:\/\/www.startupsmart.com.au\/wp-json\/wp\/v2\/media?parent=38375"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.startupsmart.com.au\/wp-json\/wp\/v2\/categories?post=38375"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.startupsmart.com.au\/wp-json\/wp\/v2\/tags?post=38375"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}