With each passing year, I wrap myself more deeply in a veil of irony when considering my graduate field of study. This is not a condemnation, necessarily, but a realization that my first field of study prepared me for my career more adequately than graduate school ever could. While the field of design continues to validate its necessity in these complex times, what is more urgent in this era of exponential technological advancement is a nuanced understanding of the factors, processes, and constructs that dictate and define culture. Anthropologists save the day again.
With the rapid evolution and dissemination of artificial intelligence capabilities across all industries, we are collectively reaching a tipping point where sophisticated machine learning algorithms steadily dictate how we engage with, interact with, and perceive the world around us while automating many aspects of our daily lives (or replacing us outright). These forces will be extremely disruptive and will have irrevocable effects on us, the hairless apes wandering the wilds of the new AI-enabled technocracy (claptrap alert). I won't waste time moralizing about the growth of artificial intelligence and automation; there are many conflicting opinions on the role AI will play in the future, for and against, formed by people much smarter than I am. In my weaker moments, I too am enamored of the notion of a utopian society in which AI-enabled automation frees us to pursue more meaningful endeavors as machines take on perfunctory, low-order tasks. I tend to fall further along the "change" end of the spectrum: our daily drudgery is not removed outright, but rather the nature of our efforts changes to fit our new reality. Like many others, I fear the rise of the killbots, but I am far more concerned about the tedium this new reality could create, a far greater threat to global stability and well-being.

Of course, none of this is new. Fear of technology replacing human jobs and the stability they provide is a hallmark of the post-industrial world, and it is deeply embedded in American folklore. In some cases, the resistance is an assiduous stubbornness born of our own manufactured beliefs in the virtue of effort. In most, it is the very real and practical fear that people will no longer be able to provide for themselves or their families. Much needs to be done to enable smooth transitions into new domains, and it is debatable whether we are currently doing enough to prepare ourselves for an automated future.

Naturally, the object of our resistance is rooted in our own collective nature: the pioneering spirit that has always moved humankind forward, often haphazardly, is the same spirit that drives technological advancement, often devoid of consideration for the social, cultural, or economic ramifications of that evolution. This persistent push-to-the-new approach, with its much-heralded champion, innovation, presupposes that all technological advancement is inherently good, without even a passing acknowledgment of the horrible things (pithy) that have occurred when advancement proceeds without thoughtful consideration of possible outcomes. Further, this advancement, or what can often laughably be referred to as progress, only appears as such in the eyes of its progenitors. The mismatch is the result of the biases at play in the creators of new technologies, and of how those biases manifest in the technology itself (and are thrust upon the rest of us). Our biases compound across different levels of societal stratification: by nation, region, locality, and even familial constructs. These biases need not be insidious by nature. Some are simply entrenched worldviews or perspectives on how the world does or should operate, sometimes formed with the best available evidence. That makes their effect no less profound. Indeed, the road to hell is often paved with good intentions.
Not even the noblest and purest (often in absentia in tech) are immune, and the biases that influence their decision-making will continue to cascade into the technology controlling our world, determining whether your car will decide to kill you, whether you are eligible to receive credit or loans, or worse. In an automated world, the scope and scale of these biases will have increasingly broad implications for society.

Over the past two decades, we have seen a surge of anthropologists (and social scientists of all stripes) employed in industry, for lots of good reasons, working to bring cultural insight to the forefront in the design of products and services. Now we need to cast our gaze inward and meaningfully assess the culture of companies themselves. We need to understand how these cultures influence the delivery of products of every kind, and how they support, intentionally or not, the insertion of bias into the products themselves. In the coming age, it is essential to shape the culture of tech companies and how they approach the design and development of artificial intelligence: to create standards and practices that promote introspection, diverse perspectives, and awareness of the way individual and collective biases influence the products and services we create. We may never remove bias from our natures, but we can at least prevent it from seeping into the technology we increasingly rely on.