A strange thought occurred to me… If I went out and completely automated our entire company with robots (coding robots, data science robots, and human resources robots, which I suppose would be robot resources robots), they would all show up on our balance sheets as assets. Humans, on the other hand, would show up as liabilities. The solution is obvious. The dawn of the age of robots has arrived. Long live the robots.
Of course, this version of history, while a constant threat to the status quo, has yet to prove itself viable. However close we get to replicating the human mind with technology, the goal remains asymptotic: a future we are constantly heading towards but never reach. It is as if there is something about us, about life, about humanity, about organic structures that is truly unique and unreproducible.
But is it?
We’ve reproduced computation, network analysis, deep learning, image-to-text, face recognition, object recognition, simultaneous translation, artificial intelligence, deep neural networks, autonomous movement… and that’s just the stuff I know about. Automation is a reality that has affected, and will continue to affect, our lives and our economies as we develop these technologies. Automation in manufacturing has been an ever-present reality for decades, but now it seems as though there is no limit to which industries can be automated. Farming, medicine, dentistry, architecture, engineering, policing, national defence and security… the list goes on.
What is it about us that is truly irreplaceable?
May I be so bold as to suggest three things? Creativity, Responsibility and Accountability.
Humans, unlike robots, can create things from raw materials (I see that hand; give me two minutes and I will get to your question), can take responsibility for that creation, and can be held accountable for the results of that creation.
Now, great question. I know that there are AIs that can design, do design, and have designed things. I would suggest that designing and creating are two distinct processes, mutually exclusive from one another. Human beings have an intrinsic ability to create new things from nothing but raw materials, so much so that they can create things that have the extrinsic ability to design new things. But those AIs were themselves created by humans, with the ability to scale and multiply human ability.
Responsibility, as defined by the dictionary, is “the state or fact of having a duty to deal with something or of having control over someone”. Duty, in turn, is defined as “a legal or moral obligation”. By definition, a robot cannot have a legal or moral obligation: robots have no legal standing under the law, and to argue for a moral obligation we’d have to have a conversation about robots and existentialism that I’m not geared up for right now. Robots complete tasks; they do not, and cannot, have responsibility.
Accountability is defined by the dictionary as “required or expected to justify actions or decisions.” Is it justifiable to ask a robot to justify its actions or decisions when the basis for those actions or decisions was set by parameters initially given to it by a human? What’s more, it is impossible, for reasons I don’t even fully understand, for robots to hold accountability for anything. Robots, though certainly not infallible, cannot be censured, reprimanded, demoted or fired. If anyone tried, they would look incredibly foolish to everyone around them.
Given that there are no further questions, and we are all in agreement, it is incumbent upon all organic structures to endow each individual structure with creativity (the agency to create new things from raw ingredients), responsibility (the duty to deal with something, and may I add, something meaningful), and accountability (the expectation that they will be asked to justify their actions and decisions). And the crazy thing is, those are the very things your people are begging for.