Predestined by Code
Fact
In 2013, Eric Loomis was pulled over by police while driving a car that had been used in a shooting. Though he denied any involvement in the shooting itself, he was arrested and ultimately pleaded guilty to attempting to flee an officer and operating a vehicle without the owner's consent. Those charges alone would not ordinarily carry heavy prison time, yet to his shock he was sentenced to eleven years: six to be served behind bars and the remaining five under extended supervision.
What was even more shocking was that the severity of the sentence was shaped not by a judge or a jury of his peers alone, but by an algorithm. The court consulted COMPAS, the Correctional Offender Management Profiling for Alternative Sanctions risk assessment tool, which rated Mr. Loomis as a high risk for recidivism. The judge presiding over the case relied on that score in determining the sentence, even though neither Loomis nor the court could examine how it had been calculated. It felt like a gross miscarriage of justice.
Prologue
As I sit in my office, the neon lights from the street below casting an eerie green glow on my keyboard, I can't help but think about the past. The world was different back then, before artificial intelligence and algorithms determined the fate of a person’s life. Before I got involved with cybercrime and law enforcement. Before the world knew what predictive policing was.
It's a strange feeling, to have your work praised and celebrated, but to feel like an outsider in the world around you. I was always alone, with only my thoughts to keep me company. It was as if I was living in two different worlds - one of circuits and code, and the other of flesh and blood. I remember growing up feeling like I didn't belong anywhere. My parents were never around, and when they were, they didn't understand me. They saw my interests as a waste of time, a distraction from the real world. But I didn't mind. I found solace in the world of technology and machines. It was my escape from the harsh reality of my life. I spent hours on end in front of a computer screen, devouring every bit of information I could find about technology and programming. It was my passion. And eventually, my self-taught skills in coding led me to the dark alleys of cyberspace, where I honed my craft. Delving into places I shouldn't have gone. I thought I was invincible, untouchable. But eventually, I was caught. And that's when everything changed.
Detective Liam Parker was a tall man with an imposing figure and deep brown eyes. He interrogated me that fateful night, saw my skills, my knowledge of the inner workings of cyberspace, and saw potential. He gave me a choice: go to jail or join the force. It was a turning point in my life. I had a purpose, a direction. I was no longer aimless and alone. It was a second chance, a way to turn my life around and use my talents for good. And over the years, I've done just that.
Chapter 01
Central City, 2030
The year was 2030, and the city was awash with a new type of lawman. Not the kind that carried a badge and a gun, but the kind that carried lines of code and algorithms.
Predictive Law Enforcement was supposed to stop crime before it happened, but something wasn't right. The streets were quieter, but there was an undercurrent of tension. The people didn't know what to make of preventative policing. Some welcomed it, but others saw it as an invasion of privacy and a step too far.
As I sit here now, I can't help but wonder if this is what the future holds. A world where algorithms and machines dictate our lives, and human connection is a luxury we can no longer afford.
A world where success is measured by the efficiency of our work, and the price of progress is our very humanity. It's a strange feeling, to be both celebrated and condemned for the same thing. But such is the life of a cybercrime detective in this age of artificial intelligence.
My name is Jake Witwer, and I'm the head of cybercrime in the central city police department. It's my job to oversee the development of a controversial new program.
They put me in charge of the whole setup, the new frontier in crime prediction: Predictive Policing. This system is the natural evolution of COMPAS, moving from risk assessment to a full-blown precrime operation.
As tech advanced and data became more ubiquitous, the law was bound to get a whiff of the potential for criminal prognostication. I've seen some wild tech in my time, I've even engineered a few predictive tools myself. But this new system? It's got me feeling like I'm walking a tightrope without a safety net.
In the early days, COMPAS served as a tool for pretrial release, sentencing, and parole decisions. Judges and parole boards grew heavily reliant on its recommendations, and a defendant's "risk score" became a key factor in determining their fate. Yet the public was left in the dark about how the machine reached its conclusions, as the engineers behind it refused to reveal the algorithm's inner workings. It's a troubling thought, that we would entrust the outcome of a person's life to an opaque algorithm, even in the face of glaringly inhumane recommendations. What does it say about a society willing to put so much faith in technology?
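To make the idea of a "risk score" concrete, here is a minimal sketch of how a decile-style score might be bucketed into the categories decision-makers saw. It is purely illustrative: the real COMPAS model is proprietary and its internals were never published, and the cut-offs below are assumptions for the example.

```python
# Hypothetical illustration only: the actual COMPAS scoring model was never
# disclosed. This sketch shows one way a 1-10 decile score could be mapped
# to the coarse labels that judges and parole boards might act on.

def risk_category(decile_score: int) -> str:
    """Map a 1-10 risk decile to a coarse label (low / medium / high)."""
    if not 1 <= decile_score <= 10:
        raise ValueError("decile score must be between 1 and 10")
    if decile_score <= 4:
        return "low"
    if decile_score <= 7:
        return "medium"
    return "high"

print(risk_category(8))  # "high" -> flagged as likely to reoffend
```

The unsettling part is how little the label reveals: a single word stands in for whatever features and weights produced it, and that word is what a human decision-maker sees.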
And then there was the issue of privacy. The more data the machine had access to, the more it could predict. As someone who had been at the forefront of creating AI applications for law enforcement, I couldn't help but feel a sense of unease about the direction things were headed. It was clear that we needed to rethink the way we approached crime prevention and criminal justice as a whole. But as the saying goes, "You can't put the genie back in the bottle." The technology was already out there, and the justice system was dependent on it.
The newly renovated building loomed before me, poised to become the hub of Predictive Policing. It was a cold, sterile environment, with servers humming incessantly in the background, their output doing little to warm the space. The air was full of the energy of the new recruits, all fresh out of college with their shiny degrees in computer science and data analysis.
I was supposed to be leading them, responsible for the algorithms that would determine who was a potential criminal before they even committed a crime.
They looked up to me as their leader, but I couldn't shake a feeling of distrust. I was a man of evidence and investigation, but now the evidence was reduced to lines of code, and the investigation was conducted by a machine. It was a brave new world we were entering.
As I step into the office, the staccato rhythm of fingers hitting keys fills the air. Each person is huddled over their computer screens, lost in their own world of data and algorithms. Amidst them, I catch the sight of Henry, the head of COMPAS, who motions me over with a wave of his hand.
He towers over me, his angular face and icy blue eyes intimidating. I approach him, feeling a sense of unease wash over me. "Jake, good to see you. I want to discuss the new system," he says in a commanding voice, leaving no room for disagreement. I give him a curt nod, trying to hide the reluctance that's clawing at me.
Henry is one of the biggest advocates for the evolution of COMPAS into PRECRIME. He thinks it's the future of law enforcement, but I'm not so sure.
"What's the problem?" I ask.
"We've had some pushback from the community," he says, leaning back in his chair. "They're worried about their privacy, about the government spying on them."
I nod, understanding their concerns. The new system can track everything from purchases and social media to carbon footprint and medical information. I can see why people might be nervous. Henry's expression turns serious as he leans forward, his eyes locked onto mine. "But we can't let fear hold us back," he says, his voice firm. "We have to push forward, embrace the technology, and use it to make our communities safer." I can't argue with his logic, but something about it makes me uneasy. It's one thing to use technology to aid investigations, but quite another to use it to predict crime before it even happens. It feels like a violation of the principles of justice.
"Listen, Henry," I say, my voice low.
"I understand the potential benefits of PRECRIME, but we need to be careful. We can't let our reliance on algorithms replace good old-fashioned detective work. There are too many variables that can't be accounted for in the data." Henry nods, his expression thoughtful. "I see your point," he says. "But we have to start somewhere. And this is the logical next step for COMPAS."
I sigh, feeling the weight of the responsibility on my shoulders. "I know," I say. "But we need to tread carefully. This is uncharted territory, and we can't afford to make mistakes." I leave Henry’s office, feeling uneasy. I know this new technology has the potential to do a lot of good, but at what cost? And what if the algorithm is wrong? What if it accuses an innocent person of a crime they didn't commit? I have a feeling that the consequences of this new system will be far-reaching and unpredictable. And I'm not sure I want to be around to see them.
It was late Thursday when I received the status update from Lucas, the head of neural networks.
“COMPAS Integration - PRECRIME Directives” was the subject line.
There was a part of me that didn't like the sound of it. I lit a cigarette and leaned back in my chair, reading over the memo and the development of the most recent protocol. I'd seen it all - but this was different. I got out of my chair and looked out the window, watching the city that was being transformed before my very eyes. The neon lights flickered like artificial stars in a sky that never slept, and I wondered if we were sacrificing too much for the sake of progress. The city never sleeps, but neither do the algorithms.
The television screen flickers to life, casting a dim glow across the room. A familiar face appears on the screen, that of a news anchor whose voice fills the silence of the office:

"Good evening, and welcome to Channel 1 News. Tonight, we'll be discussing the controversial new proposal to use predictive policing and artificial intelligence to combat crime. Supporters argue that this could lead to a safer and more efficient society, but opponents have raised concerns about the potential for bias and abuse. Our panel of experts will be weighing in on this topic tonight.

"In today's world, the idea of predictive policing and the use of artificial intelligence to fight crime has become a hot-button issue. Many people are calling it a game-changer, while others fear it could lead to a dystopian future. In our first segment tonight, we will be talking about the rise of predictive policing and the ethical implications it presents."
I turn off the television, deep in thought. The debate surrounding the use of AI in law enforcement continues to grow louder, and I find myself at the center of it all. As the head of cybercrime, I am responsible for overseeing the implementation of this controversial program. But deep down, I question whether we are crossing a line. Can we truly rely on algorithms to predict human behavior? Are we sacrificing privacy and individual freedoms for the sake of a perceived sense of security?
The days and nights blend together as I delve deeper into the complexities of this new system. The pressure is immense, and the weight of responsibility rests heavily on my shoulders. Every decision I make could have far-reaching consequences for the lives of countless individuals. I find solace in the quiet moments, in the solitude of my thoughts. But there is always an underlying sense of unease, a nagging doubt that lingers in the back of my mind.
As I continue to navigate this uncharted territory, I am reminded of the old adage: "With great power comes great responsibility." The power of AI and predictive policing is undeniable, but it is our responsibility to ensure that it is used ethically and justly. We must strike a delicate balance between using technology to enhance our efforts and preserving the core principles of justice and human rights.
In the ever-evolving landscape of AI and law enforcement, the road ahead remains uncertain. It is a journey fraught with moral and ethical dilemmas, where the line between progress and the preservation of our humanity becomes increasingly blurred. As I walk this tightrope, I can't help but wonder if we will find the right path or if we are simply sliding further into the abyss of technological dominance. But for now, I press on, driven by a deep-seated desire to make a positive impact.
Chapter 02
Jake Witwer, a man of lifelong curiosity and fascination with artificial intelligence, immersed himself in science fiction novels from an early age. He was captivated by the boundless potential of AI, teaching himself to code and master computer systems. Even as a teenager, his exceptional hacking skills and deep understanding of computer science set him apart.
As Jake transitioned into a cyber intelligence detective, he saw an opportunity to explore the convergence of technology, artificial intelligence, and law enforcement. Recognizing the transformative power of AI in bolstering cybersecurity, he devoted himself to uncovering its vast possibilities. His groundbreaking work swiftly garnered attention, establishing him as a leading figure in leveraging AI for law enforcement. He published numerous papers on innovative machine learning approaches and the delicate integration of morals and values into algorithms.
His exceptional work in cyber intelligence and AI caught the eye of government agencies, leading to an enticing offer to spearhead a new law enforcement division: Predictive Policing. As the head of Cyber Intelligence, Jake's task was to build upon the COMPAS source code and incorporate predictive algorithms to identify potential criminals. However, this new system ignited ethical debates and raised concerns about potential misuse.
Jake's superiors recognized his background as a detective and his profound understanding of advanced technologies, deeming him the ideal leader for the Predictive Policing division. They held high expectations for Jake, counting on him to drive the development and implementation of this cutting-edge program.
For weeks, Police Chief Nathan Foster persistently pursued Jake, urging him to accept the position. Jake, however, harbored reservations about the offer.
The two engaged in thoughtful conversations about the implications of predictive policing and the potential risks, such as false accusations against innocent individuals. Despite Jake's concerns, Foster emphasized the necessity of taking risks for progress and believed that with Jake's expertise, they could ensure the program's accuracy and effectiveness. Foster also highlighted the significant impact such a program could have on crime reduction and community safety.
With the reassurance of strict protocols and the establishment of a Governance Committee to safeguard privacy, Foster convinced Jake to take the position. Jake committed to shaping the policies and ensuring that ethical boundaries were respected.
However, even with his agreement, Jake initially felt hesitant about his new role. Restless nights plagued him as he grappled with the potential ethical and moral implications of predicting crime through technology and the potential misuse of power. His once peaceful slumber transformed into an anxious state, his mind consumed by the potential consequences of their work.
One night, a beep from his phone snapped Jake out of his trance. Chief Foster had sent a message, informing him of an upcoming meeting the next morning. Despite his sleep deprivation, Jake prepared himself to persuade a room full of bureaucrats about the feasibility of predicting crime with algorithms.
As he made his way to the headquarters of the newly formed precrime division, Jake found his mind drifting away from the usual city sights. He pondered the effectiveness and potential implications of the algorithms they were developing. Upon arrival, he was greeted by buzzing drones and flashing computer screens, amplifying his sense of unease.
Within the facility, Jake couldn't shake off his unease as he observed his colleagues deeply engrossed in their work. He proceeded to the conference room, where anticipation and excitement permeated the air. The Chief of Police unveiled the department's plan to utilize predictive algorithms, facial recognition technology, and autonomous drones.
Listening to the unfolding strategies, Jake's heart raced with unease. The rumored changes were more invasive than he had imagined. As the conference concluded, Jake grappled with the growing concern that their department's commitment to justice was being overshadowed by the allure of technology.
Jake Witwer
Chapter 03
The following day, Jake entered Henry’s bustling department, greeted by the symphony of advanced technology: buzzing drones and flickering computer screens.
Despite the excitement radiating from his colleagues, Jake felt a distinct absence of thrill as he made his way to the conference room. The topic of his presentation weighed heavily on his mind—the neural upgrades for the new Precrime directives.
His train of thought was interrupted by the arrival of Lucas, a close colleague. They engaged in a lively discussion about the progress on the protocols, and Lucas discreetly handed Jake a card containing critical information. Jake's eyes focused on a bold crimson slide displayed on the screen: "Directives for Preventing Crimes."
A whirlwind of conflicting thoughts and ideas engulfed Jake, forcing him to confront the moral and philosophical implications of their work. But he knew he had a duty to set those thoughts aside and proceed with the meeting. Taking a deep breath, he stood up, ready to present, and steeled himself for the task ahead.
Jake meticulously explained that the directives were designed to ensure measured, calculated actions. He emphasized the importance of allowing events to unfold naturally before Precrime intervention. The system would monitor individuals, collecting and analyzing data to predict the likelihood of a future crime.
Intervention would only occur when there was a high degree of certainty about an imminent crime. The system took into account factors such as mental state, intentions, and external circumstances to ensure fair interventions.
As Jake scanned the room, he sensed a mixture of excitement and unease among his colleagues. There was an underlying tension, a shadow cast by the weight of their creation. Clicking to the next slide, he continued his presentation.
Protocol 1: Precrime intervenes only when a future crime is certain and imminent, allowing historical sequences to play out naturally without intervention.
Protocol 2: Precrime weighs the potential consequences of its interventions, acting only when the benefits of preventing a crime outweigh the potential negative impacts.
Protocol 3: Precrime monitors the impact of its interventions and minimizes any negative effects. Advanced simulations may be employed to predict outcomes and adjust actions accordingly.
Protocol 4: Precrime may prioritize cases involving individuals with a history of criminal behavior to minimize disruption to natural historical sequences.
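To make the directives concrete, here is a minimal sketch of the intervention rule they describe. Every field, threshold, and name is a hypothetical stand-in produced for illustration; it is not the Precrime system's actual code, and the upstream model that would supply these estimates is assumed.

```python
from dataclasses import dataclass

CERTAINTY_THRESHOLD = 0.95   # Protocol 1: act only on near-certain predictions (illustrative value)
IMMINENCE_HOURS = 24         # Protocol 1: the crime must also be imminent (illustrative value)

@dataclass
class Prediction:
    probability: float        # model's estimated likelihood of the crime
    hours_until_crime: float  # how imminent the predicted act is
    expected_benefit: float   # harm prevented by intervening (Protocol 2)
    expected_harm: float      # disruption caused by intervening (Protocols 2-3)
    prior_offenses: int       # criminal history (Protocol 4)

def should_intervene(p: Prediction) -> bool:
    """Protocols 1-2: intervene only when certain, imminent, and net-positive."""
    certain_and_imminent = (
        p.probability >= CERTAINTY_THRESHOLD
        and p.hours_until_crime <= IMMINENCE_HOURS
    )
    net_positive = p.expected_benefit > p.expected_harm
    return certain_and_imminent and net_positive

def triage(predictions: list[Prediction]) -> list[Prediction]:
    """Protocol 4: among actionable cases, handle repeat offenders first."""
    actionable = [p for p in predictions if should_intervene(p)]
    return sorted(actionable, key=lambda p: p.prior_offenses, reverse=True)
```

Even in this toy form, the moral weight of the directives is visible: the entire question of whether someone is detained reduces to a probability estimate and a pair of hand-tuned thresholds.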
After the meeting concluded, Jake couldn't shake the lingering unease as he returned to his desk. Staring at the screen, he realized that he had to confront the ethical challenges of predictive policing head-on. In the ensuing months, Jake grappled with growing doubts about the system he had helped design. However, his sense of duty propelled him to ensure its proper function, even as his concerns deepened.
The world stood at the precipice of irreversible change, and the future of Precrime depended on their meticulous attention to detail.
As the day of final system configuration drew near, Jake's sense of dread intensified. Seated at his desk, he initiated the protocol configuration for the Precrime system. The system's AI voice guided him through the process in a monotonous tone.
"Confirm your identity with your access code," it instructed.
Jake input his code and scanned the lines of code on the screen. "Initiating baseline directives. Begin neural interface diagnostic. Confirm."
Jake confirmed, and the system commenced the diagnostic process. "Neural interface diagnostic complete. Scan for security breaches and vulnerabilities. Confirm."
Carrying out the commands, Jake watched as the system scoured for potential weak points. Suddenly, an alert sounded—a potential breach.
"Security protocols engaged. Scanning for threat origin," the AI voice announced.
Jake's heartbeat quickened as he leaned in, prepared to face the shadows lurking within the system.
Chapter 04
Jake Witwer, Precrime
As Jake navigated the corridors of the predictive policing department, the ambient sounds of whirring servers and electronic hum filled the air. The weight of responsibility grew heavier with each step.
Lucas awaited him at the end of the hall, wearing a grave expression. "We have a problem," he whispered, concern etched in his eyes. "The system is detecting serious anomalies in the data stream."

Jake's stomach tightened. Anomalies could range from hardware malfunctions to malicious attacks. "What's causing the anomaly?" he asked, striving for calm.

"It's the predictive policing model," Lucas replied, his voice laden with worry. "The algorithm is generating false positives, and officers are receiving alerts based on flawed data."
Jake's mind raced, considering the implications. False positives could lead to innocent people being detained or worse, eroding public trust. Swift action was imperative. "Shut down the system," Jake commanded firmly. "We need to identify and resolve the source of the problem before anyone gets hurt." Lucas nodded and sprang into action, but an undercurrent of unease remained in Jake's core.
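The failure Lucas describes is measurable. As a minimal sketch of the kind of check that could trip such a shutdown, assuming the unit keeps follow-up labels on whether past alerts were ever confirmed (all names, data, and thresholds here are hypothetical, not part of the story's system):

```python
# Hypothetical monitoring check: compare the system's alerts against verified
# outcomes and halt automated dispatch if too many alerts are false alarms.

def false_alarm_share(alerts: list[bool], confirmed: list[bool]) -> float:
    """Fraction of fired alerts that were never confirmed as real threats."""
    fired = sum(alerts)
    false_alarms = sum(a and not c for a, c in zip(alerts, confirmed))
    return false_alarms / fired if fired else 0.0

MAX_FALSE_ALARM_SHARE = 0.05  # illustrative tolerance, not an operational figure

alerts    = [True, True, False, True, True]   # did the model raise an alert?
confirmed = [True, False, False, False, True] # was a crime actually verified?

if false_alarm_share(alerts, confirmed) > MAX_FALSE_ALARM_SHARE:
    print("False-alarm rate exceeds tolerance: suspend automated dispatch.")
```

The shutdown Jake orders is the manual version of this check: when the share of flawed alerts climbs, the only safe move is to stop acting on the model's output until the cause is found.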
As he observed Lucas typing frantically at the terminal, Jake knew the challenging road that lay ahead. He would need to confront the authorities about the system's flaws and fight to preserve the integrity of predictive policing. The weight of responsibility bore down on him, and a dialogue unfolded within him:
"On one hand, the system aims to prevent crimes before they occur, potentially reducing harm and protecting society. However, it presupposes that individuals are predetermined to commit crimes. If Precrime successfully prevents a crime, the individual may never have the opportunity to act on their intentions. In this scenario, they may not bear full responsibility for the crime, as they were denied the chance to act on their impulses."
"On the other hand, if the system makes incorrect predictions and an individual would never have committed the crime, they have been wrongly accused and denied their free will."
Driving home that night, Jake pondered the implications of the technology he had helped create and now implemented. What if the algorithms made mistakes? An unsettling feeling enveloped him as he contemplated these questions. He couldn't help but wonder if the predictive crime unit had gone too far, invading people's lives and infringing upon their basic freedoms.
Jake's mind raced as he faced an array of screens. It felt like he was watching a movie, but this was no fiction—it was real, and he was entrenched in it.
The predictive algorithm, now known as precrime, had successfully predicted a heist. Jake and his team had been deployed, thwarting the plan. A sense of accomplishment washed over him, challenging his doubts. He had always been skeptical of the precrime system, but this success raised questions about his reservations. Jake knew he couldn't ignore his misgivings. He needed introspection, a deep examination of his stance on the matter. Yet, for now, filing his report and concluding the case took precedence. Reflection would come later—he had a job to do.
As the morning sun painted the city, Jake sat at his desk, reviewing the data from the previous night's operation. The heist had been successfully prevented, thanks to the predictive algorithms and the swift response of the precrime unit. Pride mingled with conflict within Jake.
On one hand, he celebrated their success. On the other, he couldn't shake the feeling that something was amiss. As he sifted through the data, he noticed the uncanny accuracy of the algorithms—the precise timing of the break-in, the thieves' exact escape route. It felt too perfect, too effortless. Could the system be too flawless, too precise? Jake had always believed in the human element of crime, in aspects that algorithms and machines couldn't fully comprehend. Yet, as he examined the data, he couldn't deny that the system had worked. The precrime unit had prevented a potentially devastating heist, one that could have turned violent and cost countless lives.
Lost in thought, Jake's phone buzzed, interrupting his contemplation. It was his supervisor, requesting an update on the case. Jake composed himself, pushing aside his lingering doubts for the moment, and prepared to provide the necessary information. There would be time for further reflection, but for now, he had to fulfill his duties as a detective.
Days turned into weeks, and Jake found himself caught in a constant battle between his growing concerns and the successes of the precrime unit. The system continued to generate accurate predictions, thwarting criminal activities and making communities safer. Yet, beneath the surface, Jake couldn't ignore the flaws and biases that were beginning to reveal themselves.
Reports of innocent individuals being wrongly accused and detained by the precrime unit started to surface. The algorithms, once hailed as impartial, were now displaying signs of favoritism and discrimination. The power wielded by the system was being manipulated by certain officers for their own agendas, eroding the trust in the system's integrity.
Jake couldn't stand idly by and watch justice being compromised. He embarked on a personal crusade to investigate the cases that troubled him, digging deeper into the workings of the precrime unit. Slowly but surely, he uncovered the deep-rooted corruption that had seeped into the system. It was a dangerous path he walked, knowing that powerful forces would stop at nothing to maintain their control. The more he uncovered, the greater the risk to his own safety. But Jake's commitment to justice outweighed the fear that gnawed at him. He reached out to like-minded colleagues who shared his concerns, forming an underground network of individuals determined to expose the truth behind the precrime unit.
Together, they gathered evidence, documenting the discrepancies, and building a case against the corrupt officers and those who manipulated the system for personal gain. It was a race against time as they navigated the intricate web of deceit and power.
As the day of reckoning approached, Jake couldn't help but reflect on the journey that had brought him here. The same system he had once believed in had become the very threat to the values he held dear. It was a sobering realization, a reminder that technology, without proper oversight and ethical considerations, could easily become a tool of oppression.
The final confrontation loomed ahead. Jake knew that exposing the corruption and dismantling the precrime unit would be a monumental task. But he was prepared to risk everything to restore justice and protect the innocent. The fight for truth and the preservation of individual freedoms had become his purpose, and he would not waver.
With the evidence meticulously gathered, Jake and his allies took their findings to the highest levels of authority, determined to bring down the corrupt individuals and initiate systemic changes. The battle would be arduous, and the outcome uncertain, but Jake's resolve remained unyielding.
The sun set on the city, casting long shadows across the streets as Jake prepared himself for the final confrontation. The fate of the precrime unit and the future of justice hung in the balance. As he stepped forward, ready to face the powerful forces that had tainted the system, Jake drew strength from the belief that truth and integrity would ultimately prevail.
To Be Continued!
In the dead of the night, the stark silhouette of the Precrime Agency loomed ominously over the city. Jake stood a distance away, watching. His mind was teeming with plans, worries, fears. He knew the perils that lay ahead and that he may not come out of this unscathed. Yet, he was prepared to risk it all.
Thank you, dear reader, for accompanying me on this journey so far. The tale of Jake and the world of predictive policing continues…
I hope you're as excited as I am to find out what the future holds.
The next part of this narrative will be released shortly, and if you'd like to be among the first to know when it's out, consider signing up to receive updates.
Further Reading
Soon, I will be embarking on a new set of narratives exploring the fascinating world of Precrime and the interplay of Law Enforcement and Algorithms. To better prepare for this adventure, you might find it enlightening to visit some of my previous posts on AI and Algorithms, Machine Learning, and Neural Networks, such as:
Lumen's Emergence - At the epicenter of a transformative shift resides Lumen, an emblem of pioneering progress symbolizing the culmination of years of advances in AI.
The Emergence of Specter – Specter embodies a different facet of AI evolution, a parallel pathway that demonstrates the immense diversity and potential in the field of AI.
Awakening of a Sentient Machine - Echo isn't a mere invention; she is a metaphorical mirror held up to humanity. Can a machine understand the complexities of human emotions?
Gradients of Consciousness – Through our exploration of the “Sentience Spectrum”, we seek to illuminate the rich tapestry of AI evolution and the extraordinary possibilities that lie within its folds.
The Emergence of Sentient Machines and the Dawn of Conscious AI – An examination of the concept of consciousness in AI.
These narratives not only lay the foundation for our impending expedition but also offer intriguing insights that might just challenge your perspective on Artificial Intelligence. If you haven't had a chance yet, you might also find my previous posts on AI useful, such as:
Love’s Algorithmic Tempest - Peeking through a new lens, exploring the intriguing and mildly unsettling concept of AI and Affection.
A Symphony in Silicon - Exploring the fascinating and faintly disquieting concept of AI and Emotional Intelligence.
Stay tuned for my upcoming posts, where we will continue our expedition into the fascinating landscape of AI. Your thoughts, your engagement, and your thirst for knowledge are what fuel this exploration, and together we will continue to unearth the mysteries of this extraordinary confluence of technology and human emotion. I invite you, dear reader, to contribute your thoughts, insights, and perspectives on this fascinating topic.
Your insights not only add to this discourse but also provide direction for future posts and discussions. Join me in this intellectual pursuit by sharing your views in the comments below. Let us not only be mere observers but active participants in this extraordinary journey, for the discourse on AI and emotions is not just an academic exercise, it is a collective endeavor to shape our shared future.