40 hours a week times 52 weeks is 2080 hours. Subtract a few weeks for vacations and holidays, and you get a little less than 2,000 hours. So, basically, this is a little more than one programmer-year of effort, if the estimate is in the right ballpark.
It's gross that the decision not to fix this carries an implicit economic calculation: one programmer-year is more valuable than the freedom being denied to an unknown number of people whom society deems less important. (Granted, the actual situation is more complicated, and the state is constrained by its contract with the vendor, which we can reasonably guess will charge as much as it can contractually get away with rather than the programmer's actual salary cost.)
At least the Department of Corrections has assigned people to do the calculations manually. That's better, but it sounds like they just don't have enough people on it to keep up.
Previously, it seems like there was a single standard, applied universally: 1 day of earned release credit for every 6 days served. The new rules have many more inputs, with lots of caveats: only certain offenses are eligible, the inmate cannot have been convicted of certain other types of offenses, the inmate must have completed specific courses, and the inmate can't have previously been convicted of certain felonies.
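As a toy sketch of how the rule complexity jumps (the enhanced rate, clause names, and categories below are all invented for illustration; only the 1-in-6 base rate comes from the comment above):

```python
# Old regime: one flat formula, applied universally.
def old_credit(days_served):
    """1 day of earned release credit for every 6 days served."""
    return days_served // 6

# New regime (shape only; every rate and category here is hypothetical):
def new_credit(days_served, offense_eligible, disqualifying_conviction,
               prior_disqualifying_felony, completed_required_courses):
    eligible = (offense_eligible
                and not disqualifying_conviction
                and not prior_disqualifying_felony
                and completed_required_courses)
    # Qualifying inmates earn at a faster (invented) rate; everyone else
    # stays on the old 1-in-6 schedule.
    return days_served // 3 if eligible else days_served // 6
```

The point isn't the particular rates; it's that the release-date function went from one input to five, and every extra conditional is another place the software (or the human doing it by hand) can get it wrong.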
The 2k-hour estimate may very well be excessive, but I don't care if it takes 20k hours: if that's the case, they should mothball their software and do it manually. Just calling it a "bug" is misleading, IMO.
You'd have the private prisons and the prison guards union climbing up everyone's posteriors.
Spending money will remain an economic decision until government agencies can be fueled by the righteous indignation of their critics rather than by a line item in their budget. Until you can convert that indignation into legal tender, agencies will remain subject to old-fashioned accounting constraints.
Seems plausible it could balloon.
I’m surprised this doesn’t create a massive liability for the state.
That would basically result in patients not getting their ostomy bags on time, and I can't even imagine what would follow. What would the reactions of patients and their relatives be, what levels of stress would hospitals' employees be subjected to, and so on.
I left the company some months after that, and I don't know what the final decision was, but they'd been warned.
Maybe one day some set of ethical standards will be considered non-functional requirements as important as robustness, security and others.
With technicians being responsible for warning their managers, managers being responsible for assessing risks and documenting their decisions, everything being made transparently and everybody being accountable.
That this problem is allowed to persist seems like an indication that the people in charge believe that prisoners have a low probability of successfully suing the state for damages.
My immediate reaction is that either (1) it is possible, and the story is therefore more nuanced than it might appear at first glance, or (2) it is not possible, and this is an even more egregious problem.
That's just my assumption. Remember, prisoners tend to be from less privileged backgrounds, and some may be very ignorant of how the law works or even functionally illiterate. So things that seem "obvious" to educated engineers may not be obvious to them.
More commonly, though, the people wouldn't even know to contact their lawyer, because they are credited for time served pre-conviction.
I think about that every time I read about another government (or private!) company that wastes tens or hundreds of millions of dollars (or euros or pounds) on custom software.
It seems like there should be 1, 2, or 3 DMV programs. The same for building codes, tax codes, etc. And prison software. You can be more like Massachusetts or Mississippi or Montana (hypothetical examples) but pick one and harmonize with it.
1: "compiled" is the lowest of 3 standards that outside accountants can do; "reviewed" is higher and "audited" is the highest. Even at the compiled level, they mailed out postcards to a certain number of customers asking if they were customers over the past year and had spent this much money. It was fairly easy for the acquiring company's outside accountants to review PwC's work and bring it up to audited standard.
How many unemployment systems, prisoner tracking systems, DMV systems do you need? These are common components across governments.
Example: Login.gov now supports local and state government partners. Your constituent IAM needs can now be met by a federal team that is efficient and competent, instead of every city and state reinventing the wheel (poorly and expensively).
Other states might want to do the same, although the fees would probably differ. So the idea is that 10 or 15 states cluster around one solution for a department, 20 for another, 10 for a third and the rest go their own way. The states would have a lot of power in being able to replace working solution A with B or C. So there's 3 or 4 DMV vendors, there's 3 or 4 unemployment vendors, some for contact tracing (my state of Oregon still hasn't implemented the Google/Apple tracing), and so on.
The current situation is that you know a potential replacement will be late and over budget, you just don't know exactly how bad it will be. And Accenture and IBM like it that way and are very adept at persuading the decision makers that they're very special snowflakes and can't use an off-the-shelf solution.
I know somebody who audits municipalities. We did a graph that showed relations between different players. It’s basically just a big insider club of usually 20-40 people and families that give contracts to each other at the expense of the tax payer.
I think the problem is that unlike our more notable branches, we don't hire experts in the field. I don't mean they're incompetent at technology, but that a problem like this really exists at the intersection of government and technology. We keep hiring general-purpose contractors to build things like this, and then we're shocked when it falls apart in the environment governments exist in.
We need companies that specialize in this intersection. Companies that can keep public sentiment in mind and build an architecture that's flexible in the places where society is. It's the same way that most of us in general purpose IT try to build systems that can adapt to changes in the IT landscape. Put it in Docker so we can run it on a cloud, on bare metal, on k8s and probably on whatever's next. Governments struggle to pivot like that due to funding (how do you argue for funding for features since you can't earn revenue?), and because a lot of it is legislated out of their control. Learning to read the public sentiment is just like us reading trends in a newsletter.
In your cases you have items like: accounting, building codes, tax codes, automobile codes, etc.
While it makes sense to try and harmonize with the general policies, every state, every municipality, and every business is going to have special cases. Even software has edge cases for protocol behaviors.
What would be nicer, imho, is if all of these laws were written in domain-specific languages that specify the law, and then the software could just pick up the definitions signed into law. Lawyers, as they are, feel like a combination of legal interpreters and red/blue security team members, depending on what they are doing.
are there popular languages for implementing these types of DSLs?
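Catala is one real language designed specifically for encoding statutes as executable rules. As a much cruder sketch of the rules-as-data idea the parent describes (every clause and category below is invented), each statutory condition becomes a named predicate, and evaluation reports exactly which clauses failed:

```python
# Hypothetical statute clauses as named predicates over an inmate record.
ELIGIBILITY_CLAUSES = {
    "offense is an eligible type": lambda i: i["offense"] in {"possession"},
    "no disqualifying prior felony": lambda i: not i["prior_disqualifying_felony"],
    "required courses completed": lambda i: i["courses_done"],
}

def evaluate(inmate):
    """Return (eligible, failed_clauses) so every denial is auditable."""
    failed = [name for name, clause in ELIGIBILITY_CLAUSES.items()
              if not clause(inmate)]
    return (not failed, failed)
```

The attraction of this shape is that when the legislature amends the statute, the amendment maps to a diff in the clause table rather than to a rewrite scattered across an application, and a denial always comes with the specific clauses that caused it.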
Kind of defeats the entire purpose of having states to start with.
If we want to make the US a centralized, unitary state, let's do that through the elected central government and not through deferral to IT contractors.
All of a sudden that person could no longer make calls for 30 days, and they had done nothing wrong to deserve it.
If corrections staff were held personally liable for these failures, or the local jurisdiction faced steep financial penalties, it wouldn’t happen. No liability, no responsibility.
That is spot on, and generalizes well.
"iot vendors make post-sales money if they collect data from their device"
"phone vendors make money if they bundle terrible apps with their phone"
"robocallers make lots of money, with historically no fines paid out for violations"
They wouldn't even know the first thing about how to hire someone capable of doing this. They'd have to hire a consultant to hire another consultant.
Here's the relevant statute:
13-1303. Unlawful imprisonment; classification; definition
A. A person commits unlawful imprisonment by knowingly restraining another person.
B. In any prosecution for unlawful imprisonment, it is a defense that:
1. The restraint was accomplished by a peace officer or detention officer acting in good faith in the lawful performance of his duty; or
2. The defendant is a relative of the person restrained and the defendant's sole intent is to assume lawful custody of that person and the restraint was accomplished without physical injury.
C. Unlawful imprisonment is a class 6 felony unless the victim is released voluntarily by the defendant without physical injury in a safe place before arrest in which case it is a class 1 misdemeanor.
D. For the purposes of this section, "detention officer" means a person other than an elected official who is employed by a county, city or town and who is responsible for the supervision, protection, care, custody or control of inmates in a county or municipal correctional institution. Detention officer does not include counselors or secretarial, clerical or professionally trained personnel.
https://www.azleg.gov/ars/13/01303.htm
Assumption being that a detention officer is not acting in good faith if they have a list of people who should no longer be detained under state law.
> Assumption being that a detention officer is not acting in good faith if they have a list of people who should no longer be detained under state law.
I agree with your premise and assertion, but I'm not sure that's exactly what's happening here. I'd like to preface this by saying I absolutely believe there need to be ramifications; I'm just not sure that it fits "clearly defined false imprisonment." I think a category would have to be added to the false imprisonment statute for "negligence" for this to be considered false imprisonment and let me tell you why:
From what I can tell, this article is talking about a couple of massive issues, but the wrongful imprisonment bit is about a specific bug in ACIS: it can't calculate an updated release date for inmates who complete special programs that award additional release credits under an amendment (SB 1310) signed into law in 2019. Since they can't automatically update a release date for individuals who have completed this program, they keep track of it manually. To me, the article doesn't read like they have a list of people who should be released but aren't being released because the software says so; from my very limited perspective, it reads like there are certain programs an inmate can complete to earn extra release credits, and since the system can't track these extra credits, the detention officers do it manually. I would imagine their manual process goes something like this:
1) Compile list of inmates that have earned extra release credits through the aforementioned release programming.
2) Select inmate from list, possibly in order of original release date, earliest first.
3) Calculate the amount of release credits they received from completion of the programming.
4) Calculate the total hours those credits equal.
5) Deduct hours from release date.
6) Manually update the release date in ACIS (likely requiring warden and/or judicial approval, but idk).
6a) Since ACIS now has the appropriate release date, the inmate will be processed for release now (if the date has passed) or as they normally would be.
6b) Remove inmate's name from list unless currently enrolled in early release programming, in which case they are moved to the bottom of the queue.
7) Lather, rinse, repeat.
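The steps above could be sketched loosely as code (all field names and the credits-to-hours conversion are invented; the approval step is elided):

```python
from datetime import datetime, timedelta

def process_backlog(inmates, update_acis, hours_per_credit=24):
    """One pass over the manual queue, following steps 1-7 above."""
    # 1-2) Work the list earliest original release date first.
    for inmate in sorted(inmates, key=lambda i: i["original_release"]):
        # 3-4) Credits earned from completed programming, converted to hours.
        hours = inmate["program_credits"] * hours_per_credit
        # 5) Deduct those hours from the release date.
        new_release = inmate["original_release"] - timedelta(hours=hours)
        # 6/6a) Write the corrected date back to ACIS (warden/judicial
        # approval elided); normal release processing keys off the stored date.
        update_acis(inmate["id"], new_release)
        # 6b/7) Anyone still enrolled stays on the list for the next pass.
        yield inmate["id"], new_release, inmate["still_enrolled"]
```

Even in this toy form, it's easy to see where a tired human with a calculator slips: a wrong credit count, a wrong conversion, or a missed name on the list each silently adds time to someone's sentence.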
Being denied release because of a software error would be hellish for both an inmate and their loved ones. But because it doesn't seem like they have an actual list of people who should have already been released but haven't been because the software made a critical oversight, I don't think it fits the legislation as it exists today for false imprisonment. The tool is broken, so they've switched to manual calculation until someone more important decides it's worth fixing.
If we add negligence to the false imprisonment statute, I'd agree wholeheartedly! But IA[very_much]NAL, so I'll confess I don't really know anything about anything.
EDIT: formatting
[1]: https://corrections.az.gov/sites/default/files/documents/PDF...
See also: employment security sites, cannabis track and trace, driving license, etc.
Some of these bugs cause direct financial harm to citizens and this one is much worse!
Show me the test cases! Show me the code!!
Not arguing against it. State secrets are needed in some instances. Just pointing out that if you exempt something, there'll be people who'll construe as much as they can under that exemption. Is there any solution to that?
I think it can be managed but it is a genuine concern nonetheless.
Granted, that doesn't make attack impossible, but it does make it very hard, especially when you disable all the USB ports and optical drives and socialize extreme consequences to any employees not following ITSEC rules.
Why? Because the spec the tests were written against didn't include some contingency; for example, software that rigidly requires certain steps to happen and doesn't provide a human-controlled override.
There was an Ask HN question the other day where the poster asked if the software we are building is making the world a better place. There were hardly any replies at all. Is this because for the most part our efforts in producing software are actually doing the opposite? It certainly seems that way reading articles like this.
> Instead of fixing the bug, department sources said employees are attempting to identify qualifying inmates manually... But sources say the department isn’t even scratching the surface of the entire number of eligible inmates. “The only prisoners that are getting into programming are the squeaky wheels,” a source said, “the ones who already know they qualify or people who have family members on the outside advocating for them.”
> In the meantime, Lamoreaux confirmed the “data is being calculated manually and then entered into the system.” Department sources said this means “someone is sitting there crunching numbers with a calculator and interpreting how each of the new laws that have been passed would impact an inmate.” “It makes me sick,” one source said, noting that even the most diligent employees are capable of making math errors that could result in additional months or years in prison for an inmate. “What the hell are we doing here? People’s lives are at stake.”
Comments like yours seem to glorify a pre-software world filled with manual entry. The reality is that manual entry is even more error-prone, bias-prone, with more people falling through the cracks.
If nothing else, software can be uniformly applied at a mass scale, and audited for any and all bugs. And faulty software can be exposed through leaks like the above, to expose and fix systemic problems. Whereas a world of manual entry simply ignores vast numbers of errors and biases which are extremely hard to detect/prove, and even then, can simply be scapegoated with some unlucky individuals, without any effort to fix systemically.
Instead, it's one where computers do calculations but don't make decisions; and then humans look at those calculations and have a final say (and responsibility!) over inputting a decision into the computer in response to the calculations the computer did, plus any other qualitative raw data factors that are human-legible but machine-illegible (e.g. the "special requests" field on your pizza order.)
Governments already know how to design human-computer systems this way; that knowledge is just not evenly distributed. This is, for example, how military drone software works: the robot computes a target lock and says "I can shoot that if you tell me to"; the human operator makes the decision of whether to grant authorization to shoot; the robot, with authorization, then computes when is best to shoot, and shoots at the optimal time (unless authorization is revoked before that happens.) A human operator somewhere nevertheless bears final responsibility for each shot fired. The human is in command of the software, just as they would be in command of a platoon of infantrymen.
You know policy/mechanism separation? For bureaucratic processes, mechanism is generally fine to automate 100%. But, at the point where policy is computed, you can gain a lot by ensuring that the computed policy goes through a final predicate-function workflow-step defined as "show a human my work and my proposed decision, and then return their decision."
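A minimal sketch of that final predicate step (all names here are invented): the computer proposes, a named human disposes, and the human's identity travels with the outcome:

```python
def finalize(case, compute_proposal, ask_human):
    """Mechanism is automated; policy is gated by a human with final say."""
    proposal = compute_proposal(case)                # "show my work..."
    approved, reviewer = ask_human(case, proposal)   # "...and ask for a decision"
    if not approved:
        # Nothing happens without sign-off; the refusal is still attributed.
        return {"decision": "rejected", "reviewed_by": reviewer}
    # The reviewer's name is recorded with the result, so responsibility
    # stays legible instead of dissolving into "the computer said so."
    return {"decision": proposal, "reviewed_by": reviewer}
```

The design point is that the human-approval step is not an optional UI affordance bolted on later; it's a mandatory stage in the workflow that the automated parts cannot bypass.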
I think that the pre-software world was quite bias-prone and extremely expensive for large processing jobs like this. The question is how this system was allowed to transition from the expensive manually managed system that used to be in place to the automatic software driven system that is replacing it at such a cut-rate that gigantic bugs were allowed to sneak in.
It appears this software is primarily used by the state government so why was such a poor replacement allowed as a substitute for the working manual process.
Also, the number of bugs this software has accumulated since Nov 2019 (14,000) is astounding enough that I assume it's counting incidents; that's a fair way to go since these are folks' lives, but I'd be curious to know just how bug-laden this software actually is.
Although there is another factor here - this specific release program was a rather late feature addition that may not have been covered in the original contract with ACIS since the bill was only signed into law two months before the software was rolled out.
It doesn’t have to be. But when it’s subjected to the same incentives that produced this software and perpetuated its broken state, we should expect the result to be much the same.
When you pull back and try to look at it with fresh eyes, our prison system is abjectly terrifying. It’s designed to funnel wealth to private entities, not to implement justice or rehabilitate criminals or whatever other worthy goal(s) you might imagine for it. This story (as horrifying as it is just by itself) is only one little corner of the monolithic perversity of the system as a whole, and the executive powers involved in steering that system are about as close to evil as you can find in the real world.
The whole thing needs to be torn down and rebuilt. As long as it exists, it puts the lie to our claim of being a society that values freedom and justice.
Circling back, I guess the point is that the ideas about how to do software in your last paragraph have no chance of being implemented in the system as it currently exists. To fix “systemic problems”, we will have to aim a lot higher with a much bigger gun.
We can no longer afford to partition the people who understand/use business logic from the people who turn it into code and maintain that code. Period. It's ridiculous and endemic at this point. This problem permeates virtually every large organization in existence; public or private.
It's partly an issue of education, partly an issue of organizational structuring, and partly an issue of accessibility of technologies. But the sum of these parts has become entirely unacceptable in the year 2021.
From a professional who works with data systems: you're more likely to have a database with bad data in it than not.
For every piece of software that can directly and materially harm someone's life like this, there should be a chain of responsibility. And within that chain, there should be legal recourse and, in most cases, penal consequences, especially in the case of inadequate software quality/testing/validation, should the software fail to perform its task correctly. Bonus side effect, software quality will go up across the board in the industry.
While I do agree that making software better/more reliable is a good goal, I believe we would be better off making the system as a whole more robust; the system that includes humans. For every situation where a piece of software has control of something that affects society (an individual, a group, etc.), there should always be a clear and direct means of appealing / pushing back on the decision that was made. Those means should involve a human reviewing the information and making a decision based on that information, not on what the computer said. There's thread after thread of us saying the exact same thing about companies like Google and Facebook; it should apply as a general rule.
Or at least a huge share of that burden needs to be on the client so that they define and then test and control the SW they receive properly.
The problems with the software sound like typical big software project problems. Trying to cover a huge breadth of use cases with lots of very important tiny details and released in a big bang (one migration). It sounds like more of a project mgmt problem than a software problem to me.
But maybe I am just a hammer and see nails everywhere.
More people working with a gun to their head. I'd rather the gun be pointed at the person who already has a gun pointed at me, instead of both barrels facing in my direction.
If I'm (or my company is) personally on the hook for bugs, then I'm going to adopt a NASA-like software quality regimen, pushing up the cost of the product.
Every single part of the software stack below me, from hardware, OS, compiler toolchain, disavows responsibility so if I have to absorb all the risk, the product is going to be mind bogglingly expensive.
Because their constituents want people to be punished and if the inmates have to suffer a little extra so be it, "they shouldn't have committed a crime."
Our society is severely lacking in empathy.
No, you know how to blame people and punish people, but that doesn't mean you know how to deliver custom bespoke software for a price that the various government agencies can afford which doesn't have bugs that severely hurt peoples lives.
In fact, punishing people is not going to accomplish that.
That's the problem with a legislature that thinks it can pass any law it wants ("let's take into account this new variable X that our software has no way of collecting or measuring") without looking at the feasibility of actually implementing the law given the infrastructure available, without approving a corresponding budget for software upgrades to actually enact it, and without taking into account how much time it would take to write, test, deploy, and then train people to use the new software. Instead, they issue streams of mandates like Emperor Norton and expect them to materialize into existence like the morning dew. And if said morning dew does not appear, then we can punish and sue the people in charge when they tell us there is no way they can do what we are asking.
Of course there is blame on the prison leadership for covering things up and that leadership should be fired, but you can punish and sue people all day long and it's not going to result in any good code being written. Punish enough people, and it will just result in the Law being repealed.
The problem with this type of bespoke code is that it has exactly 1 customer, so it's going to be horrendously expensive while also being buggy and quickly thrown together compared to software whose development costs are leveraged over millions of customers. And then what happens next year when some crusader decides that they need to take some other new variable into account? Constantly changing requirements, underspecified projects, one-off projects whose schedules are impossible to estimate, and cash strapped local governments. Yeah, that's a recipe for success.
This is why everyone hates enterprise software, but even enterprise software has tens of thousands of customers. Bespoke software for the Arizona prison system -- forget it.
Wikipedia says, "Under common law, false imprisonment is both a crime and a tort".
And don't tell me you can't buy two CRUD applications for 24 million dollars. It's a silly amount of money for such a buggy application.
They aren't cowardly; they are responding rationally to a constituency that hates "criminals". Prioritizing fixing discriminatory systems (such as this software, or "stop and frisk", or the death penalty) is bad electoral politics for "tough on crime" politicians.
I find the government "requirements" process tends to create situations like this. Rather than build flexible software that puts some degree of trust in the person using it, they tend to overspecify the current bureaucratic process. In many cases, the person pushing for the software is looking to use it to enforce bureaucratic control that they have been unable to otherwise exercise, with the effect that the people the project initiator wants to use the software simply work around it. They then institute all sorts of punishments and controls to ensure it must be used. This then results in the kind of insane situation we have here, where you can't do something perfectly legal because "computer says no".
This is frequently my observation as well. In the process of creating stricter control, the bureaucrat increases the power of their bureaucracy while shifting the blame for any problems to a faceless entity.
> They then institute all sorts of punishments and controls to ensure it must be used.
This leads me to one of my primary frustrations with the bureaucratization of our lives. Severe consequences are attached to low stakes situations and rational individuals who see the harm caused by the situation are rendered powerless to make changes.
You are attacking the wrong target. It's the government that's broken. This kind of outrage can happen just as easily with pencil and paper. The root cause is the lack of accountability and desire to make the government function better.
I'll note that this isn't the first time that people have said "well its the algorithm" when they were responsible. The example that springs to mind is bail risk assessments. You're very correct in that there are people making real decisions that are very cruel here. The machines give them something to hide behind.
The fact people are not asking that is worrying. I understand why the system was not designed to do something that happened later (even if it could have been reasonably foreseen) but the fact that it was implemented with no override is really the scandal.
I don't know whether this comes down to an amount of power vested in a Governor that means the rest of the organisation can't say, "sorry Guv, but we can't do this because the software wasn't written to do it". If TV is to be believed, Governors want things done yesterday, and you worry about the problems afterward.
This right here is the difference between conventional engineering disciplines where designs require a Stamp from an Engineer of Record who takes on personal responsibility in the event of design failures vs. the current discipline of software engineering.
There's a big difference between a software developer and a software engineer, and I think that difference should be codified with a licensure and a stamp like it is in every other engineering field in the states.
Software like this ought to require a stamp.
A decent analogy is the environmental work I've done. When we come up with solutions and mitigations to environmental problems, like software, we can't always predict the result because of the complexities involved. So we stamp a design, but we, or the agencies responsible for allowing the project often specify additional monitoring or other stipulations with very specific performance guidelines. It's a flexible system and possible to adapt to, but there are real consequences and fines when targets aren't met. When bad things happen, the specifics of what went wrong and why are very relevant and the engineer may be to blame, or the owner/site manager, or the contractor who did the work, or sometimes no one is to be blamed but the agencies are able to say: "Hey this isn't working and needs to be addressed, do it by this date or else."
In engineering, there's an enormous amount of public trust given to engineered designs. The engineer takes personal responsibility for that public trust that a building or bridge isn't going to fall down. And if you're negligent, it's a BFD.
Given the current level of public trust that we are putting into software systems, it's crazy to me that we haven't adopted a similar system.
This is why penalties are such an important part of the feedback loop. Obviously we can't go back in time and restore someone's phone privileges, but we can award monetary damages for the mistake.
Monetary damages alone won't discourage this behavior, though, as ultimately taxpayers foot the bill. There also must be some degree of accountability for those in charge of the system. Software can't become a tool for dodging accountability. Those in charge of implementing the software, providing the inputs, and managing the outputs must be held accountable for related mistakes.
> There was an Ask HN question the other day where the poster asked if the software we are building is making the world a better place. There were hardly any replies at all.
Few Ask HN questions get many responses. This is also a loaded question, as HN is notorious for nit-picking every response and putting too much emphasis on the downsides. For example, I know farmers who have increased their farm productivity massively using modern hardware and software. However, if I posted that it would inevitably draw concerns about replacing human jobs, right-to-repair issues, and other issues surrounding the space. The world is definitely better off for having more efficient and productive farming techniques, freeing most of us up to do things other than farm.
However, all new advances bring a different set of problems. Instead of trying to force everything into broad categories of better or worse I think it's important to acknowledge that technology makes the world different. Different is a combination of better and worse. The modern world has different problems than we did 100 years ago, but given the choice I wouldn't choose to roll back to the pre-computer era.
> It certainly seems that way reading articles like this.
Both news and social media have a strong bias toward articles that spark anger or outrage. For me, the whole world stops feeling like a dumpster fire when I disconnect from news and social media for a while. I'm looking forward to the post-COVID era where we can get back to interacting with each other in person rather than gathering around a constant stream of negative stories on social media.
Absolutely, and I agree that disconnecting can have positive benefits. On the other hand, at least for me personally, covid has disrupted the mechanisms that normally prevent in-depth observation. It has given me time to read books I normally would not have read, because that time went to things like waiting for my car to warm up so I could get to work on time, commuting, going out to lunch with co-workers, and going out for drinks with co-workers, friends, and family.
What is described in the article is outrageous. My concerns about bureaucracy and software's role in enabling it, on the other hand, have developed separately because I have the time to consider it.
You're a software developer maintaining an eCommerce platform. On the one hand, your platform helps perpetuate low-margin, wasteful consumerism; on the other, your software enables small businesses to compete in the new online world.
Consumerism is bad, but commerce is as old as civilization and supports all of our lifestyles, so on a macro level you're in a tough spot. You're a talented developer putting your skills to work building something the community needs. I personally think that means you're doing good work in the context of your society, but it is difficult to say whether it's making the world a better place.
Social media is the same. On the one hand, it connects family and friends; on the other, it drives narcissism, consumerism, and misinformation.
You almost have to try to calculate the "Net Good" or "Net Bad" of a type of software and see how the cards fall. For social media, for example, I would suggest that it's currently in a "Net Bad" situation, causing more harm than good.
All government software should be open source and anyone should be able to investigate the code and submit bug reports, including inmates. If they know there is something wrong, they have a lot of time on their hands to learn a useful skill to fix these issues.
The government should then not be allowed to close a bug as wontfix or invalid without approval from other citizen watchdogs verifying if a bug report is legitimate.
For a large segment of the US electorate, anything that inflicts pain on "bad people" is "making the world a better place".
If the software was causing prisoners to be released early, most US voters would be up in arms. But if they're being held too long, the calculus is different. In software terms, for many Americans, a "tough on crime" outcome is a "feature not a bug".
But as the complexity goes up and the number of these complex situations increases, are we reaching a point where we outstrip the amount of money, talent and experience our institutions would need to deliver solutions to successfully manage them?
With our resources and intelligence as a species being capped, it seems at some point this is inevitable.
Software does not have its own will. Software is only allowed to make decisions on our behalf because we let it do so.
I do agree that software has no will. It is a tool for facilitating our will for better or worse.
You can see in the film Brazil, from 35 years ago, that this was already a problem and concern even without modern software.
I think the most likely explanation is just that people didn't see the question or weren't interested in having the discussion. Most people believe the work they're doing is at worst neutral. A less likely candidate for the reason (but still more likely than your guess) is that people didn't want to be subjected to unfounded criticism of their work from people who don't know anything about it.
The second- and third-order consequence is that developers will insulate themselves behind licensing and proofs of practice like every other industry.
Until people actually advocate for real penalties for such harmful violations they don’t care. All their temporary whining and crying is just blowing smoke up our asses.
No, it's now all about "extracting value", "rent seeking", "subscriptions", "censorship", "monopoly" and "control". We got bribed by FAANG and this is the consequence.
It would be hard to see this in e.g. Scandinavian countries, where incarceration is seen as rehabilitative rather than punitive.
In the US, racial discrimination, free market extremism along with "tough on crime" laws have created unimaginably cruel systems; together with private prisons, the goal has been on cutting costs rather than rehabilitating prisoners. Software is just a tool to further that goal.
I brought up the Ask HN question mostly because I felt the lack of replies was a silent acknowledgement of the realities of most software endeavors: that they are not making the world a better place. Most aren't going out of their way to make it worse. Probably, it isn't even a consideration.
Even if ideas like "the medium is the message" are only partially true, and then only partially applicable, that should give us pause when we try to cross tools out of our morality equations.
Here's a thought: Why do we permit private companies to not hire ex-cons? Why do you just get to decide that you don't want to hold up your civic responsibilities like that? Who wants to work with someone who used to be a violent maniac, a sleazy thief, or worse?
I agree about cost cutting measures and the criminal justice industrial complex. Still we have bigger issues around crime and reconciliation that prevent us from making progress. To be honest, I have trouble understanding how we're going to change, unless the average person can live with someone ruining their life, then spending "only" a year or so in prison and moving on to be successful in a decent paying job.
We still find that outrageous in the US, and it's going to be very tough to make progress that way. It's not about making something "a goal", especially in a country like the US, it's about convincing the wealthy and powerful class to do anything at all about it and stop making it worse.
https://www.youtube.com/watch?v=wzFmPFLIH5s
Highly underrated movie, with ever more contemporary relevance.
If it costs the prison 10x the normal cost to do the calculations by hand... well, that's the cost of doing business.
If that description is accurate, that doesn't meet the definition of a "software bug", if the software was produced before that law was passed, and not updated since.
The bug is in the process of not having a plan for updating the software in a timely way when laws change, and not having a requirement in place for overriding the calculations in the interim.
What if an inmate suddenly receives a pardon?
https://media.kjzz.org/s3fs-public/styles/special_story_imag...
My wife had a citation that affected our liberties. The cop even knew that he didn't have probable cause but let the charge stand for more than a month. Nobody in the system cares. The magistrates and judges don't care, even though the new charge should be dismissed with prejudice over this and other rights violations. The supervisors and IA for the state police don't care and even cover some of the stuff up. The DA's office doesn't care either.
IT'S ALL ABOUT THE MONEY
FTA: “Currently this calculation is not in ACIS at all,” the report states. “ACIS can calculate 1 earned credit for every 6 days served, but this is a new calculation.”
tl;dr: a new law was passed that allows for a different credit schedule for days served, and the system hasn't been updated to make that calculation.
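To make the gap concrete, here is a minimal sketch contrasting the old flat rule with an eligibility-gated rule like the one the thread describes. The field names, the 1-per-3 ratio, and the fallback behavior are all placeholders, not the actual terms of SB1310:

```python
from dataclasses import dataclass

@dataclass
class Inmate:
    days_served: int
    offense_eligible: bool = True        # offense type qualifies under the new law
    disqualifying_priors: bool = False   # prior convictions that bar the new rate
    completed_programming: bool = False  # required courses finished

def earned_credits_old(inmate: Inmate) -> int:
    """Old flat rule: 1 day of earned release credit per 6 days served."""
    return inmate.days_served // 6

def earned_credits_new(inmate: Inmate) -> int:
    """Hypothetical new rule: a more generous rate (1 per 3 days here is
    a placeholder ratio) that applies only when every eligibility gate
    passes; otherwise fall back to the old flat rate."""
    eligible = (inmate.offense_eligible
                and not inmate.disqualifying_priors
                and inmate.completed_programming)
    return inmate.days_served // 3 if eligible else earned_credits_old(inmate)
```

Even in this toy form you can see why "just add the new calculation" touches more than arithmetic: the new inputs (offense class, priors, program completion) may not exist as fields in the old system at all.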
Of course, if there's money to be made in having a change-resistant system, well that's a different story. YAGNIAYWPTTNFI (You ARE gonna need it, and you will pay through the nose for it) isn't quite as catchy though
This doesn’t violate YAGNI.
1. You’d have to know in advance what the scope of rule changes would be in order to implement the configuration system.
Human laws do not fit this constraint.
2. You’d also need a way to prove that the configuration system itself was sound.
3. You’d need a way to test configurations to make sure they executed as expected.
That is likely to be no better than just updating the codebase as requirements change, and there are many ways it could increase the cost.
PrisonChain©
Also, proof that newer things aren't always better.
> “We have a couple modules they spent millions of dollars on that we can’t use at all,” a department source said.
> The ACIS software system replaced an older program called AIMS that had been in operation for more than three decades.
To add a slightly contrarian perspective, this reminds me of what I think was an old HN discussion about payroll systems.
One of the comments was along the lines of:
BEGIN QUOTE
You think it's "just payroll"? What about the following:
- The person who gets paid at 6/11ths of wage because there is a split contract
- Or the guy who gets three pay checks because he retired so gets a pension, is a contractor and is working for two different departments?
- etc
END QUOTE
The point being, these things sometimes seem simple but can get astoundingly complex very quickly.
Case in point: the system is designed to track gang affiliation, personal property and health issues.
What about the guy who is in a gang now but was in a different gang last year? And his stuff is in another block because his block's storage area is full? And he can't take the top bunk because he has gout(or diabetes etc) and can't climb anything?
And what if even with a perfect gang tracking system, you don't have enough "bins" to actually separate everyone who should be separated? How do you account for that in the system? Where do you track the "gang pairing priority list"?
And I know what some folks will say: "Yeah, that's just a rules engine that you keep separate from the code. Easy!" OK. So how do you version control the rules engine? Do you build a dry-run option to see what a small tweak to the rules does? In my experience, probably not.
Long story short, jails are incredibly complex environments with a multitude of complex dimensions to account for. If you are curious about this, I highly recommend reading "Jailhouse Doc" [0]. It's an incredibly insightful look into JUST the medical side of prisons.
[0] - https://amzn.to/2ZJnVZG
Having a lot of unique edge cases is not really an excuse not to even attempt an adequate solution. Everything is manageable if you take a high enough view of the system.
> Do you build a dry-run option to see what a small tweak to the rules does? In my experience, probably not.
In MRP systems this is done regularly. I'm sure it's done regularly with most logistics software as well. Scheduling is an inherently difficult problem in computer science, not unlike the traveling salesman problem. There are algorithms you can employ to reduce your workload in solving the problem, but in high-stakes systems even the best algorithms are going to employ at least one validation run, if not several brute-force attempts. Most MRP software has features to enable multiple schedules, so you have a production schedule and a playground environment for finding your best opportunity cost.
> Long story short, jails are incredibly complex environments with a multitude of complex dimensions to account for. If you are curious about this, I highly recommend reading "Jailhouse Doc" [0]. It's an incredibly insightful look into JUST the medical side of prisons.
Bill Gates once said he liked to find lazy engineers to solve complex problems because they usually find a simple solution. In my experience, if the solution to a problem isn't programmatically obvious, you need to stop thinking in tangible items (or in this case, people) and start thinking in object oriented terms. For example, it is usually possible to turn a one-and-done conditional statement into a loop. There is a way to scale just about any program, you just need to view the problem from a higher altitude.
Imprison fewer people so that human review of life-critical software applications is faster and less costly.
Suddenly there is an incentive to create a verifiable and correct system on the part of the prison-industrial complex itself.
Any "third party" that doesn't belong to the prison-industrial complex becomes part of it if they get involved with this.
I haven't been there in a while, but this is how it was when I lived in Moscow.
I was just thinking that the executives at the prison are clueless about software, and likely couldn't hire a competent consultancy even if they wanted to.
And also that the prison execs are generally OK with bugs like this one, which keeps people in prison longer. I presume the prisons make more money with more prisoners. (Although now some prison staff need to do extra manual work, but someone wrote elsewhere that that was only for squeaky-wheel prisoners, i.e. those who knew enough to complain.)
And maybe the consultancy agency prefers to build software that actually never starts working completely, so they can continue billing the prisons forever?
It's a bit as if the prisons, and the consultancy agency, cooperated with each other, both of them making money, by exploiting the defenseless prisoners?
Was what I was thinking.
What are your thoughts?
The numbers I've been able to find suggest that Arizona's overall state prison population is well below capacity, suggesting that it is likely that the private prisons are operating in the fixed price range of their contracts.
If that is the case, the prison company makes more money when a prisoner is released than when the prisoner is retained.
Thanks for that, actually
However, by that same turn, it's been proven in court [3] that the state is on the hook for the minimum capacity % even when that capacity isn't met, and these contracts are fairly long (10 years + 2 renewable 5 year terms [5]). If the state is on the hook for up to 20 years of paying 100% to use 90% of something, there's a clear incentive to make a show of it being used.
So while there's no direct way for a private prison to fudge release date calculation, there's definitely no incentive for it to press the issue with the Department of Corrections considering the DoC could renegotiate the contract the following year [6]. As for the people in state office, it does not look or feel good to squander millions on an overestimation.
I'm not saying corruption is involved; all I'm saying is that there's certainly motive and opportunity for negligence.
Some sources for anyone interested:
[1] http://www.aublr.org/2017/11/private-prison-contracts-minimu... (look for source #14)
[2] A.R.S. 41-1609.01#P https://law.justia.com/codes/arizona/2011/title41/section41-...
[3] MTC vs ADC https://www.prisonlegalnews.org/news/2015/jul/31/report-find...
[4] https://kjzz.org/content/1647948/despite-declining-populatio...
[5] A.R.S. 41-1609.01#I,J https://law.justia.com/codes/arizona/2011/title41/section41-...
[6] A.R.S. 41-1609.01#C,D
Nope. It beeped like normal and the guard moved on to the next inmate in line. What was I supposed to do---tell the guards that I was supposed to be released? Riiiiight. It took the actions of a guard I had befriended early on to see me later and say "What the fuck are you still doing here?" to get me out of there.
Also, money can be given back, time can't.
Language is the key, as usual: it builds a mental model different from reality and sets the context of the discussion, obscuring the real issue and skewing it toward the angle the speaker wants ("software bug keeping inmates"). It isn't the software that keeps the inmates; it is the people employed in that branch of government, and ultimately "we, the people". Blaming "the computer" is an excuse as old as the pyramids, and we still fall for it. Even more so today, I think.
So the prison administration knows there are people being held who shouldn't be held, but it is still keeping them behind bars.
This is not a software problem.
But now I've learned that that could, in fact, be a potential problem in the future...
If so, you could somewhat automate this, or at least allow incarcerated people to cross check their own situation by providing their data.
If you can automate this, you could allow extensive lawsuits to pressure the penal system to get this disgusting problem solved.
It's been about 470 days since then. That works out to roughly 30 bugs per day. At least they have an impressive process in place to report and manage bugs. Or is it 14,000 reports of the same bug?
>“It was Thanksgiving weekend,” one source recalled. “We were killing ourselves working on it, but every person associated with the software rollout begged (Deputy Director) Profiri not to go live.”
>But multiple sources involved in the rollout said they were instructed by department leadership to “not say a word” about their concerns. “We were told ‘We’re too deep into it — too much money had been spent — we can’t go back now.’”
"Prisoners released early by software bug (2015)" https://www.bbc.com/news/technology-35167191
There should be a class action lawsuit filed and those who let it slide should be responsible, including compensation for those held beyond their sentence / early release. Yes it's easy to say and there are few champions for prisoners in our society. But it is how we fix these types of issues (i.e., petitioning for harm in court), regardless of the origin (software or otherwise).
Perhaps there should be more class-action lawsuits on behalf of convicted prisoners in general.
Geez..
Bugs are when software does something it isn't supposed to, or doesn't do something it is supposed to. In this case, it's doing exactly what it was intended to do when it was implemented and put into service. Since then things have changed, so the vendor needs to implement the feature request, not "fix the bug", but this takes time.
> They estimated fixing the SB1310 bug would take roughly 2,000 additional programming hours.
wtf?
Guessing they estimated 1 person-year, but that's absurdly high.
If the government wants to change the logo on the login screen, they're gonna pay 1 person-year. Why would I quote less? It'll take over 1 person-year for another contractor to pick up and start supporting the codebase, and we won't support a codebase that another contractor has touched.
It's just the nature of government IT. Lowball the initial quote, then charge massively for any modifications and support now they're locked in.
It's government work, which automatically means access issues, dozens (or hundreds) of stakeholders requesting meetings, internal politicking, back-and-forth over change requests, etc. All of that costs real money and if you don't plan for it or make contingencies, you're going to be screwed.
These issues are not just limited to government either; any sufficiently large entity will have these same problems. So yeah, one person-year is a reasonable minimum viable contract period unless you have a process in place to fast-track RFP approval.
but I'm not sure what the reality is.
Software needs to be updated and maintained and you never know when you start writing it what the real requirements are. If I were a taxpayer in this state I’d be angry that my money would be going to a series of middlemen (e.g. a procurement consultant, program manager on the gov side, a contractor manager, the contracting company’s cut) rather than some state employed software developer.
I also find the government "requirements" process tends to create situations like this. Rather than build flexible software that puts some degree of trust in the person using it, they tend to overspecify the current bureaucratic process. In many cases, the person pushing for the software is looking to use it to enforce bureaucratic control that they have been unable to exercise otherwise, with the effect that the people the project initiator wants to use the software simply work around it. They then institute all sorts of punishments and controls to ensure it must be used. This then results in the kind of insane situation we have here, where you can't do something perfectly legal because "computer says no".
Sure, it sounds like a public-sector employee gives better value but if you want to produce commercial quality then you still need management, consultancy and high quality HR with competitive salaries, otherwise you usually get mediocre developers with mediocre results.
Sometimes it really is better to pay a premium on the "day rate" to get something more quickly and to a higher standard. You also often have access to better support and maintenance.
You pays your money and takes your choice.
Are you willing to pay market rates to retain that developer? Or deal with the constant churn as people stay long enough to pad their resume before moving along to the next job?
I mean, things probably seem from the outside like "it's just two lines of code", but if you're messing with the business logic of releasing prisoners, I'm sure you'd want to document how things currently work pretty thoroughly (inferred requirements), test/validate, have the customer sign off on the existing functionality, make your change, re-run all the testing, requirements verification, etc. There's probably a business analyst involved, a programmer, maybe even a dedicated test manager, devops (do they have an existing dev/test system?). People in the prison bureau probably take their jobs seriously on paper, don't want this change to cause more breakage than it fixes, and expressed that in the RFP.
If you have a choice between the regional furniture store and your state government, I'd have a hard time advising you on which one will suck less. It's a tossup. If the owners of the furniture store are old enough to have heirs who are late teens to 30-something, take the furniture store, because you can ask them to intervene.
At some point you have to pay the bills. And destress your employees who are constantly having to hurry up and wait.
I'd also be careful with the term "programming hours". I'm not sure how the news article got that or who said that initially, but it seems like a misrepresentation of the type of work needed. That estimate almost certainly includes everything involved in getting the code to production. You can imagine that means a lot of QA, red tape and holding the code's hand through environments.
If you got the cheapest possible software up-front that (barely, technically) met the original requirements, you did it by hiring the least skilled people that could barely pull off exactly what would get the contract paid, and asking them to rush and hurry and pile up technical debt.
So.
In this case, the software perhaps shouldn't have even been considered fulfilling the contract in the first place, that's how crappy it was. The crappier the software, the more expensive changes to it are, we all know that.
> One department whistleblower said the number of problems with the ACIS system was unprecedented in their professional experience. “I have never in my life run across an application like this,” they said. “It’s just been one big cluster.”
I also don't understand why they can't identify these people and release them by other means...
This is partly a failure of education, partly a failure of organizational structuring, and partly a failure of software accessibility. But it's a gargantuan failure all the same.
It might be good to have a non-terminological discussion now. Though this subthread isn't too bad.
I am a programmer myself. The shit you can get away with by claiming a software error is ridiculous and, quite frankly, dangerous. A bureaucracy's dream is being able to blame every bad outcome on a software error. We all know software errors are like a higher force (divine intervention, a natural catastrophe): nothing can be done about them.
It is time we start taking our profession seriously, own our mistakes, and collectively raise the stakes for our errors.
I believe holding a person against their will is a criminal act; it seems like most of the employees of the Arizona correctional facilities are now guilty of crimes worse than the majority of their inmates.
As a former prison inmate, I honestly think that most of the people affected will be quite happy with the monetary compensation that they'll receive once the courts get done with this.
> seems like most of the employees of the Arizona correctional facilities are now guilty of crimes worse than the majority of their inmates.
This is destructive thinking. There are surely some employees without relevant authority who are thinking "screw those animals", but there are surely other employees, also without relevant authority, who are sympathetic to the affected prisoners and want to see their cause prevail. None of the employees who work on the actual prisons have any responsibility for this. Releasing a felon is a huge deal and can only be done when ordered. Only the top people at the DOC, at their headquarters, have any authority that could possibly help in this situation.
About the "crimes worse than the majority of their inmates" part: If you compare to the crimes inmates were actually convicted of, perhaps they don't look so bad. But quite a few inmates have claimed to me to be guilty of crimes much worse than what they were convicted of. People get caught on the drug charges, but often get away with the violence. What percentage of rape in the hood do you think gets reported to police? 1%? Less?
I hope every one of the prisoners held past their release date sues and wins.
Even laws like kidnapping have a motive portion, and basically an incompetent cop or other official can do what they want and just claim they sucked at their job, so the motive portion can't be met.
Fixing such an issue should be top priority, and in the meantime, you would revert to pen and paper accounting to ensure the correct release of each prisoner.
Expect delays, such is the nature of legacy software.
So, some French brogrammers are preventing US citizens from being released from prison due to incompetence.