Transparency And Accountability In Government Forensic Science

In February, I learned that the New York State Assembly was planning a public hearing on government oversight of forensic science laboratories, and I was invited to offer ten minutes of testimony and then answer legislators' questions. The hearing was held jointly by the Assembly Standing Committees on Codes, on Judiciary, and on Oversight, Analysis and Investigation, and it was my first time speaking in this sort of capacity. I spoke on the importance of auditability and transparency in the software used in devices the government relies on in laboratories and field tests, and on open source as an approach to improving both. I also testified to the efficiency, cost savings, security, and quality gains available by using open source software, and by reusing and sharing it with other state governments. Here's a PDF of my testimony as written; video and audio recordings are available, as is a transcript that includes answers to the legislators' questions. It is a thrilling feeling to see my own words in a government hearing transcript, in that typeface and with those line numbers!

As I was researching my testimony, I got a lot of help from friends who introduced me to people who work in forensics or in this corner of the law. I also found an article by lawyer Rebecca Wexler on the danger of closed-source, unauditable code used in forensic science in the criminal justice system, and got the committee to invite her to testify as well. Her testimony is also available in the recordings and transcript I link to above. And today she has a New York Times piece, "How Computers Are Harming Criminal Justice", which includes specific prescriptions:

Defense advocacy is a keystone of due process, not a business competition. And defense attorneys are officers of the court, not would-be thieves. In civil cases, trade secrets are often disclosed to opposing parties subject to a protective order. The same solution should work for those defending life or liberty.

The Supreme Court is currently considering hearing a case, Wisconsin v. Loomis, that raises similar issues. If it hears the case, the court will have the opportunity to rule on whether it violates due process to sentence someone based on a risk-assessment instrument whose workings are protected as a trade secret. If the court declines the case or rules that this is constitutional, legislatures should step in and pass laws limiting trade-secret safeguards in criminal proceedings to a protective order and nothing more.

I'll add here something I said during the questions-and-answers with the legislators:

And talking about the need for source code review here, I'm going to speak here as a programmer and a manager. Every piece of software that's ever been written that's longer than just a couple of lines long, that actually does anything substantive, has bugs. It has defects. And if you want to write code that doesn't have defects, or if you want to at least have an understanding of what the defects are so that you can manage them, so that you can oversee them (the same way that we have a system of democracy, right, of course there's going to be problems, but we have mechanisms of oversight) -- if, in a system that's going to have defects, we don't have any oversight, if we have no transparency into what those instructions are doing and what the recipe is, not only are we guaranteed to have bugs; we're guaranteed to have bugs that are harder to track down. And given what we've heard earlier about the fact that it's very likely that in some of these cases there will be discriminatory impacts, I think it's even more important; this isn't just going to be random.

I'll give you an example. HP, the computer manufacturer, made a web camera (a camera built into a computer or a laptop) that was supposed to automatically detect when there was a face. It didn't see black people's faces, because it hadn't been tested on people with darker skin tones. Now, at least that was somewhat easy to detect once it actually got out into the marketplace, and HP had to absorb some laughter. But nobody's life was at stake, right?

When you're doing forensic work, of course, in a state the size of New York State, edge cases, things that'll only happen under this one combination of conditions, are going to happen every Tuesday, aren't they? And the way that the new generation of probabilistic DNA genotyping and other more complex bits of software work, it's not just: okay, how much of fluid X is in sample Y? It's running a zillion different simulations based on different ideas of how the world could be. Maybe you've heard of the butterfly effect? If one little thing is off, you know, we might get a hurricane.
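To make that butterfly-effect point concrete, here is a toy sketch in Python. This is not any real genotyping algorithm; the function, the per-locus probabilities, and the locus count are all invented for illustration. The point is only that when a model multiplies many per-step probabilities together, one slightly-off input compounds into a large swing in the final number.

```python
def compound_probability(per_locus_prob, n_loci=20):
    """Multiply one per-locus probability across n_loci loci.
    A toy stand-in for how probabilistic models combine many
    pieces of evidence -- NOT a real genotyping model."""
    result = 1.0
    for _ in range(n_loci):
        result *= per_locus_prob
    return result

# Two runs that differ only in one slightly mis-set input:
intended = compound_probability(0.90)  # the value the analyst meant
mis_set = compound_probability(0.85)   # e.g. a wrongly hard-coded constant

# A 5-point error in one parameter becomes roughly a 3x swing in
# the final figure -- the kind of number a courtroom might hear.
swing = intended / mis_set
print(round(swing, 2))
```

Under these made-up numbers, the two runs disagree by more than a factor of three, even though no single input was off by more than five percentage points. That is why source code review matters: without it, nobody can check which constant was actually used.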

Trashing, Pile-ons, Accountability ... and AEDs

I've written a new MetaFilter post, "Distinguishing character assassination from accountability", pulling out quotes from eleven writers from the past 40 years on how we take and charge each other with responsibility and power within communities, and in particular how we do accountability in progressive groups -- from Jo Freeman and Joanna Russ discussing "trashing" in the US feminist movement to people in the last few years and weeks talking about times to get on the phone, making trusting relationships for accountability, and lessons from Occupy. Perhaps the most immediately useful link in there is this "pod" discussion and mapping worksheet.

Speaking of MetaFilter: after the US election in November, I decided to take some concrete steps to be a better neighbor, so I took a CPR and first aid class. In it I learned about how amazing and underappreciated automated external defibrillators are. I did a bunch of reading and wrote up this MetaFilter post:

If someone had a heart attack right next to you, could you get to your nearest automated external defibrillator, grab it, and use it within 3-5 minutes of their collapse? More and more, the answer is yes, because of Public Access Defibrillation (PAD) programs (that statement is from 1995; 2015 update to AHA guidelines).

On average, when a person in the US calls 911 because someone's suffered cardiac arrest, emergency medical responders get to the scene in 8-12 minutes (Red Cross) -- but for people suffering cardiac arrest, for every minute defibrillation is delayed, the chance of survival goes down about 7-10% (American Heart Association, PDF). Bystanders (even untrained ones) who use AEDs on victims can save lives; "Application of an AED in communities is associated with nearly a doubling of survival after out-of-hospital cardiac arrest."

But where's your nearest AED?...
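The arithmetic behind those two figures is worth spelling out. Here is a back-of-the-envelope sketch, assuming, purely for illustration, a linear decline and a notional starting survival chance (the real curve depends on witnessed vs. unwitnessed arrest, bystander CPR, and more):

```python
def survival_estimate(minutes_to_defib, start=0.90, decline_per_min=0.10):
    """Linear back-of-the-envelope model: survival odds fall by
    decline_per_min for each minute defibrillation is delayed.
    The starting value and the linearity are illustrative
    assumptions, not clinical figures."""
    return max(0.0, start - decline_per_min * minutes_to_defib)

# A bystander reaches a public AED within 3 minutes:
with_pad = survival_estimate(3)      # about 0.60 under these toy numbers

# Waiting 10 minutes for emergency responders instead:
without_pad = survival_estimate(10)  # about 0.0 under these toy numbers
```

Under these toy assumptions, the gap between a 3-minute and a 10-minute response is the whole difference between likely survival and near-certain death, which is the case for Public Access Defibrillation programs in one line of arithmetic.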

In that post, I commented about how difficult it can be to get PAD data, in New York at least. I ended up sending in a document request to the New York State Department of Health, and need to review what they sent back to me. Also I happened to mention my amateur AED research while talking to my city councilmember at a local Democratic Club meeting, so he might be introducing a bill soon to make the PAD data for NYC more accessible? So that's cool.

Resilience

Fifteen years ago, in my last semester of college, I was planning to set up my own desktop support business while supporting myself as a substitute teacher. I took and passed the California Basic Educational Skills Test, making me eligible to work as a substitute teacher. Then, in late May, just after I thought I had graduated, I found out that I'd made a mistake and I hadn't quite graduated, and to get my bachelor's I'd have to take another class. I took a six-week summer school class that met 4-6pm on weekdays. I started running out of money. I couldn't find temp work that would be fine with me leaving at 3:30pm to make it to class, and I didn't want to ask my parents or Leonard for more money. I started looking for jobs. I felt restless and embarrassed. In early July, I finished the summer school class, and on July 15th, I accepted a customer service job at a bookstore. I stayed there for about a year and then went to work for Salon.com, and I never got back to the teaching and desktop support plans.

Why?

Monday and yesterday, I was riding back home from WisCon with my friend Julia, and I was telling her this, and I was looking back and asking: why? Once I finished the summer school class, why didn't I go back to the plan that I had cared so much about and crafted with such ambition? Right now I'm fairly happy with where I am, but why did I give up on the thing I'd wanted to do?

I look back and I see that my mental health is better now than it was then, and I see that my parents -- though I think they wanted to be supportive -- didn't nudge and remind me, "hey, you can get back to your old plan now"; Mom wanted me to find a way to regular employment, particularly with a government. And I so wanted to be independent of my parents and my boyfriend that a regular paycheck was enticing -- and I didn't even consider using unemployment assistance or a credit card to give me more financial leeway. But more than all that, I just wasn't good at the skill of resilience when it came to big life plans and projects. I didn't feel like I was particularly in control of my own life, I think, and so when a big unexpected obstacle popped up, I just defaulted back to taking the opportunities that were in front of me instead of working to make my own.

This morning, catching up on friends' blogs, I see Mary Anne Mohanraj (whom I met eight years ago at WisCon):

...she thought the main difference between me and a lot of other people, is that when I want something, I tend to just try to do it, whereas she, and lots of other folks, would waste a lot of time dithering.

I think that's probably accurate. And I could try to unpack why that is, why I don't tend to hesitate, though I'm not sure I know. Some of it is base personality, some of it, I suspect, is cultural and class background -- being raised in a comfortable economic situation with parents who trained me to work hard, but also expected that I would succeed at whatever I put my hand to.

That gives me a baseline confidence that makes it relatively easy for me to try things, and even when I fail (I flunked calculus, I failed my driving test the first time, I have messed up far more sewing projects than I've succeeded at, I have plants die all the time because I forget to water them, etc. and so on), it mostly doesn't get to me. I can shake it off and either try again, or just move on to something else.

All this reflection is bouncing around in my head, jarring loose thoughts on adaptability, confidence, entrepreneurship, Ramsey Nasser on failure, saying no, danah boyd on the culture of fear in parenting, Jessica McKellar on why she teaches people to program, the big and increasing emphasis Recurse Center puts on self-direction in learning, etc. Love and strength and fear. You know, the little stuff. ;-) Onwards.