Secure Communities by the Numbers, Revisited (Part 1 of 3)

By Jessica M. Vaughan and W.D. Reasoner on December 8, 2011



W.D. Reasoner (a pseudonym) is a retired government employee with many years of experience in immigration administration, law enforcement, and national security matters. Jessica M. Vaughan is Director of Policy Studies at CIS.


Summary

This report is the first in a series that will examine outcomes of the Immigration and Customs Enforcement (ICE) agency’s Secure Communities program and how those outcomes have been misleadingly described in one widely circulated study. These misrepresentations, made in a paper titled “Secure Communities by the Numbers: An Analysis of Demographics and Due Process”,1 published by the Earl Warren Institute on Law and Social Policy at the University of California, Berkeley, Law School, have been uncritically retold by major news media outlets. The Warren Institute report and our reports are based on the same database of actual case histories provided by ICE in response to a Freedom of Information Act request.

Our findings:

  • Contrary to what the Warren Institute reported, the database contains no records of U.S. citizens who were detained by or for ICE. While there are six records for U.S. citizens in the data set, none were listed as having been detained or charged by ICE. Therefore, it is impossible to assert based on this data, as the authors do, that thousands of U.S. citizens, or any number of U.S. citizens, have been arrested by ICE through Secure Communities.
  • The Warren Institute report contains serious methodological and interpretive errors that lead the writers to unsubstantiated conclusions and cast doubt on the credibility of the entire analysis. For example, the authors analyzed only 23 percent of the original random sample requested from ICE. For reasons they fail to explain, they excluded 1,275 of the 1,650 records in their random sample. This is a key issue because they claim, but fail to show, that this data is still representative of the Secure Communities program as a whole. Instead, we believe that their data decisions very likely introduced a significant degree of selection bias that skewed the results.
  • Despite these flaws, which were not difficult to detect, the report’s findings were reported uncritically by a number of leading news outlets, most notably the New York Times. Moreover, these journalists did not note that the organizations that produced this report have been involved in national advocacy efforts to end Secure Communities and scale back immigration law enforcement.
  • ICE’s failure to counter the report’s misleading statements is contributing to the spread of misconceptions about Secure Communities among the media, state and local leaders, and the public. This raises doubts as to the agency leaders’ commitment to full and effective implementation of the program.
  • We agree with the Warren Institute authors on the issue of the need for improved transparency at ICE and its parent Department of Homeland Security (DHS). Nevertheless, we find that the authors should have made more of an effort to seek explanations from ICE for some of the obvious data integrity problems.

Introduction

In October 2011, the Earl Warren Institute at the University of California, Berkeley, Law School and the Benjamin N. Cardozo School of Law jointly published a report, “Secure Communities by the Numbers: An Analysis of Demographics and Due Process”. The report (which we will refer to from here onward, in shorthand fashion, simply as “SCBTN”) focuses on the Secure Communities program operated by an agency within the Department of Homeland Security (DHS): Immigration and Customs Enforcement (ICE). The Secure Communities initiative uses electronic systems to compare fingerprints of individuals taken into custody by state and local police against a DHS fingerprint repository to identify aliens arrested for crimes who may be subject to removal (deportation) from the United States.

The report’s authors made some disturbing findings and assertions, among which were the following:

  • Arrests of United States Citizens. “Approximately 3,600 United States citizens have been arrested by ICE through the Secure Communities program.”
  • Racial / Ethnic Profiling. “Our analysis … raises serious concerns about the level of screening and potential targeting of certain social groups … . Latinos comprise 93 percent of individuals arrested through Secure Communities though they only comprise 77 percent of the undocumented population in the United States … . Community and advocacy groups have also asserted that Secure Communities is, in some jurisdictions, masking local law enforcement agencies’ practice of racial profiling.”
  • Due Process and the Operation of Law. “Only 24 percent of individuals arrested through Secure Communities and who had immigration hearings had an attorney compared to 41 percent of all immigration court respondents who have counsel … . Only 2 percent of non-citizens arrested through Secure Communities are granted relief from deportation by an immigration judge as compared to 14 percent of all immigration court respondents who are granted relief.”

We decided to review the analysis and findings of the report in order to determine whether they are well-founded. To do this, we asked ICE officials to provide us with exactly the same dataset that they had given to the report’s authors so that we could examine it for ourselves. They provided us with a spreadsheet that they stated complies with our request. A read-only copy of the dataset that was provided to us can be found at https://cis.org/sites/cis.org/files/articles/2011/sc-by-numbers-dataset.xls for those who wish to examine it themselves.

We determined that the SCBTN report is flawed in several significant areas, and decided that publishing our own findings was important in the public interest because a number of members of Congress, special interest groups, and opponents of immigration enforcement are already quoting the report2 — including those portions most erroneous or suspect — in interviews and articles with media outlets, many of which have uncritically accepted the findings and analysis of the report.

The most notable example was an article written by New York Times immigration reporter Julia Preston, which ran on October 19, 2011, one day before the official release of the study on October 20. Preston’s lead paragraph said, “A deportation program that is central to the Obama administration’s immigration enforcement strategy has led disproportionately to the removal of Latino immigrants and to arrests by immigration authorities of hundreds of United States citizens, according to a report by two law schools using new, in-depth official data on deportation cases.” Preston later presented the findings on the public radio morning news program The Takeaway.3 As we will show, neither of these allegations is substantiated by the data used in the report. While Preston scaled back the Warren Institute’s estimate of the number of U.S. citizens who were allegedly arrested by ICE and included statements by ICE Director John Morton rejecting the findings, her story still preserved the insinuation that Secure Communities had resulted in the wrongful detention of American citizens.

We are also concerned that ICE has materially contributed to acceptance of the report’s findings through its silence. As we discuss at greater length in our conclusion, silence and the failure to routinely provide timely and accurate statistical information inevitably lead to misunderstandings, erroneous conclusions and assertions, and vacillation within the agency itself, and ICE in no small measure has only itself to blame for much of the public confusion and controversy surrounding its programs, most especially Secure Communities.

Because of our concerns, we decided to prepare a series of papers discussing the analyses and findings made in the report. We have divided them into three broad areas, consistent with what we perceive as the report’s main themes:

  • Analyses and findings relating to arrests of United States citizens;
  • Analyses and findings relating to allegations of racial or ethnic bias; and
  • Analyses and findings relating to due process and the operation of the rule of law, versus perceived flaws in the program.

This is the first of those papers, focusing specifically on the report’s assertions with regard to the arrests of U.S. citizens.

Background

Analyzing a program based on its statistical output can be a risky business at the best of times. It’s not for the faint of heart, weak of mind, or emotionally committed, because done properly it requires a complete knowledge of the program’s operations and parameters, a clear understanding of the dataset, and sufficient dispassion to be able to see what’s there and what’s not there, instead of what you want to see. Flubbing the analysis doesn’t just potentially do an injustice to the program being studied; it also puts the analysts’ credibility and reputations at risk.

The risks are doubled when the program involves criminal justice or immigration law enforcement because the datasets are incredibly complex and the subject matters are inherently controversial, touching as they do on sensitive matters of race, ethnicity, nationality, arrest, imprisonment, and deportation. A program such as ICE’s Secure Communities initiative ups the ante even further, sitting as it does at the intersection between public safety and immigration policy.

These are the thoughts that crossed our minds when we finished reading the SCBTN report.

It is significant that the FBI is so loath to have its data misinterpreted that it routinely posts a caveat on its Uniform Crime Report products.4 Federal immigration officials in the Department of Homeland Security might do well to emulate that practice if the SCBTN report is indicative of the analysis one can expect from private entities.

One would think that institutions of higher learning — particularly those devoted to the study of law — would tread carefully before wading into such deep waters, for obvious reasons: academics who produce questionable analyses open themselves up to doubts about whether they failed to comprehend the statistics they studied or, alternatively, skewed the data to fit their preconceptions.

The Data

The data we discuss were originally obtained through a Freedom of Information Act (FOIA) lawsuit brought by three advocacy groups that have been active and outspoken in opposition to Secure Communities: the National Day Laborer Organizing Network, the Center for Constitutional Rights, and the Cardozo School of Law’s Immigration Justice Clinic (Peter Markowitz, the director of this clinic, is one of the SCBTN authors). Under the terms of the partial settlement of the groups’ complaint, ICE was ordered to provide, among other records, a list of individuals identified as “IDENT matches,” or “hits” against the DHS databases by the Secure Communities program between October 1, 2008, and January 31, 2010. ICE assigned an identification number to each individual; no names or other personal identifiers were supplied.

The advocacy groups took a random sample of 1,650 numbers from this list. ICE then extracted detailed case information from its files for 502 of the 1,650 individuals. According to the SCBTN report, there were no ICE records for the other 1,148 individuals. This could be for a variety of reasons, including that the person was a naturalized U.S. citizen or a lawful permanent or temporary resident with no immigration violations or charges, or that the individual was here illegally but was never encountered by ICE. These 502 cases make up the starting database that was provided to the SCBTN researchers. At our request, ICE provided us with a spreadsheet of cases that they say is identical to what was provided to the SCBTN researchers.

This set of 502 cases can best be described as a sub-set of a random sample of all Secure Communities cases, limited to those individuals who now have or previously had an ICE record. It is no longer a random sample of Secure Communities cases, and it cannot be treated as representative of them unless the authors establish that these 502 individuals share the same key characteristics as the 1,148 discarded cases that have no ICE record, and that the parsing of the data did not introduce selection bias. This step is critically important to the credibility of the analysis. Moreover, it is absolutely essential if the authors intended to make any extrapolations to the national level, as they ultimately did. Since they did not take this step, their claims as to the relevance of this dataset to the entire Secure Communities program are questionable, and we are troubled that the authors included no nuanced or cautionary language about the data limitations to temper their broad assertions and conclusions.

Unfortunately, the SCBTN researchers went a step further in parsing the dataset by excluding another 127 of the 502 cases, or 25 percent of the cases, from their analysis. They state in the appendix to their report that they discarded these cases for three reasons: 1) the individuals had prior immigration enforcement actions that pre-dated Secure Communities, meaning in all likelihood that they were recidivists, or serial immigration violators; 2) they were arrested by an agency other than ICE, meaning they were likely the subject of a non-immigration criminal warrant, and discovered as a result of the Secure Communities screening; and 3) there was not enough information in the records provided. The SCBTN authors provide no explanation or justification that this smaller data sub-set is still representative of the entire Secure Communities caseload, or even the smaller set of 502 cases.5

We could not replicate this parsing, but believe it is very likely that discarding these cases skewed the analysis in some significant ways. First, it further reduces the randomness of the sample in even more obvious ways and further strains the credibility of any extrapolations to the national level. For example, the database includes three pieces of information for all 502 cases: country of citizenship, sex, and whether ICE arrested them. The information in the citizenship field was the basis for several key assertions made by the SCBTN authors (on the matter of U.S. citizens and on the ethnicity of arrestees) — yet for some reason the SCBTN authors chose to draw conclusions based on the smaller selected group of 375 records, not the entire set of 502 for whom information is available. This distorts the results in a significant way, as we will describe below.

Second, we can think of no proper reason to have excluded the cases of aliens with prior immigration violations or who were wanted by other law enforcement agencies. The former cases are (appropriately) a high priority for ICE enforcement action and the latter cases would include some of the most serious criminal alien offenders. These are exactly the type of individuals that the Secure Communities program was created to help identify for ICE. We think it is possible, perhaps even likely, that the exclusion of these cases might have artificially inflated the proportion of lesser offenders in the sample, leading to inaccurate conclusions about who is being targeted for enforcement action by ICE, and thus stoking further controversy.

We could not determine what kind of missing information was considered by the SCBTN authors to be critical and which was not. We found that most of the records had information in most of the data fields. But for these and other reasons — including some concerns we have about the integrity of the data on ICE’s end — all of the analyses and conclusions of the SCBTN authors must be viewed with the utmost caution.
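
One straightforward way to probe for the selection bias we describe would have been to compare the 375 retained cases with the 127 discarded ones on the three fields recorded for every one of the 502 cases: country of citizenship, sex, and whether ICE arrested the individual. The sketch below is our own illustration of such a check, not anything the SCBTN authors did; the file layout, sheet, and column names in it are hypothetical stand-ins for the actual spreadsheet.

    # Illustrative only: a simple representativeness check comparing the cases the
    # SCBTN authors kept against those they discarded, on fields recorded for all
    # 502 cases. File and column names are hypothetical.
    import pandas as pd
    from scipy.stats import chi2_contingency

    cases = pd.read_csv("all_502_cases.csv")      # plaintiff_id, citizenship, sex, arrested_by_ice
    kept_ids = set(pd.read_csv("retained_375_ids.csv")["plaintiff_id"])
    cases["kept"] = cases["plaintiff_id"].isin(kept_ids)

    # For each field, test whether its distribution differs between the 375 kept
    # cases and the 127 discarded ones. Sparse categories (e.g., only 6 U.S.
    # citizens) would need to be grouped, or an exact test used, but a small
    # p-value here would be a warning that the parsing skewed the sample.
    for field in ["citizenship", "sex", "arrested_by_ice"]:
        table = pd.crosstab(cases["kept"], cases[field])
        chi2, p, dof, _ = chi2_contingency(table)
        print(f"{field}: chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")

Absent some check of this kind, there is no basis for treating the 375 analyzed cases as a stand-in for the full caseload.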

However, we recognize that despite the limitations we have outlined, this database of 502 ICE cases is worth examining and might provide insight into how ICE has managed the Secure Communities program. So, for the moment, let us set aside the larger data issues and consider the information about these ICE cases on its face. What about those U.S. citizens? We found that not only did the authors commit serious methodological errors and exaggerate the relevance of this data to the national picture, they also misrepresented the raw information that they had available to analyze.

Arrests of United States Citizens

The report makes these assertions:

  • “Approximately 3,600 U.S. citizens have been arrested by ICE through the Secure Communities program.”6
  • “One of the most disturbing findings in our research is that 1.6 percent of cases we analyzed were U.S. citizens … . We find that approximately 3,600 U.S. citizens have been apprehended by ICE [through the Secure Communities program].”7
  • “On its Secure Communities website, ICE acknowledges that there might be IDENT matches, or hits, for U.S. citizens for a number of reasons, including that naturalization data has not been updated in its databases. ICE has never published any data indicating the number or percentage of citizens who have been apprehended through Secure Communities. If Secure Communities were working properly, U.S. citizen hits should never result in the apprehension of such individuals for deportation because U.S. citizens cannot be deported.”8 (emphasis in original)

Our review of the SCBTN report relating to United States citizens is split into two parts: 1) the finding that United States citizens contained in the dataset were placed into removal (deportation) proceedings; and 2) the more broad assertion that about 3,600 United States citizens have been arrested as the result of Secure Communities.

Five United States Citizens were Placed into Deportation Proceedings. Pages 4 and 5 of the SCBTN report contain this statement: “One of the U.S. citizens in our dataset appears to have been arrested on a criminal, not on an immigration, charge which indicates that all U.S. citizen apprehensions are not necessarily unlawful. However, the best available data on the remaining five U.S. citizens apprehended in our dataset suggests that they were wrongfully apprehended for civil deportation.” (emphasis in original)

One is left with the very clear impression that five out of six United States citizens were arrested and placed into removal (deportation) proceedings as the result of Secure Communities fingerprint matches. To see whether this was true, we looked at the relevant portions of the dataset. First, we examined the tab on the spreadsheet labeled Item No. 15, which contains the countries of nationality. Table 1 shows an extract of that tab, which indeed shows six United States citizens.

Table 1: Extract from “Citizenship” Tab, Isolating U.S. Citizens Identified in the Dataset

Next, we examined the spreadsheet tab labeled Item No. 3, which contains a Yes/No column entitled “Arrested or Booked by ICE.” Table 2 below shows an extract of that tab. Once again, we see the same individuals, as identified by their Plaintiff ID numbers. So far so good.

Table 2: Extract from “Arrested or Booked by ICE” Tab, Isolating U.S. Citizens Identified in the Dataset

Then we looked at the additional relevant tabs to determine whether these individuals were in fact placed into removal proceedings (this cross-check is illustrated after the list below), and this is where the report’s assertions fall apart.

  • We examined the Encounter/Detainer/Apprehension columns tab, labeled Item No. 4, to determine whether ICE placed detainers on these six. They did not.
  • We additionally looked at the tab labeled Item No. 5, the data field that reflects the date that a legal charging document in deportation proceedings was prepared. None of the six are reflected in that tab.
  • We moved to the tab labeled Item No. 6, the data field that reflects the date that such a charging document was issued. Again, none of the six are reflected in that tab.
  • We proceeded to the tab labeled Item No. 7, the data field that reflects the date that a charging document in deportation proceedings was served on the individual. None of the six are reflected in that tab.
  • Finally, we looked at the tab labeled Item No. 8, the data field that shows what legal offenses under the Immigration and Nationality Act (“INA charge” as reflected in the column) were used in the charging documents. Once again, we see nothing related to the six.
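
For readers who wish to repeat this exercise on the posted spreadsheet, the short sketch below shows one way to automate the cross-check. It is illustrative only; the sheet and column names are hypothetical approximations of the tabs described above, not the file’s exact labels.

    # Illustrative cross-check: confirm whether any Plaintiff ID listed as a U.S.
    # citizen on the citizenship tab (Item No. 15) also appears on the detainer or
    # charging-document tabs (Items No. 4 through 8). Sheet and column names here
    # are assumptions.
    import pandas as pd

    sheets = pd.read_excel("sc-by-numbers-dataset.xls", sheet_name=None)  # dict of DataFrames

    citizenship = sheets["Item No. 15"]
    us_ids = set(citizenship.loc[citizenship["Citizenship"] == "UNITED STATES", "Plaintiff ID"])

    for name in ["Item No. 4", "Item No. 5", "Item No. 6", "Item No. 7", "Item No. 8"]:
        overlap = us_ids & set(sheets[name]["Plaintiff ID"])
        print(name, "->", sorted(overlap) if overlap else "no U.S.-citizen IDs present")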

Our analysis makes clear — contrary to the SCBTN report — that none of the six United States citizens taken into custody was placed into removal proceedings. The assertion is demonstrably false, and should have been evident to the report’s authors based solely on the data available to them. In fact, it apparently was evident to the authors, because they mention in an end note (No. 41) that none of the U.S. citizens were issued ICE detainers. Given their assertions that U.S. citizens were targets of ICE, it is puzzling to us why this key piece of information was relegated to an end note rather than reported in the narrative.

A more reasonable assumption is that, like the one individual pointed out in the SCBTN report, the other five were also arrested for criminal violations — after all, it is important to remember that DHS officials who enforce administrative deportation laws are also charged with enforcing a variety of criminal laws relating to both immigration and customs. These include crimes such as human smuggling, trafficking, and immigration fraud, which are often committed by U.S. citizens as well as aliens.

But we were curious as to the disposition of these six, so we simply asked ICE to tell us the details and circumstances surrounding their arrests. We were disappointed that the agency declined to provide the information on the record. Off the record, however, a source at ICE who is familiar with the records but not authorized to speak about them told us the following:

  • One U.S. citizen was taken into custody by Customs and Border Protection (CBP) inspectors reentering the country at a border crossing, after they discovered an outstanding felony warrant for his arrest.
  • Another U.S. citizen was taken into custody by CBP inspectors after they discovered him attempting to smuggle marijuana into the country — he was later turned over to local law enforcement officers for prosecution on the offense.
  • Two naturalized U.S. citizens serving time in a state penal institution were screened based on their foreign place of birth — once their naturalizations were confirmed, no further actions were taken.
  • Two U.S. citizens were taken into custody by CBP officers and charged with the criminal violation of alien smuggling.

We have three comments to make:

First, we are surprised that agency officials declined official comment; we believe it is in their own interest, as well as in the public interest, to set the record straight. Doing so officially, however, is beyond our purview.

Second, because the information was provided to us off-the-record, we cannot vouch for its accuracy — although we firmly stand by our prior observation that the SCBTN report is wrong on its face to allege that U.S. citizens had been placed into deportation proceedings, because that is evident even without reference to this “off-the-record” information.

Third, if accurate, the circumstances of these cases raise questions of data integrity. As shown in Table 2, the dataset provided to the authors of the report shows, apparently erroneously, that these six were “arrested / booked by ICE.” That cannot be accurate in light of the case details given to us off-the-record. We are obliged to ask why non-ICE officers are able to input data into official systems showing these apprehensions as having been part of the Secure Communities program, when apparently they were not. There may be a good explanation, or it may simply be a data entry error. But the anomaly should serve as a warning that there may be other problems with the data, and analysts should interpret the results with caution and report them with appropriate caveats. The SCBTN authors either did not notice or chose to ignore these warning flags.

3,600 United States Citizens Have Been Arrested by ICE Through the Secure Communities Program. The authors of the report arrived at this broad assertion through simple math: divide the number of United States citizens found in the nationalities tab of the dataset (6) by the number of cases in their twice-reduced sample pool (375) to obtain a proportion (6÷375 = .016), and then apply that proportion to all ICE apprehensions ascribed to the Secure Communities program.

As mentioned in the data section above, there are serious problems with both the methodology and the conclusions they draw. First, we doubt that the sample pool is large enough to reach any over-arching conclusions about the number of United States citizens apprehended over the course of the program to date. And, as already discussed, it is very unlikely that the small sample that remained after the removal of so many cases is representative. It is obvious to us that the set of 375 cases exaggerates the proportion of citizens in the Secure Communities caseload. The original dataset of 502 cases is, in our estimation, more likely to approximate a random sample. The database includes citizenship information on all 502 cases; six of these were U.S. citizens. None of the 127 cases that the SCBTN authors threw out of the sample was a U.S. citizen. Therefore, the authors of SCBTN should have used 502 as the denominator in the above calculation, not 375. This produces a quotient of .012, or a U.S. citizen percentage of 1.2 percent, not 1.6 percent.
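
Laid out explicitly, the arithmetic looks like this. The sketch uses only figures already cited (6 U.S. citizens, the authors’ 375-case subset, and the full 502-case extract) and makes no assumption about the national total the authors multiplied by, since whatever that total is, changing the denominator shrinks the extrapolated count by the same factor.

    # The two competing proportions, using only numbers from the dataset and report.
    us_citizens = 6
    authors_denominator = 375      # cases the SCBTN authors retained
    full_extract = 502             # all cases with citizenship recorded

    p_authors = us_citizens / authors_denominator   # 0.016 -> the report's 1.6 percent
    p_full = us_citizens / full_extract             # ~0.012 -> 1.2 percent

    # Whatever national total N the report multiplied by, the extrapolated count of
    # U.S. citizens falls by the ratio of the two proportions (375/502, about 0.75).
    print(f"{p_authors:.3f} vs. {p_full:.3f}; extrapolated counts shrink by {p_full / p_authors:.2f}")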

But, assuming solely for the sake of argument that the figure is both representative and true, there is nothing illegal or unethical about those apprehensions if they don’t involve placing citizens into removal proceedings — a quantum leap the authors obviously wish us to take, with no factual foundation. As has been discussed, and as the authors tacitly acknowledge, federal officers in both ICE and CBP (both of which are massive organizations) enforce a whole host of criminal laws that apply equally to citizens and aliens.

It is also important to point out that the authors are extrapolating results from a tiny sample of 502 cases to a national Secure Communities caseload of more than 157,000. We question whether this sub-set of a data pool was large enough for the authors to confidently justify the extrapolation, which they made without any kind of caveat or cautionary note. However, we cannot be certain one way or the other unless and until ICE comes forward to provide us with the facts. We strongly encourage them to do so.
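
To illustrate how much sampling uncertainty surrounds an estimate of six cases in 502, the sketch below computes a standard 95 percent (Wilson score) confidence interval for the proportion and scales it to the 157,000-case figure mentioned above. This is our illustration, not the report’s method, and it assumes the 502 cases were a simple random sample, which, as we have argued, is itself doubtful.

    # Illustrative only: the margin of uncertainty around 6 U.S. citizens observed
    # in 502 cases, assuming (generously) a simple random sample.
    import math

    successes, n, total_caseload = 6, 502, 157_000
    z = 1.96                                   # ~95 percent confidence
    p = successes / n

    denom = 1 + z**2 / n                       # Wilson score interval
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    low, high = center - half, center + half

    print(f"point estimate {p:.2%}; 95% interval {low:.2%} to {high:.2%}")
    print(f"scaled to {total_caseload:,} cases: roughly {low * total_caseload:,.0f} to {high * total_caseload:,.0f}")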

Conclusion

Observations About the Warren Institute and Benjamin Cardozo Law School Authors of SCBTN. It is pertinent to note that both the Warren Institute and the Cardozo Law School’s Immigration Justice Clinic are co-partners with the National Day Laborer Organizing Network, among other organizations, in seeking to have the Secure Communities program shut down, whether through politics, the courts, public opinion or influence, or otherwise.9 We believe such a goal, and the philosophical underpinnings that drive it, should have been made plain by the authors, and should also have been mentioned by the news media that reported on the findings, so that the public would be aware of their leanings. Looked at in this light, publication of the SCBTN report appears to us to be simply one more tool in this coalition’s effort to defeat Secure Communities, and not a dispassionate, objective scholarly exercise.

Passion toward one’s work, and the things one believes in, is generally an excellent quality. It is the foundation for many kinds of philanthropic work, and it usually provides the inspiration to excel at the things we are passionate about. It can also, however, instill a negative bias in those who feel passion about a subject when they are confronted with things that do not fit their philosophical underpinnings. For this reason, those who engage in research and analysis must work particularly hard to put their preconceptions aside. That did not happen here. The SCBTN report has been inappropriately used as a tool to further the goal of dismantling Secure Communities, at the expense of dispassion and academic rigor.

The flaws in the SCBTN report go beyond simple, technical inaccuracies. They cut to the heart of the report, and are so substantive — including inaccurate assertions alleging the placement of U.S. citizens into deportation proceedings — as to cast virtually all of it into doubt.

We call upon the authors to amend and clarify the report, and to retract statements, whether broad-based or specific, that cannot be substantiated with specific, detailed evidence.

Observations about ICE. Data is the lifeblood of an agency.

  • Leaders use it to chart their course, and to determine progress of the agency toward its strategic goals and objectives;
  • The agency’s departmental watchdogs and congressional overseers use it to make strategic decisions about direction and to assess fiscal and human resource needs; and
  • Researchers and interested members of the public use it to assess the agency’s efficiency and accountability.

For all of these reasons, openness and data integrity are paramount. There may be many things about which we and the report’s authors disagree, but this is not one of them. Where Secure Communities is concerned, we believe that ICE has materially contributed to acceptance of the report’s findings, even where they may be misleading, through its silence. This is part and parcel of a developing pattern in the agency, which seems to prefer operating in the shadows rather than the sunshine. Silence and failure to routinely provide timely and accurate statistical information inevitably lead to misunderstandings, erroneous conclusions and assertions, and vacillation within the agency itself. Thus ICE in no small measure has only itself to blame for much of the public confusion and controversy surrounding its programs, most especially including Secure Communities. This is both unfortunate and unnecessary. It also raises questions as to whether the agency leadership is fully committed to the program and its objectives.

Every now and again our leaders — usually those embroiled in dissension — assert that if they and the agencies they represent are displeasing both the left and the right, it must mean that they’re doing something right. In a speech last month, DHS Secretary Janet Napolitano said, “Not surprisingly, our policies have been simultaneously described as engaging in a mean-spirited effort to blindly deport record numbers of illegal immigrants from the country and, alternatively, as comprehensive amnesty that ignores our responsibility to enforce the immigration laws … . Two opposites can’t simultaneously be true.”10 Our take on that old saw is: Perhaps. Sometimes. And sometimes it may simply mean that they are so fundamentally tone-deaf to the rights and expectations of the American people to public accountability and government-in-the-sunshine that they are failing on all fronts.

We understand and respect the notion that some information is law enforcement sensitive — generally speaking, the kinds of things that reveal law enforcement sources and methods — but statistical abstracts do not fall into this category. Researchers and interest groups should not be forced to resort to protracted and costly lawsuits to obtain access to such data. “Law Enforcement Sensitive” should not be used as an excuse to withhold programmatic information from the public. We believe that the Secure Communities program is a fundamentally sound program, and that release of information and data surrounding its workings will only strengthen public confidence, not undermine it. The question is, does ICE share that view?

Recommendations

  1. As the FBI does with its Uniform Crime Reports, ICE should routinely validate and publicize its Secure Communities data and, when necessary, provide explanatory comments or insert appropriate, carefully worded caveats about the use and limitations of the data. If ICE is incapable of doing so, it should cede that responsibility to the DHS Office of Immigration Statistics, which consistently does a credible job of publishing yearly statistical information of interest to researchers all across the philosophical spectrum.
  2. We call upon ICE to officially provide the specifics surrounding the arrest of the six United States citizens found in the sampling of 375 cases reflected in the report. This can be done without implicating the Privacy Act, since the information can be provided in the abstract, and without resort to biographical or other individual identifying information.
  3. Consistent with our general recommendation in 1) above, we encourage ICE to publish data on the number of U.S. citizens who have been arrested in toto as the result of Secure Communities fingerprint matches, in order to confirm or definitively refute the extrapolation made in the report that “[a]pproximately 3,600 United States citizens have been arrested by ICE through the Secure Communities program.” In our view (and as explained above), the dataset used was too small to make such a broad-based assertion; however, it is ICE’s responsibility to speak affirmatively to the facts.

End Notes

1 Aarti Kohli, Peter L. Markowitz, and Lisa Chavez, “Secure Communities by the Numbers: An Analysis of Demographics and Due Process”, the Chief Justice Earl Warren Institute on Law and Social Policy, Berkeley, Calif., October 2011, http://www.law.berkeley.edu/files/Secure_Communities_by_the_Numbers.pdf.

2 See, for instance, the statements made by Felipe Matos of the organization Presente.org during a television interview with Fox News on October 24, 2011, at http://cis.org/TVInterviews/Camarota-FOX102411-SecureCommunities.

3 Julia Preston, “Latinos Said to Bear Weight of a Deportation Program”, The New York Times, October 18, 2011, http://www.nytimes.com/2011/10/19/us/latinos-said-to-bear-weight-of-depo.... An audio link to the Takeaway story is on the same page.

4 See, e.g., “Crime in the United States 2010”, Criminal Justice Information Services Division, Federal Bureau of Investigation, which cautions in pertinent part, “Each year when Crime in the United States is published, many entities — news media, tourism agencies, and other groups with an interest in crime in our Nation — use reported figures to compile rankings of cities and counties. These rankings, however, are merely a quick choice made by the data user; they provide no insight into the many variables that mold the crime in a particular town, city, county, state, region, or other jurisdiction. Consequently, these rankings lead to simplistic and/or incomplete analyses that often create misleading perceptions adversely affecting cities and counties, along with their residents … . Until data users examine all the variables that affect crime in a town, city, county, state, region, or other jurisdiction, they can make no meaningful comparisons.” (emphasis in original), http://www.fbi.gov/about-us/cjis/ucr/crime-in-the-u.s/2010/crime-in-the-....

5 Incomplete data is a common hazard in policy analysis based on statistics and government records. Analysts have three main options when confronted with data sets that are missing information. They can impute, or allocate, data, as the Census Bureau and others do, which essentially is guessing what the missing values are. They can use a process called “hot decking,” where values are inserted based on values in similar cases in the dataset. Or, they can simply throw out the cases with missing information. Discarding the incomplete cases might have seemed like the best choice for the purposes of this analysis, but once this option was chosen, then the authors cannot claim that the results are representative of the larger picture; the dataset is no longer a random sample. This is especially important considering that in this case the number of records discarded is equal to 25 percent of the original set. And, the authors should provide an explanation of the protocols used to discard cases, or include in the analysis descriptive phrases such as “of the records for which we have information … .”
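
For readers unfamiliar with these approaches, the toy sketch below illustrates the three options on invented data that bears no relation to the ICE dataset: listwise deletion (what the SCBTN authors effectively chose), simple imputation, and a crude random hot deck.

    # Toy illustration of the three missing-data strategies described in this note;
    # the values are invented.
    import numpy as np
    import pandas as pd

    df = pd.DataFrame({"sex": ["M", "F", None, "M", None],
                       "arrested": [1, 0, 1, None, 0]})

    dropped = df.dropna()                                          # 1) discard incomplete cases
    imputed = df.fillna({"arrested": df["arrested"].mode()[0]})    # 2) impute a plausible value

    rng = np.random.default_rng(0)                                 # 3) hot deck: borrow from observed donors
    hot_deck = df.copy()
    missing = hot_deck["sex"].isna()
    hot_deck.loc[missing, "sex"] = rng.choice(hot_deck["sex"].dropna().to_numpy(), size=missing.sum())

    print(f"{len(df)} cases -> {len(dropped)} after listwise deletion")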

6 See the “Key Findings” inset box on p. 2 of the report.

7 See the inset box on p. 4 of the report.

8 See the narrative text immediately under the “U.S. Citizens Apprehended” subtitle on p. 4 of the report.

9 See, for instance, “Rights Groups Call Effects of Shrouded ICE-Local Law Enforcement Collaboration on Communities Insidious”, on the Cardozo Law School’s Immigration Justice Clinic website, at http://www.cardozo.yu.edu/MemberContentDisplay.aspx?ccmd=ContentDisplay&.... See, also, Aarti Kohli and Antonia Hernández, “On Security: A broken promise of ‘smart’ immigration enforcement”, SFGate (San Francisco Chronicle online), October 7, 2009, at http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2009/10/06/EDF01A1TI1.D.... Ms. Kohli is Director of Immigration Policy and Legislative Counsel for the Earl Warren Institute.

10 Janet Napolitano, quoted in Brian Bennett, “Obama administration showing leniency in immigration cases”, Chicago Tribune, November 17, 2011.