
Texas Retirement Agency Portal Breach Affects 1.25 Million

Coding Error Allowed Some Logged-In Members to View Others' Information

A coding error in a portal of the Employees Retirement System of Texas inadvertently allowed some logged-in users to view other members' information, potentially exposing data on nearly 1.25 million of its members.


In a statement posted on its website, ERS, which administers retirement benefits, including health insurance, for state workers, says that on Aug. 17, it learned about a security issue involving its password-protected ERS OnLine portal that allowed "some, but not all," ERS members to see some other members' or certain beneficiaries' information.

Prior to the flaw being recently corrected, "if a member went to the specific function and modified the search, they might have been able to see the first and last names, Social Security numbers and ERS member identification numbers - known as EmplIDs - for a limited group of members," the statement says.
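
ERS has not released technical details, but the behavior it describes, a logged-in user modifying a search and retrieving another member's record, matches a well-known class of web flaw known as broken access control or insecure direct object reference: the server trusts a client-supplied identifier instead of checking it against the authenticated session. The following is a minimal, hypothetical sketch in Python using Flask; the endpoint paths and the lookup_member helper are illustrative and do not reflect ERS's actual code.

# Hypothetical sketch (not ERS's actual code) of the broken access control
# pattern described above, and one common way to close it.
from flask import Flask, abort, jsonify, request, session

app = Flask(__name__)
app.secret_key = "placeholder-for-illustration-only"


def lookup_member(empl_id):
    """Stand-in for a database query that returns a member record."""
    return {"empl_id": empl_id, "name": "...", "ssn": "..."}


# Vulnerable pattern: any logged-in user who edits the query string
# (e.g., ?empl_id=1234567) receives that member's record back.
@app.route("/benefits/search")
def search_vulnerable():
    empl_id = request.args.get("empl_id")
    return jsonify(lookup_member(empl_id))


# Safer pattern: ignore client-supplied identifiers and resolve the record
# from the authenticated session, or verify ownership on the server side.
@app.route("/benefits/mine")
def my_record():
    empl_id = session.get("empl_id")
    if empl_id is None:
        abort(401)
    return jsonify(lookup_member(empl_id))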

"I've seen a significant degradation of programming rigor over the past 10 to 15 years in many companies."
—Rebecca Herold, The Privacy Professor

"Based on our thorough investigation of the incident, it is very unlikely that most members' information was accessed, and we have no reason to believe that any information was used for fraudulent purposes," the statement says.

Nevertheless, ERS on Oct. 15 reported the incident to the U.S. Department of Health and Human Services as an "unauthorized access/disclosure" health data breach impacting nearly 1.25 million individuals, according to HHS' Office for Civil Rights' HIPAA Breach Reporting Tool website.

Commonly called the "wall of shame," the website lists health data breaches impacting 500 or more individuals. As of Oct. 24, the ERS incident was the second largest health data breach added to the tally so far this year.

Six Largest Health Data Breaches So Far in 2018

Breached Entity                                  Individuals Affected
Iowa Health System/UnityPoint Health             1.4 million
Employees Retirement System of Texas             1.25 million
California Dept. of Developmental Services       582,000
MSK Group                                        566,000
LifeBridge Health                                538,000
Health Management Concepts                       502,000
Source: U.S. Department of Health and Human Services

The largest breach listed this year is a hacking/IT incident impacting 1.4 million individuals reported on July 30 by Iowa Health System, which does business under the name UnityPoint Health.

Common Culprit?

Other incidents involving programming or coding mistakes that enabled individuals to view others' protected health information - whether via search engines, web portals or due to misconfigured servers - litter the wall of shame.

Those include an incident reported to OCR in June 2017 by the University of Iowa Hospitals and Clinics involving health data that was accidentally exposed on an application development website for about two years.

Some of these programming error mishaps have resulted in enforcement actions by OCR. For example, in 2016, the agency smacked California-based St. Joseph Health System with a $2.14 million penalty after investigating a 2012 breach that left PHI of nearly 32,000 individuals exposed to internet searches for more than a year.

Even when smaller scope incidents involving programming mistakes don't result in enforcement actions or major breach reports to regulators, they can lead to embarrassment and undermine public trust.

For instance, the rocky technical rollout of Obamacare's Healthcare.gov website in October 2013 was made even worse by additional headlines about a glitch that allowed a North Carolina consumer to access personal information of a South Carolina man.

Cause for Concern

The ERS portal incident could have been worse had the members' information been exposed to the public on the web and accessible via search engines.

Nonetheless, the incident is still a cause for concern, considering how many ERS members' records potentially could have been viewed by others.

"The 1.25 million members could have used - by guessing, based on member ID formatting, etc. - the [information of] other 'certain members' who had this specific function," says Rebecca Herold, president of Simbus, a privacy and cloud security services firm, and CEO of The Privacy Professor consultancy. "This information could, of course, be used for a wide variety of identity fraud and theft."

In its statement, ERS says that although it "has no reason to believe that any information inappropriately viewed by members was used for fraudulent purposes," it's providing free identity restoration services through Experian to members and beneficiaries who might have been affected.

ERS did not immediately respond to an Information Security Media Group request for comment on the incident.

Programming Problems

So why do these coding mistakes putting PHI at risk happen in the first place, and how can they be avoided - or at least detected sooner?

Some of the problems are rooted in subpar programming skills, as well as vendor issues, Herold says.

"I've seen a significant degradation of programming rigor over the past 10 to 15 years in many companies - especially in contractor businesses who provide programmers, often at rates as low as $9 per hour," Herold says. "What I've also seen in many outsourcing programming businesses, located not only in the U.S., but also in other countries such as India, Singapore, Philippines, Canada and Mexico, is a complete lack of effective change control policies and procedures."

Many of these organizations, as well as in-house programmers, are making coding changes directly within production code and not thoroughly testing the changes to ensure they work as intended, Herold says. "And typically the changes are never tested to see any data security impacts," she adds.

"It truly is scary to think about how these sloppy, unsecure programming practices are regularly occurring in many to most of the online sites hundreds of millions of people use each day for critical financial as well as healthcare purposes."

"I do think there are undiscovered web app bugs in the healthcare industry."
—Kate Borten, The Marblehead Group

Kate Borten, president of privacy and security consulting firm The Marblehead Group, offers a similar perspective, adding that coding problems putting PHI at risk are likely more widespread than many realize.

"I do think there are undiscovered web app bugs in the healthcare industry," she says. "Despite training more college graduates in the field of cybersecurity, most developers are self-taught when it comes to security. The good news is that there are an increasing number of resources available, such as organizations including the Open Web Application Security Project, and products to test web code for well-known flaws."

Taking Action

Organizations can take action to prevent and detect coding mistakes that can lead to breaches.

"With even more diverse and complex systems and programs/applications running on current business networks and the internet, it is more important than ever to follow effective change control processes - especially considering almost every type of program today involves some type of personal information, in some way," Herold says.

Information systems are continuously updated, patched or upgraded, notes privacy attorney David Holtzman, vice president of compliance at security consultancy CynergisTek. "It is critical that there are change management policies and procedures in place when enacting changes in the information system or its environment," he says. "Any organization creating or maintaining sensitive personal information should perform an enterprisewide risk assessment to identify the threats and vulnerabilities to the confidentiality, integrity and availability of the data."

In addition, an important part of any change management process is to carefully monitor information systems to ensure that security safeguards are not impacted when implementing upgrades or modifications to the network, Holtzman notes.

"Penetration testing can also be an effective measure to test the security of the information system after changes have been made to a network or its operating environment. But these measures should be considered the last steps in a well-planned, managed process that includes testing any changes before putting them into production."

Organizations should use a risk assessment to develop a plan of action that prioritizes those areas that pose the highest risk of compromise to the information system, the consultant suggests. "Make it a management imperative in your organizations to follow through on investment and attention to information security," he says.

Other Steps

Several other steps that organizations should take to address potential coding mistakes that lead to security incidents, Herold says, are:

  • Implement data security core concepts, including ensuring that change controls, access controls and other critical information security practices are implemented within websites and applications;
  • Create effective and consistently followed change control procedures, including for changes involving new program code or feature implementations, installations of new code into the existing business environment and fixes to address discovered errors or software bugs;
  • Objectively test the changes before implementing them into production;
  • Log all changes, as well as the dates, times and names of those involved with the changes (a minimal sketch of such a record follows this list);
  • Address "human factors," such as ensuring that individuals are trained and that staff - including those working for contracted third parties - perform changes according to the organization's change control procedures.
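
To make the change-logging item concrete, here is a minimal, hypothetical sketch of what such a record might look like; the field names and values are illustrative and are not drawn from ERS or any vendor's system.

# Hypothetical change-log record: every production change is captured with
# who made it, who approved it, who tested it and when it was deployed.
import json
from dataclasses import asdict, dataclass
from datetime import datetime, timezone


@dataclass
class ChangeRecord:
    change_id: str      # ticket or change-request number
    description: str    # what was changed and why
    author: str         # who wrote the change
    approver: str       # who approved it for production
    tested_by: str      # who ran the objective tests
    deployed_at: str    # ISO-8601 timestamp of the production deploy


def log_change(record: ChangeRecord, path: str = "change_log.jsonl") -> None:
    """Append the record to a simple JSON-lines change log."""
    with open(path, "a", encoding="utf-8") as fh:
        fh.write(json.dumps(asdict(record)) + "\n")


log_change(ChangeRecord(
    change_id="CHG-1042",
    description="Restrict benefits search to the logged-in member's record",
    author="jdoe",
    approver="asmith",
    tested_by="qa-team",
    deployed_at=datetime.now(timezone.utc).isoformat(),
))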

"Technology tools - for instance, encryption, two-factor authentication, etc. - are necessary to support information security and privacy, but they cannot, by themselves, provide effective safeguards for business information, coding or systems changes," Herold says. "Well-written and correct code is necessary for ensuring incidents and breaches do not occur."


About the Author

Marianne Kolbasuk McGee

Executive Editor, HealthcareInfoSecurity, ISMG

McGee is executive editor of Information Security Media Group's HealthcareInfoSecurity.com media site. She has about 30 years of IT journalism experience, with a focus on healthcare information technology issues for more than 15 years. Before joining ISMG in 2012, she was a reporter at InformationWeek magazine and news site and played a lead role in the launch of InformationWeek's healthcare IT media site.



