Wednesday 30 December 2009
The eCrime of the Decade goes unpunished
Assistant Attorney General Breuer of the Department of Justice said that they “… will not allow computer hackers to rob consumers of their privacy and erode the public's confidence in the security of the marketplace,” adding, “criminals like Albert Gonzalez who operate in the shadows will be caught, exposed and held to account. Indeed, with timely reporting of data breaches and high-tech investigations, even the most sophisticated hacking rings can be uncovered and dismantled, as our prosecutors and agents demonstrated in this case.”
The reality is that the hacking ring has not been broken, and Mr. Gonzalez’s co-conspirators are free to continue their illegal activities. The technological vulnerabilities that allowed the Heartland breach to occur are still prevalent in the global IT infrastructure. Verizon has reported that these vulnerabilities are the fastest-growing exploit vector for cyber-criminals.
It would seem that enterprises and others should realize that, given the prevalence of these vulnerabilities, they are highly likely to be hacked, and should take immediate precautions. Knowing that these vulnerabilities are present gives these enterprises a responsibility and an obligation to protect their customers from the Gonzalezes of the world, especially knowing only a few will ever be caught.
Wednesday 16 December 2009
Drinking "data" securely from Amazon’s Cloud
Regardless of the pricing model for computing power, there is absolutely NO correlation with the level of security provided.
When drawing from any shared pool we need to ensure that all drinkers use only special drinking-straws with filters built in, like those at istraw. Such "virtual cloud straws" are simply filtering firewalls that only permit "clean and safe drink" to pass the lips of the drinker.
Now, a security drinking straw that also runs in the cloud, is flexible, and can be powered on a "pay per filtering" model fits the vision of the cloud. How would it be to provide a virtualized database firewall that runs in the Cloud – filtering out unwanted database accesses and keeping your database from being "sucked dry" or "poisoned"?
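To make the metaphor concrete, here is a minimal sketch of what such a filtering firewall does – strip the literals out of each incoming statement and only let known-good query shapes through. The query shapes and table names here are entirely hypothetical:

```python
import re

# Hypothetical "virtual cloud straw": a filter that only lets known-good
# statement shapes reach the database. Literals are replaced with '?' so
# queries are matched by structure, not by the values they carry.
ALLOWED_SHAPES = {
    "SELECT name, price FROM products WHERE id = ?",
    "INSERT INTO orders (product_id, qty) VALUES (?, ?)",
}

def normalize(sql: str) -> str:
    """Replace string and numeric literals with '?' placeholders."""
    sql = re.sub(r"'(?:[^']|'')*'", "?", sql)   # quoted string literals
    sql = re.sub(r"\b\d+\b", "?", sql)          # bare numbers
    return re.sub(r"\s+", " ", sql).strip()

def permit(sql: str) -> bool:
    """Only 'clean and safe' queries pass the filter."""
    return normalize(sql) in ALLOWED_SHAPES

print(permit("SELECT name, price FROM products WHERE id = 42"))          # True
print(permit("SELECT name, price FROM products WHERE id = 42 OR 1=1"))   # False
```

The point of matching on structure is that an injected clause necessarily changes the shape of the statement, so it is rejected no matter what values it smuggles in.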
Watch this space.
Wednesday 9 December 2009
What’s ahead for 2010?
Their report is titled "Data Breaches Getting More Sophisticated", but the reality is that the SQL injection attacks that follow the 80:20 rule are the result of really "dumb" application development compounded by lax security and missing defenses. The headline should really read "Data Defenses Must Get More Sophisticated".
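To show just how "dumb" the underlying development really is – and how little sophistication the fix requires – here is a toy example (table and data invented) contrasting string concatenation with a parameterized query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "nobody' OR '1'='1"  # classic injection payload

# "Dumb" pattern: the attacker's input becomes part of the SQL itself,
# so the WHERE clause is rewritten to match every row in the table.
unsafe = conn.execute(
    "SELECT secret FROM users WHERE name = '%s'" % user_input).fetchall()

# Defense: a parameterized query treats the input strictly as data.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (user_input,)).fetchall()

print(unsafe)  # [('s3cret',)] – the injection leaked the secret
print(safe)    # [] – no user is literally named "nobody' OR '1'='1"
```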
We are dealing with a quickly evolving threat ecosystem, and companies today need to take measures that assume hackers will enter the network through the very applications they have invested in. What provisions do you have in place to stop information from being identified and stolen? If you can’t answer that question quickly and clearly, you may be in for a difficult 2010.
Tuesday 1 December 2009
Database Security taken seriously at last
With this acquisition, we see database activity and transaction monitoring becoming central to any organization’s security plan. Nick Selby has written a great perspective on why this acquisition marks a change in the overall security landscape and predicts increased quality, better-integrated components, and cross-enterprise security programs. (Note: Secerno is mentioned in the article). In the coming years, not having real-time knowledge of your database’s activities and the ability to block threats will seem antiquated – almost like a company not having a firewall. We welcome this next phase of the security industry – and it’s been a long time coming.
Tuesday 17 November 2009
The T-Mobile “Defense”
The key word in this excuse is "knowledge".
- What did they know about their data and the way it is/was used?
- What did they know about the data leak?
- What do they know that they are not telling us?
- Did they, in fact, have any actual knowledge or did they simply choose not to look?
The UK Information Commissioner, Christopher Graham, is advocating custodial sentences for this type of abuse of personal data. Until there is sustained public understanding resulting in political pressure, I doubt this will become a reality in the near term.
Perhaps it is not just T-Mobile who choose not to see – maybe it is us, the people, who let our personal information float freely, without truly understanding how it is used.
Wednesday 11 November 2009
SQL injection sees a big payout
In the Spring of 2008, fully automated SQL injection attacks were spreading rapidly – but the reports were focused on the visible outcomes and listed them as “i-frame attacks” rather than their root-cause of a database attack. At the time we warned that SQL injection attacks were both increasing and becoming more severe, moving to attacks whose purpose was to serve as much malicious code on as many web sites as possible. In the few months between that time and November 2008, the attacks moved beyond proof of concept and annoyance hacks to direct database manipulation and fraud. One year later, our call to action remains the same: all companies need to address the vulnerabilities the web environment poses to their databases. We recommend additional security precautions be added, so that SQL injection attacks are blocked, ensuring that the database cannot be used directly to mount a costly and embarrassing data breach.
Thursday 5 November 2009
First the telcos …?
We fully expect data protection measures to extend to different business types and industries, but these extensions should be done in a measured, controlled manner. The very worst thing that the EU could do is impose broad, blanket data protection measures that would affect all industries immediately. Historically, these measures (for example Sarbanes-Oxley in the United States) have created compliance costs and headaches that can be as difficult to maneuver as the problems they were intended to solve.
Rather than bemoan the fact that the measures are starting with the telcos, let’s look at this as an important first move that is being done correctly and gives all businesses time to prepare for the inevitable cross-industry data protection measures that will emerge in the coming years.
Tuesday 27 October 2009
When Government is too much like the private sector
Governments may have fallen behind the private sector in moving past the assumption that a network perimeter approach to protection will keep data safe. Attackers can easily bypass known weaknesses on the perimeter and, once in, use various means to capture information. All governments should assume that their information will be under attack at some point – be it from individuals or foreign powers. Given the ease with which these weaknesses are exploited, they need to take measures that protect the data from inside the perimeter. Governments have this protection model in place already, but it is usually reserved for staff or physical assets, and involves additional layers of protection inside the perimeter. The government needs to give its data the same level of consideration.
That the Government departments have temporarily suspended internal access to the internet would suggest that they need to choke off the malware from sending data out. Alas, once a site has reached this state it can be difficult to clean up. Real defense requires preventing the information "misuse" from being established in the first place.
Sunday 25 October 2009
Find a job and lose your identity
One of the key functions of job sites is to act as a trusted intermediary between those with jobs and those without – acting like a matchmaker. Discretion and confidentiality are taken for granted. Perhaps it is time for websites to make it clear to users that the site provides no guaranteed care of the sensitive information users are asked to entrust to it.
In the current job climate, it is careless for organizations to put any data at risk, let alone that of their customers – the future employees.
Friday 16 October 2009
Oh no, not again: Data breach phase two
Once criminals have access to an account via an authentication method, they can manipulate the data as though they were a trusted user. Many times, the activity is not caught until well after the breach or theft has occurred because the system is operating under the assumption that it is getting orders from an authorized user. What PayChoice points to is the need to have a granular view of what is going on with data at all points and for all transactions.
With the proper controls in place, PayChoice would have been alerted to suspicious activity – in this case, apparently adding false employees to payroll accounts – and had the ability to block it.
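As a toy illustration of such a granular control – with the threshold, account names, and event names entirely invented – a monitor need only count unusual bursts of sensitive actions per account to raise the alert:

```python
from collections import Counter

# Hypothetical control: count "add employee" events per payroll account
# in one day and flag any account whose burst exceeds a normal baseline.
MAX_NEW_EMPLOYEES_PER_DAY = 3  # invented baseline for illustration

def suspicious_accounts(events):
    """events: list of (account_id, action) tuples for one day."""
    adds = Counter(acct for acct, action in events if action == "add_employee")
    return {acct for acct, n in adds.items() if n > MAX_NEW_EMPLOYEES_PER_DAY}

day = [("acme", "add_employee")] * 5 + [("globex", "add_employee"),
                                        ("acme", "run_payroll")]
print(suspicious_accounts(day))  # {'acme'}
```

A real deployment would of course learn the baseline per account rather than hard-code it, but the principle is the same: watch the transactions themselves, not just the login.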
Wednesday 7 October 2009
The dirty little secret your bank may be hiding
The typical response – reduce access to sensitive data – is difficult to do in the financial services industry, in which access to customer and company information is a necessity to do most jobs. The answer needs to be broader and needs an accompanying change in attitude. Banks, like any organization, should assume that their data is under threat from insiders and should take steps to ensure their protection measures are in line with this thinking. Some examples would be blocking downloads of large amounts of data, stopping downloads during off-hours, and preventing certain types of changes. The technology is there and, unfortunately, today’s threat environment demands this level of protection.
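The examples above amount to simple policy checks. A minimal sketch, with all thresholds and hours invented for illustration:

```python
from datetime import time

# Invented policy thresholds – a real bank would tune these per role.
MAX_ROWS_PER_QUERY = 10_000
BUSINESS_HOURS = (time(8, 0), time(18, 0))

def allow_download(row_count: int, when: time) -> bool:
    """Block very large downloads and any download outside business hours."""
    if row_count > MAX_ROWS_PER_QUERY:
        return False
    start, end = BUSINESS_HOURS
    return start <= when <= end

print(allow_download(500, time(10, 30)))     # True  – normal daytime query
print(allow_download(50_000, time(10, 30)))  # False – bulk export blocked
print(allow_download(500, time(2, 15)))      # False – off-hours blocked
```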
In these tight economic times, organizations must not take extra risks by reducing IT security budgets.
Wednesday 30 September 2009
Are Clouds Compliant?
The questions we worked through were:
Q1. Do you trust the cloud?
Q2. Are clouds compliant?
Q3. Is compliance a barrier to adoption?
Q4. How should we make clouds compliant?
There was online polling of the audience, with clear majority responses as follows: Q1 – No; Q2 – No; and Q3 – Yes.
Question 4 was the most interesting for me. We actually debated two courses of action. First, "Change clouds to accommodate regulation". Alternatively, "Change regulations to accommodate clouds". A cheeky 25% of the audience voted for the latter! This is like raising the speed limit on the roads because it is impossible to stop motorists from speeding. Is this a good idea?
I cynically pointed out that the underlying context is not peculiar to the cloud – it is commonly observed in other computing architectures. IT is a business enabler – and businesses want to make profits. Only once a profit-making system is in place do organizations get concerned about compliance and security issues. Alas, the elasticity and remote nature of cloud infrastructures make retrofitting security devices (e.g. firewalls) nearly impossible. The only way to retrofit security into the cloud is to make the security technology ‘cloud hostable’ and have it inserted seamlessly into the underlying fabric. Perhaps a DataWall for the cloud – watch this space.
Wednesday 23 September 2009
Why hack a database when the data is being given away!
Imagine my horror to learn that "Demon's director of customer service" has emailed 3,681 of their customers and attached the list of user details for the 3,681 customers. This is not an attack to steal data – this is an appalling example of a data leak caused by “human error”.
In the paperless office of an IT-empowered world, this sort of thing is all too easy to do. Imagine “accidentally” stuffing printed client lists into a paper-based mail-out to customers in the good old days of paper systems. Unlikely. Let us wait and see if there is any legal action. The U.K. legal framework lacks the teeth to really bite.
Data security goes beyond defending against malicious attack – it also must defend against well-intentioned fools.
Wednesday 16 September 2009
Data Breaches: The way to a corporation’s data-heart is through their applications-stomach
Last week the CEO of Heartland Payment Systems, Robert Carr, highlighted that it is not just web applications that have the flaws. The breach, which ultimately leaked more than 130 million card numbers from Heartland’s payment systems, was actually initiated through an unrelated corporate application. This, too, was exploited via SQL injection, allowing the attacker to use the database to gain a “position” on the network from which undetectable malware delivered a sniffer that collected card numbers passing through the card payment system.
Heartland had many penetration testers and certified security auditors (including PCI QSAs) constantly crawling all over their systems – even after they had learned of the injection attack. For many months they had been reassured that their card data was still safe. Alas, history tells us that they had a false sense of security – until they went looking for the sniffer, based on lessons learned in the Hannaford Brothers data breach.
Now – like Heartland – the initial claim of RBS (owners of WorldPay) is that no data was leaked in this recent exploit. How long will it be before we learn otherwise?
Wednesday 9 September 2009
What’s in a number?
The issue of reporting and disclosure is hotly contested, oftentimes pitting the rights of individuals against corporations that want to distance themselves from the bad publicity and associated liabilities. The United States, which has seen some of the largest data breaches in history, still does not have a single standard for data breach reporting or regulatory data protection requirements.
We can’t expect companies to willingly disclose data breach information – the consequences are too severe, even though full disclosure would work to their benefit over time. The same focus on transparency that is being heralded in the financial services industry needs to be applied to data breaches, with the primary goals being catching those responsible and informing those affected as soon as possible. This transparency will come, at the very least because certain industries will require it. In the meantime, we take solace in the 71 percent of the Ponemon respondents in France who placed data protection as a critical component of their overall protection plan. These companies are not completely overlooking data protection, but they are playing catch-up (as are most companies these days) with very sophisticated hackers.
French cuisine may be famous for rich sauces. It is clear from this report that they are a rich data source too!
Thursday 3 September 2009
Declaring war on easily attacked applications
In the Sears.com case, it was very poor application design which made the attack possible. Foolishly, the functionality that is meant to protect the web-application was deployed in the least trustworthy of locations: the “customer’s” web browser. On the internet, “customers” and “attackers” are indistinguishable. As Sears.com has no control of the “attacker’s” browser, it has no reason to trust that this inadequate security control mechanism will not be altered, tampered with, or completely disabled.
Further poor practice meant that there was no independent monitoring or enforcement system that prevented – or even raised an alert on – the strange behavior of a massive increase in the use of certain functionality, which allowed a “customer” to “stage a brute-force attack that could grab all valid, active Sears and Kmart gift cards from the company's database.”
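Because the browser cannot be trusted, the enforcement has to live on the server. A minimal sketch of a server-side rate limiter (limits and client identifiers invented) that would have slowed such a brute force to a crawl:

```python
import time
from collections import defaultdict, deque

# Hypothetical server-side guard: the rate limit lives on the server,
# where the attacker cannot alter, tamper with, or disable it.
WINDOW_SECONDS = 60
MAX_LOOKUPS = 5  # invented limit on gift-card lookups per client per window

_history = defaultdict(deque)

def allow_lookup(client_id: str, now: float) -> bool:
    q = _history[client_id]
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()          # drop requests that fell out of the window
    if len(q) >= MAX_LOOKUPS:
        return False         # brute-force pace: deny (and, ideally, alert)
    q.append(now)
    return True

t = time.time()
results = [allow_lookup("203.0.113.9", t + i) for i in range(7)]
print(results)  # first five allowed, the brute-force tail denied
```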
What should organizations do?
1. Raise the level of software engineering so that secure development processes are embedded in all application development.
2. Test, test, test, test, and re-test the application for vulnerabilities. Remember, these are vulnerabilities that are in applications that your developers wrote – not just operating system or platform component vulnerabilities.
3. Put monitoring and enforcement systems in place to fully understand what normal usage patterns are for the application and ensure that real-time policies can prevent unwanted behaviors.
Finally, organizations should make an open declaration of war on poorly produced and operated applications!
Tuesday 18 August 2009
The aftermath of the largest data breach ever
If we look at this breach as a clever group of renegades, we are missing the point. These breaches show the value our financial data holds and how little control we ultimately have over it. Before we get dazzled by the locations, methods and number of credit cards hacked, we should ask why the data was not encrypted or did not have other protection mechanisms in place.
This type of defect is all too prevalent in the low quality IT systems in which we blindly give our trust. We can be sure that the biggest breach is yet to come!
Saturday 15 August 2009
The University Data Breach Blues
The Open Security Foundation, a nonprofit that tracks data breaches, estimates more than 11 million records stored at US colleges and universities have been compromised. Many times, these breaches are not discovered until well after the data is lost. UC Berkeley, for example, found out about this current breach from an alleged hacker’s website.
We have entered a world in which personal data is always at risk from hackers who will grab and sell it for profit. Retailers and financial institutions have felt the pain of protection in this environment, and they have the latest technology as well as compliance measures for protection. What will universities do, since they do not have the same financial resources?
The answer could come in part from compliance guidelines, with government and the private sector working together to suggest best practices and protection measures. Doing so should allow graduates to enter the post-university world with their data -- and credit reports -- uncompromised.
Friday 7 August 2009
Easy-to-use or Easy-to-lose? Health-care call-centers sneezing over our private data
However, call-centers present a significant risk to data privacy. The bulk of the work in a call center is performed by low-paid, low-skilled telephone operatives in an industry where 30% annual staff turnover is considered exceptionally low. One researcher suggests that in the banking sector in Scotland, annual staff turnover is more like 80%. Worse still, police investigations have shown that call centers in some industries are routinely infiltrated by members of criminal gangs whose aim is to get copies of valuable data.
The original article acknowledges the insider and external threats, and states “Agents must have “easy to use” and reliable means to send and receive confidential…” information “… inside the firewall, as well as outside.” We all know that “easy-to-use” systems and “secure” systems rarely go together. With the low skill level and high staff turnover, my guess is that “easy” trumps “secure”. Perhaps we should associate "easy-to-use" with "easy-to-lose" (data).
Do you want your precious health-care record sneezed on and transmitted “un-healthily” around a call center?
Thursday 30 July 2009
Heartland Payment Systems: What went wrong and why we need the Transportation Safety Board for Data
Consider what would have happened if this incident had occurred in the airline industry. We would expect a team of accident experts from the TSB to perform a detailed open enquiry into the matter and come to strong conclusions about what went wrong and why. They would also disseminate all new knowledge and best practice to be mandated throughout the industry.
For a data breach, what we get is a webinar. Simply holding “candid” webinars with the CEOs of companies that have been breached is just not good enough. We need publicly funded and accountable organizations to pick through the burning rubble of yet another data breach and force the industry to improve.
Openness and learning lessons from the mistakes of others is the norm in aviation. Aviation regulations are also much more rigorous and are often written through exceptions. For example Rule Zero: No one may fly. Rule One: ... except if you are a registered compliant airline. Rule Two: ... except …
The data world needs a similar culture and framework. Perhaps we could start with Rule Zero: no organization may hold data. What do you think Rule One should be?
Monday 27 July 2009
Let the Script-Kiddies Loose – One click database hacking
The reality is that, yes, databases are very vulnerable, and system owners from database administrators to network security professionals should be taking action. However, the deployment posture of databases changes their threat landscape – especially for external attack. Very few databases are directly connected to the outside world, as they are nearly always reached via applications. So in this situation the standard 'pre-packaged' exploits can have limited efficacy.
It is well known that the application layers are the worst offenders for holding exploitable bugs and security issues. As each application has been built to do a specific job, each application has its own unique set of security issues. With some access to the application, it does not take much effort for an attacker to get at data they should not see. Web sites are simply applications that provide access to everyone – including external attackers. Databases are also threatened by insiders, often using their own login accounts. Sometimes data is accessed inappropriately simply out of curiosity rather than malice, whilst highly privileged users make accidental mistakes resulting in corrupted data.
So back to my question at the top – “should we panic” about script kiddies attacking our databases? No – but we must not believe our databases are effectively secured. We must take calm and consistent measures to pro-actively defend our databases from poorly written applications, nosy internal users, and the bumbling error-prone high-privileged administrator.
Friday 24 July 2009
Alico: the company is always the last to know
An Alico spokesperson said that the company has yet to determine how the data could have been leaked. This statement and the fact that credit card companies alerted the company to the breach shows how difficult it can be to determine how a breach occurred, even if you know that one did occur.
This breach brings to mind RBS WorldPay and Heartland, in which customers saw fraudulent charges on their credit card bills before the companies realized they had been breached. As Alico looks for the source of the breach, we are also reminded that in this threat environment, personal data is constantly under attack. The link to criminal elements shows that these breaches are done with the express intent to grab personal financial data to be used fraudulently. In this type of situation, Alico is left playing “catch up” without the ability to stop additional damage to its customers, because their data has already been compromised. We hope that the company and all in the industry use this as a lesson as to the importance of knowing the location and status of their data at all times because it will always be an attractive target.
As they say, "an ounce of prevention is worth a pound of cure" -- now is the time to apply preventative measures to protect data.
Wednesday 22 July 2009
The cost of losing customer data: (only) £3 million for HSBC
For all financial services firms especially, this ruling should be given strong consideration. If these types of breaches can occur at the world’s largest bank – ranked by Forbes as the sixth largest business in the world – then they can happen anywhere. Details from the breaches show an alleged environment in which poor employee training in preventing and dealing with identity theft, as well as lax encryption standards, prevailed. These factors are part of “Identity Theft 101,” meaning that even the smallest, most regional bank, mortgage company, or insurance firm would make sure these controls are in place. There is some good news for HSBC in this. By cooperating, they have seen their fine reduced from a potential £4.5 million, savings that can be used for better protection.
The bad news for all of us is that the fine was really insignificant in the grand scheme of things. A few million pounds is still only loose change for these organizations, even in these times.
Thursday 16 July 2009
Why, even after Twitter, the Cloud is safe. Secerno weighs in
For Twitter these include names of senior executives who interviewed for positions at the company and are currently employed elsewhere, earnings projections, new product information, and floor plans.
What is interesting to those of us in the security industry is what the breach initially appears to be – a security failure in the Cloud, and what it really is – an exploit of the password recovery system and other features of Google Apps.
The Cloud is no more or less secure than any other environment, and what happened at Twitter could easily have occurred in a traditional implementation. This breach indicates, at the very least, that traditional password protection practices were not being followed. This is not surprising considering the stress placed on current IT budgets, which results in security updates and practices being delayed. For every organization that holds information that could be deemed embarrassing if made public (so, everyone), Twitter serves as a reminder that open does not mean secure and that protection needs to come from providing the appropriate care at the level of the data itself.
Thursday 9 July 2009
Unification – an important lesson from Ponemon
The report finds a direct correlation between an organization’s likelihood to experience data loss and its lack of a consistent, organizational-wide strategy and enforcement of data protection and encryption policies.
Many of the organizations surveyed indicated that data protection was among their top priorities, but what caused many to fall victim to data loss was the lack of a unified, consistent approach to protecting data across every access point and device in the organization. This fragmented approach will quickly fade over the coming months. Legislation, increasing awareness, and the requirements of standards like PCI-DSS will cause companies that have not undertaken a unified approach to consider it – strongly.
If outside pressures are not enough, then financial ones will be. Recent research by Ponemon found that the average UK data breach costs a total of 1.7 million pounds Sterling; the equivalent of 60 pounds Sterling for every record compromised.
These numbers are too costly for us all. Personally, I don’t care about the cost – I simply want my data held safely!
Wednesday 8 July 2009
Is North Korea really to blame?
The timing would indicate North Korea, which has been increasingly aggressive in its dealings with the United States over the past few weeks. However, it is doubtful that North Korea has the ability to launch this devastating an attack on such a large scale.
The question, then, is who would plot and execute this type of strategic hit at two major world governments, as well as some very well-known companies? The answer might be found in a series of cyber attacks that US and UK government organizations endured in the middle part of this decade.
At the time, both countries were complacent in their security measures, without realizing that their actions were being monitored by entities that launched extremely targeted attacks to penetrate their systems. It took two to three years before the details and those purportedly behind the attacks were revealed outside security circles.
Today’s situation is analogous in that it will take at least that long before we realize the “who” and “why” behind these attacks. Until then, the news reminds us how important and continuous our efforts to protect government and private data must be.
Monday 29 June 2009
The UK's cyberspace initiative
There are also economic considerations that this initiative addresses. In the UK, for example, more than £50 billion is spent online every year and 90% of high street purchases are made using electronic transactions.
There will be much debate as to the validity of the threats and the forms that they will take; however, as members of the security industry, we know that these threats always have the ability to be more devastating and widespread than even popular imagination can dictate.
By placing the protection of "digital Britain" in the hands of the government, we are showing a united front against cyber-criminals, cyber-terrorists and the run of the mill hackers who pose a threat to our information systems and personal data. As we commend the government for taking this bold and necessary step, we would like to remind them of a lesson that industry has learned over the past few years: threats come from internal and external sources. So, a "defend the perimeter" approach will leave valuable assets unprotected.
The government should look at the threat matrix holistically, starting from the databases that hold information, to the individuals that access it, through the networks that carry the data, to the perimeter. This "ground-up" approach will ensure that we are well protected at every turn.
Tuesday 16 June 2009
View from the Tower
Today, the TowerGroup has suggested the financial services industry stands on the losing side of the battle to protect consumer data. TowerGroup analyst George Tubin believes that the majority of data within financial services institutions has been or will be compromised, because proper data protection measures continue to be overlooked. With Heartland, RBS WorldPay, Checkfree and BNY Mellon Shareowner Services making headlines with major breaches in recent months, it all suggests the industry needs to make data protection a higher priority.
Consumer anger, embarrassing headlines and the threat of legislative involvement have not stopped data breaches in the financial services industry – nor could they, sadly. In this turbulent economy, the last thing the industry would say it needs is legislative action or another protection standard to contend with, but it should take the past four months as a very serious wake-up call if it is to avoid these outcomes. These companies need to re-evaluate how they protect and store data. With each breach, mandatory legislation moves a step closer. The irony is that, as we have seen with PCI-DSS, these standards bring more cost and headache than protection. The industry cannot afford this on many levels. Any financial services firm that is not evaluating its data protection measures with a forward-looking plan in place, therefore, brings the industry closer to a mandatory protection standard.
Thursday 11 June 2009
Careless Talk - Part 2
Earlier this week, the Internet buzzed with rumors about a hack at T-Mobile when the alleged hacker posted information on the security forum Full Disclosure. T-Mobile has now confirmed that the posted information is from one of its documents, but it denies that the information was obtained through a hack and says that no customer information was compromised. This is great news for the company. It's even better news for their customers. But it also points to the most common threat to an organization’s data: the corporate insider.
We have no knowledge of how this information was obtained at T-Mobile, but in an industry that has many employees, contractors, third-party suppliers and partners all with access to a wealth of customer data, it should be no surprise that an insider is very likely involved. It was predicted. Telecommunications service providers have long taken the “defend the edge” approach to security, with a focus on keeping threats off the network. This makes it more difficult to monitor and block an insider from accessing information. All carriers should assume that their data is under scrutiny from the inside as well as the outside, and take this week’s happenings as a call to action.