Gitex 2010 conference: a security awareness and update presentation for all IT professionals. Security is a continuous, skilled process that requires constant updating and refreshing to be effective. Working from the wrong assumptions will lead to disaster...
A.C. Nielsen
Agilent Technologies
Apple
AT&T – Possible
Macrovision
Baker & McKenzie
BBC
Bertelsmann Media
Boeing
Church of Scientology
Cisco Systems
Cox Enterprises
Davis Polk & Wardwell
Deutsche Telekom
Disney
Duracell
Ernst & Young
Fujitsu
Goldman Sachs
Halliburton
HBO & Company
Hilton Hospitality
Road Runner RRWE
Seagate
Sega
Siemens AG
Sony Corporation
Sprint
Sun Microsystems
Symantec
The Hague
Time Warner Telecom
Turner Broadcasting
Ubisoft Entertainment
Unisys
United Nations
Univision
USPS
Viacom
Vodafone
Wells Fargo
Xerox PARC
Hitachi
HP
IBM
Intel
Intuit
Levi Strauss & Co.
Lockheed-Martin Corp
Lucasfilm
Lucent
Lucent Technologies
Matsushita Electric Industrial Co.
McAfee
MetLife
Mitsubishi
Motorola
Northrop Grumman
Novell
Nvidia
O’Melveny & Myers
Oracle Corp
Pepsi Cola
Procter & Gamble
Random House
Raytheon
AFP published this untouched photograph of a Hurricane Katrina evacuee and her debit card. What happened next was no surprise.
LAS VEGAS--Hackers competing in a social engineering contest at the Defcon conference here on Friday were able to trick random employees at 10 major U.S. tech, oil, and retail companies into giving them sensitive information over the phone that could be used in targeted computer attacks on the companies.

"Every single company, if it was a security audit, would have failed," Christopher Hadnagy, operations manager for Offensive Security, a training and penetration testing company, told CNET after the first day of the contest, which wraps up Saturday and targets BP, Shell, Google, Procter & Gamble, Microsoft, Apple, Cisco, Ford, Coke, and Pepsi. "Not one company shut us down, although certain employees within the company did. But we (participants) were able to call right back and get another employee that was more willing to comply."

The organizers declined to offer specific comments about any one of the companies targeted by the contest, or to say which companies fared better or worse than the others, but said they would release a report with aggregated information in a few weeks.

"The point isn't to shame anyone. It's to bring awareness to this attack vector, which is probably the easiest way to hack a corporation today," said Mati Aharoni, lead trainer at Offensive Security. "We really don't want to see anyone get harmed or get in trouble."

Social engineering is a hacking technique that involves simply tricking people into offering up sensitive information, rather than using technical means--such as breaking into computer systems--to get such data. The contest's organizers said companies put a lot of emphasis on buying security software and building technological defenses for their information, but ignore their Achilles' heel: the people who work for them.

"The human resources are the weakest and softest spot of the whole organization," Aharoni said. 
"The most used vector by hackers today is the easiest route, and that's usually the human element."

Each of the 10 contestants was assigned one of the target companies a week or so before the event and allowed to do "passive" Web research to gather intelligence on the target and figure out a plan of attack. They were not allowed to make social engineering calls or use phishing or other online methods to extract this information.

The social engineering contest at Defcon targeted 10 major companies to see how easily a stranger could get information out of them. (Credit: Social-Engineer.org)

At Defcon, the contestants have 25 minutes to make calls and try to get as many bits of information from a predetermined list as they can. The calls are broadcast over a sound system, and the contestant with the most items at the end of the event wins.

Contestants are asked to get "innocuous information" about the corporations, such as what company provides dumpster service, whether the target has a cafeteria, and what browser its employees use, contest organizers said. None of the employees at the companies was asked for or gave out any financial information, credit card details, personal data, or other sensitive information barred from the contest, according to the organizers, whose Web site is dedicated to educating people about the dangers of social engineering.

Only three of the 50 or more employees who answered the calls were skeptical enough to hang up without providing information, and all three were women, Hadnagy said. "One woman said 'this question sounds fishy to me' and hung up within the first 20 seconds," Hadnagy said. 
"We all clapped."

In another case, one hacker got answers to nearly every question on the list of 30 to 40, plus information that wasn't part of the official list, according to Hadnagy.

"People went as far as opening up their e-mail clients, Adobe Reader, and versions of Microsoft Word, clicking on 'Help/About,' and giving the exact version numbers of their software," said Aharoni. "For an attacker, the exact version number would provide a much higher level of success," allowing an attack to be tailored to exploit a vulnerability in that exact program.

The contest made ripples even before it officially began. After hearing about plans for the event, the FS-ISAC (Financial Services Information Sharing and Analysis Center) issued warnings to companies to be alert during Defcon. The contest organizers reached out to the agency and offered to work with it to educate and train people about recognizing and preventing social engineering attempts.

Meanwhile, several agencies in the U.S. federal government have expressed interest in the group's report when it's done, according to Hadnagy, who declined to identify the agencies. "We will share information with law enforcement as they've asked of us," Aharoni said.

Read more: http://news.cnet.com/8301-27080_3-20012290-245.html?tag=mncol;1n#ixzz12VpVSrq2
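The point about exact "Help/About" version numbers can be made concrete: an attacker matches a fingerprinted build against exploits known to work on that build. A minimal sketch in Python; the products, versions, and "exploit database" below are invented for illustration and do not come from the article:

```python
# Hypothetical sketch of why exact version numbers help an attacker:
# an exploit is selected only when the fingerprinted build is known to
# be vulnerable. All entries here are made up for demonstration.
KNOWN_VULNERABLE = {
    ("Adobe Reader", "9.3.2"): "malicious-PDF exploit",
    ("Microsoft Word", "12.0.6504"): "crafted-document exploit",
}

def pick_exploit(product: str, version: str):
    """Return a tailored attack for a fingerprinted version, or None."""
    return KNOWN_VULNERABLE.get((product, version))
```

With an exact version in hand, `pick_exploit("Adobe Reader", "9.3.2")` finds a match, while a patched build returns nothing, which is why a precise version number gives, in Aharoni's words, "a much higher level of success."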
Google Inc. (NASDAQ:GOOG) removed a phony Twilight-related application, using a "kill switch," from every Android phone that had downloaded it. In a talk at the SummerCon event, security researcher Jon Oberheide created a pair of Android applications to show how easy it is to infect a large number of phones running the Android OS. Oberheide used hidden software that turns devices into a "botnet," wrapped in a fake Twilight Eclipse app that promised pictures from the upcoming movie.

Android security lead Rich Cannings used the Android Market's "remote application removal" option, or "kill switch," for the first time to remove the application. Most of the roughly 300 people who downloaded the fake Twilight Eclipse app had already deleted it themselves after failing to find what they were looking for.

What makes Oberheide's case interesting is that he proved Android needs more scrutiny. If the fake Twilight app had actually contained pictures, users would have had no reason to delete it. And if he had not discussed the vulnerability at SummerCon, Google would not have used its "kill switch."
CEOs and the technologists who work for them like to say the applications they rely on--especially the kind custom-written by specialists at banks and investment companies with fortunes behind them--are safe as houses. And they are, if you're talking about houses in Louisiana when the Gulf starts lashing hurricanes and tarballs.

Almost 60 percent of all the applications brought to security testing and risk-analysis company Veracode during the past 18 months couldn't meet minimum standards for acceptable security, even when the criteria were dialed down to accommodate applications that don't pose a great security risk, according to Samskriti King, vice president of product marketing at the company.

Web-based apps carry their own special set of risks. "There are far more people on Web projects because they're often easier to develop; many components are already available, so you can stand up Web applications very easily," King says. "Developer education usually focuses on applications generated and used in one place, but Web applications could touch many places, so a vulnerability in one component could manifest in many places if it's reused." Unfortunately, developers trained on software that's generated and used in one location with a single set of servers often don't understand the precautions needed for Web applications that take code, data, and elements of the interface from many servers, she says.

[ For more background on securing Web-based apps, see 5 Problems with SaaS Security. ]

The typical number of security flaws, especially in legacy or other homegrown software, must be taken into account by cloud-computing service providers, says Thomas Kilbin, CEO of cloud and hosted-server provider Virtacore Systems. 
After all, he says, customers who want on-demand compute capacity don't want to rewrite all their applications just to run in an environment designed to save money and add convenience. "Our customers are taking apps they had running in their back office and moving them to private clouds for the most part," Kilbin says. "They are not developing any apps geared toward working only in a cloud IaaS/SaaS model. We secure these apps via a number of methods: traditional firewalls, app-specific firewalls from Zeus, etc."

Keeping Web-based apps secure can be particularly tough for smaller IT teams. "The cloud model is more threat-rich than the shared hosting model, mainly because in shared hosting the core OS and apps--PHP, Perl, MySQL--are kept updated by the service provider," Kilbin says. "In the cloud, the customer has to keep the core OS updated, along with the application stacks, in addition to their code." Most customers don't have the expertise or the time to do so, Kilbin says.

Some 2,922 applications were examined by Veracode in the past 18 months, with the results detailed in the company's recently released State of Software Security Report: The Intractable Problem of Insecure Software. Some of the applications sent to Veracode for testing come from ISVs or corporate programmers in the last stages of development. Another big chunk comes from developers who have to present certifications or risk analyses before closing a deal with government agencies or heavily regulated industries.

Old App Flaws Revealed Before Web Moves

Increasingly, however, Veracode is testing software that clients have used for a long time or are very confident in, but are now migrating to a cloud or Web-based service environment. 
The requests often come from corporate IT executives who turn out to be wrong in believing that their secure, homegrown applications are either homegrown or secure, especially when the apps are moved into multi-site environments for the first time.

Both commercial and open-source applications failed Veracode's tests more often than homegrown apps--at 65 percent and 58 percent respectively; homegrown applications failed 54 percent of the time, Veracode reports. Software written by outsourcing firms missed the mark an astonishing 93 percent of the time. Even applications used by banks and financial-services companies failed 56 percent of the time on initial submission, though the criteria are tougher for those applications, because problems in those apps would create more havoc than, say, problems in an internally developed server-monitoring application, King says.

Internal developers shouldn't be comparatively complacent, however, King says. Though internal apps are generally assumed to be about 70 percent homegrown code, reuse of code, objects, and procedures is so common that between 30 percent and 70 percent of the code in homegrown applications actually came from commercial software.

Internal developers are also unaccountably unaware of the most common exploits likely to be used against Web-facing applications, resulting in an 80 percent failure rate for Web applications, which are tested against the list of the 10 most common security threats published and publicized by the Open Web Application Security Project (OWASP), King says. "At that point it just comes down to developer education," King says.

Cross-site scripting is the most common security flaw in all the types of software Veracode tests, but it is most noticeable in Web- and cloud-based software, King says. The time it takes to fix problems and get an application to an acceptable level of security, however, has dropped drastically, from 30 to 80 days a year or two ago to only 16 days now, mainly because developers of 
all stripes are putting greater emphasis on security, software quality, and shortening their time to market, King says.

There aren't any shortcuts, but Veracode does have some suggestions for IT teams to counter the most consistent app-security problems:

1. Design apps assuming they'll link cross-site; secure those links and the processes that launch them. Cross-site scripting (XSS) accounts for 51 percent of all vulnerabilities, according to Veracode. Apps written in .NET have an abnormally high number of XSS issues because many .NET controls don't automatically encode data before sending or storing it. Check and encode all points of output. Inadequate or absent output encoding in non-.NET applications also created problems, but these are easy to fix once the sources of unescaped output are identified.

2. Focus your efforts on the greatest source of vulnerabilities. You can assume software from any provider is likely to have vulnerabilities, but put extra QA and security-analysis effort into code from outsourced programming services and ISVs, and into components from either that find their way into homegrown applications.

3. Verify the security of the application itself in a cloud or SaaS environment. Whether the customer or the service provider supplies the application, check it for flaws or vulnerabilities in a realistic cloud/SaaS/shared-resource environment, not just in a workgroup on a LAN. Security in cloud platforms is still evolving, and the skill of writing secure code for them is not yet widespread. Stick extra red flags on this part of your project plan.

4. Location is irrelevant; the new criteria are impact, impact, impact. A printer-management application with a flaw that allows hackers to draft a LaserJet into a bot army can cause headaches. An accounting, customer-data-management, or cashflow-automation app with a backdoor can put you out of business. Use level of risk as a multiplier to determine how important a particular app is to evaluate, and how much time or money you should spend getting it fixed.

5. Don't ignore the basics. OWASP publishes the list of the 10 most common attacks on Web applications, and the 25 most significant security errors that appear in applications are also widely published. They're easy to read and come with extra help to fix or avoid errors already known to everyone who might want to hack your systems.
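The first suggestion, escaping untrusted data at every output point, takes only a few lines to apply. A minimal sketch in Python; the language and helper name are my own choices for illustration, not something the article prescribes:

```python
import html

def render_comment(user_input: str) -> str:
    """Escape untrusted input before embedding it in HTML output.

    html.escape converts <, >, &, and quotes into HTML entities, so a
    script payload is rendered as inert text instead of executing in
    the victim's browser.
    """
    return "<p>" + html.escape(user_input, quote=True) + "</p>"

# A classic XSS probe is neutralized:
print(render_comment('<script>alert(1)</script>'))
# -> <p>&lt;script&gt;alert(1)&lt;/script&gt;</p>
```

The same discipline applies at every point where data leaves the application: HTML bodies, attributes, URLs, and JSON each need the encoding appropriate to that context.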
Windows has grown so complicated that it is harder to secure. These images make the point very well: both are complete maps of the system calls that occur when a web server serves up a single page of HTML with a single picture--the same page and picture on each system. A system call is an opportunity to address memory. A hacker investigates each memory access to see whether it is vulnerable to a buffer-overflow attack, and the developer must do QA on each of these entry points. The more system calls, the greater the potential for vulnerability, and the more effort needed to create secure applications.
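The "more system calls, more attack surface" argument can be made measurable by tallying the distinct syscalls a process issues. A minimal sketch that counts syscall names in strace-style trace lines; the log format and the sample lines are illustrative assumptions, not data from the images:

```python
from collections import Counter

def count_syscalls(trace_lines):
    """Count distinct system calls in strace-style output.

    Each strace line starts with the syscall name followed by '(',
    e.g. 'openat(AT_FDCWD, ...) = 3'. Every distinct name is one more
    entry point an attacker can probe and a developer must QA.
    """
    counts = Counter()
    for line in trace_lines:
        name, sep, _rest = line.partition("(")
        if sep and name.isidentifier():  # skip lines that aren't calls
            counts[name] += 1
    return counts

# Hypothetical trace of a server sending one small page:
sample = [
    'openat(AT_FDCWD, "index.html", O_RDONLY) = 3',
    'read(3, "<html>...", 4096) = 512',
    'write(4, "<html>...", 512) = 512',
    'read(3, "", 4096) = 0',
]
```

Running `count_syscalls(sample)` over a real trace of each web server would quantify the comparison the two images make visually: the longer the list of distinct calls, the larger the surface to audit.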