Channel: Symantec Connect - Blog Entries

Presentation from Houston Endpoint Management User Group Meeting - Jan. 17, 2013


Here is a slide deck from Bryan Rhodes with information on how to get access to the ITMS 7.5 preview code. We will attach other presentations from the meeting once they become available. Thank you.


Enterprise Vault 10.0.3 Released

Symantec Intelligence Report: December 2012


In the December Symantec Intelligence Report, we take a look at the global information-threat landscape. In the last month of the year, the United States confirmed its position as the largest source of spam, at 12.7% worldwide, of phishing scams, at 24.2%, and of viruses attached to email, at 40.9% globally. It is common for the United States to occupy first or second place in these three categories; however, leading all three outright is somewhat unusual.

Looking at other countries, Norway has become the most targeted nation for phishing attacks. Symantec detected that, in December, 1 in every 81.4 emails in that country was a phishing attack. Norway is also considered the second-largest source of attacks, distributing 20.2% of all phishing attacks recorded worldwide. The reasons a nation can post figures like these vary, but it may come down to a well-coordinated phishing campaign on the part of the attackers.

In other news, the sex/dating category dominated spam traffic worldwide with 82.6% of all spam generated during the last month. This category regularly takes first place, ahead of pharmaceutical spam, but the margin between the two is rarely this large. The banking industry was also, once again, the sector most exploited in phishing attacks, accounting for 65% of attacks in December.

Finally, last month more than 80% of all adware was flagged by generic detections. That may not seem particularly interesting, and Symantec generally catches or classifies many of these threats under generic detections, but the percentage is very rarely this high. It suggests that adware creators are not trying anything new or unique, and perhaps simply took a break over the holiday season.

Full report in PDF: Symantec Security Intelligence Report - December

Adobe Privilege Exploitation in 2012


With the new year upon us, it’s time for Arellia’s 2012 analysis of Adobe Security Bulletins and those with privilege exploits. As a refresher from the Introduction on Privilege Exploitation, privilege exploitation is where malicious software takes advantage of the rights of the logged-in user to change the configuration of the local computer. Breakdown of Adobe bulletins:

Bulletins: 28
Vulnerabilities: 125
Bulletins with Privilege Exploitations: 20
Vulnerabilities with Privilege Exploitations: 98
% of Bulletins with Privilege Exploitation: 71.43%
% of Vulnerabilities with Privilege Exploitation: 78.40%
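As a quick sanity check, the two percentages follow directly from the counts above. A minimal Python sketch (the counts are from the table; the script itself is mine):

```python
# Counts taken from the Adobe bulletin table above.
bulletins, vulnerabilities = 28, 125
exploit_bulletins, exploit_vulns = 20, 98

# Percentage of bulletins/vulnerabilities that involve privilege exploitation.
pct_bulletins = round(exploit_bulletins / bulletins * 100, 2)
pct_vulns = round(exploit_vulns / vulnerabilities * 100, 2)

print(pct_bulletins)  # 71.43
print(pct_vulns)      # 78.4
```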

 

Further analysis of the vulnerabilities with privilege exploitation by Adobe software component is as follows:

Software                      Vulnerabilities
Adobe Flash Player            48
Adobe Reader                  25
Adobe Photoshop               7
Adobe Illustrator             6
Adobe Flash Professional      1

Adobe Flash Player and Adobe Reader are two of the most commonly installed applications on desktops and Internet browser plugins. These applications allow for rich user experiences but also contain some of the most commonly exploited privilege vulnerabilities. These vulnerabilities are often exploited on malicious websites or webpages, but can also be exploited by downloading content and running it on your computer. Either way, privilege management can mitigate Adobe vulnerabilities.

Adobe Reader is currently being exploited, as mentioned in the NakedSecurity blog, with an attack that bypasses Adobe’s sandbox. This should be troubling news to all businesses using Adobe, because the exploit works on both Adobe Reader X and Reader XI, the most current version. Application privilege management can assist in further locking down applications against such exploits.

Privilege management can be implemented in one of two ways. First, one could move users from administrator accounts to standard user accounts. This can create additional challenges around applications that require administrator rights, a challenge that can be addressed with privilege elevation using software such as Arellia Application Control Solution. The second and better option, which is also much easier to implement for any user, is to remove privileges from commonly exploited applications, as illustrated in Zero Day Vulnerability Protection with Privilege Management.

Arellia Application Control Solution and Local Security Solution provide application privilege management and user privilege management for securing Microsoft applications against privilege exploitation. Use these as an additional line of defense against common exploits.

About Arellia: Arellia provides solutions for privilege management, application whitelisting, securing local administrator accounts, and compliance remediation. Arellia products are integrated with the Symantec Management Platform and sold through Symantec.

Mozilla Privilege Exploitation in 2012


 

Internet browsers are perhaps among the most exploited applications because of everything they are capable of doing on the Internet. For 2012, Mozilla’s Thunderbird, SeaMonkey, and Firefox had the following security advisory bulletins:

2012
Bulletins: 106
Vulnerabilities: 152
Bulletins with Privilege Exploitations: 53
Vulnerabilities with Privilege Exploitations: 93
% of Bulletins with Privilege Exploitation: 50.0%
% of Vulnerabilities with Privilege Exploitation: 61.2%

 

While this was not the case in 2011, for 2012 all of Mozilla’s Security Advisories applied to Firefox. This means that 93 of the 152 total vulnerabilities, or 61.2%, had privilege exploitations. As we learned from the Introduction on Privilege Exploitation, privilege exploitations are the most dangerous type of vulnerability because they allow changes to the computer’s configuration using the rights of the logged-in user.

This past year Mozilla has steadily rolled out updates every 2-4 weeks for Firefox. This means that if a user does not update to the latest version they could be left vulnerable to 3 or 4 vulnerabilities with privilege exploitations on average. Most businesses understand the risk of not updating to the latest version, so they make sure to constantly update their software.
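The "3 or 4 per missed update" figure can be roughly reproduced with a back-of-the-envelope calculation. The release counts below are my assumption (one release every two weeks, and one every four), not numbers published by Mozilla:

```python
# ~93 privilege-exploitable vulnerabilities spread across a year of releases.
vulns_per_year = 93
releases_if_biweekly = 52 // 2  # 26 releases at a 2-week cadence
releases_if_monthly = 52 // 4   # 13 releases at a 4-week cadence

# Average vulnerabilities a user is exposed to by skipping one update.
low = vulns_per_year / releases_if_biweekly
high = vulns_per_year / releases_if_monthly
print(round(low, 1), round(high, 1))  # 3.6 7.2
```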

As we saw in the Zero Day Vulnerability Protection with Privilege Management blog, zero-day threats are not mitigated by updating to the latest version of software, because the latest version does not yet exist. However, using application privilege management does mitigate zero-day vulnerabilities, because it removes rights and privileges from the application. Using application privilege management is even more secure than running as a standard user.

So that leaves the question: with 93 vulnerabilities with privilege exploitation and the potential for zero-day threats, will you leave your environment exploitable? Or will you use Arellia Application Control Solution to remove application rights from Mozilla Firefox and Arellia Local Security Solution to manage user privileges? The answer is clear; protect your environment today by using Arellia as a privilege manager for applications and users.

About Arellia: Arellia provides solutions for privilege management, application whitelisting, securing local administrator accounts, and compliance remediation. Arellia products are integrated with the Symantec Management Platform and sold through Symantec.

ServiceDesk Incident Resolution Time is Very Slow


Recently, a ServiceDesk 7.1 SP1 user encountered a very perplexing problem. When resolving incidents, the time required for the process viewer to refresh and show that the incident is resolved or closed is normally short (2-3 minutes). However, this user's system was not showing the incident resolution until anywhere from 14 to 40 minutes after the Close Incident task was completed.

Through a SQL Profiler trace and a review of Info-level logs, it was discovered that the issue centered on incident escalations. Further analysis showed that in the SD.IncidentManagement project, in the Initial Diagnosis model, under the Event Configuration tab of the Initial Diagnosis dialog workflow component, in the Escalation Configuration section, there was an escalation named “Scheduled”. It had a decision model with 40 elements in it and ran every 10 minutes. It was designed to handle the scheduled re-opening of incidents that had been put on some form of hold. It was an older process that, for the most part, does not get used, and it was removed in the 7.1 SP2 version of Incident Management.

This is what was slowing things down. In situations where there are large numbers of open incident processes, the escalation ended up running over the top of itself.

The way to bring speeds back up to expected performance is to remove the "Scheduled" escalation from the Escalation Configuration, then save the project and republish.

As always, back up your projects and systems before making these types of changes.

Exclusive Offers and Important Announcements for Microsoft MVPs


 

1. Get Your FREE copy of the Backup Exec 2012 V-Ray Edition

If you are a Microsoft MVP (Most Valuable Professional), you can get a fully licensed copy of the Backup Exec 2012 V-Ray Edition along with 12 months of essential technical support completely FREE.

Powered by V-Ray technology, the Backup Exec 2012 V-Ray Edition gives administrators a wealth of recovery options including the ability to recover at the virtual machine, virtual disk, application, file/folder and granular application data level – all from a single pass backup. It also provides advanced deduplication capabilities and physical to virtual conversions. The V-Ray Edition gives you worry-free backup and recovery for virtual machines and virtualized applications.

Register for your free copy today at www.backupexec.com/MVP!

 

2. Get a sneak peek into the next release of Backup Exec

The Backup Exec team is excited to announce registration for the next beta release of Backup Exec 2010 and 2012! This beta will focus primarily on support for Windows Server 2012, Microsoft applications, and virtualization, including the new version of Microsoft Hyper-V.

If you are interested in testing, validating and actively providing feedback on this new Backup Exec code within your lab or production environment, click here for more information and to access the registration link.

Forward-looking Statements: Any forward-looking indication of plans for products is preliminary and all future release dates are tentative and are subject to change. Any future release of the product or planned modifications to product capability, functionality, or feature are subject to ongoing evaluation by Symantec, and may or may not be implemented and should not be considered firm commitments by Symantec and should not be relied upon in making purchasing decisions.

Info-Tech Ranks Backup Exec a Champion


"Fools learn from experience. I prefer to learn from the experience of others."
- Otto von Bismarck

This year I expect to meet a lot of learned fools. IT managers face the bruising lessons of new rollouts as new operating systems, hypervisors and applications come to market. Bismarck's wait-and-watch alternative means a competitive disadvantage for your company and fading relevance in your job skills. 

Backup Exec is one answer to this dilemma, and today the Info-Tech Research Group endorsed it further by placing Backup Exec in the leading position of their Champion quadrant. We've long heard from customers that a single platform for backup and recovery is far more comforting than several niche specialist tools. Physical and virtual, tape and disk, hypervisor vendor A and vendor B, application v1 or application v2 - BE 2012 handles it all so you can't get high-centered by a mistake.

Info-Tech sums up the challenge in this quote from the report:

Currently more than 35% of organizations manage two or more backup solutions. However, most organizations are not 100% virtualized and more than half (52%) are still using tape. Over 50% of organizations manage more than one hypervisor in their virtual environment, making it critical to understand what vendors support.

As Bismarck implied, smart IT managers watch their colleagues' experience to avoid the pain. Backup Exec protects more Windows environments than anyone else on the planet. Why risk this year's infrastructure migrations on a fragmented cluster of point products?

 

Info-Tech Research Group, Vendor Landscape: Virtual Backup Software, 2013

 


What is the Enterprise Vault Outlook Add-in?


 

One of the things that is mentioned a lot on these forums is the Enterprise Vault Outlook Add-in.  But what is it?  

In a nutshell the Enterprise Vault Outlook Add-in provides a number of abilities to Outlook users: 

Possibly the most common thing it allows you to do is retrieve items that have been archived. The Add-in is closely integrated with Outlook, so when you double-click an archived item, Outlook passes control to the Add-in, which then performs a number of actions that eventually include displaying the retrieved version of the archived item.

It has other functions such as archiving, restoring, deleting, cancelling, PST migration, and searching, plus an Archive Explorer component which allows the archive to be viewed along with its structure.

The Add-in is governed by the Enterprise Vault Desktop Policy associated with the user's mailbox and archive. This policy controls which options are available on the Outlook ribbon (Outlook 2010 and higher) or the Outlook toolbar (Outlook 2007 and lower), and it controls other things like Vault Cache and Virtual Vault, which are also part of the Outlook Add-in.

Until just recently there were two types of Outlook Add-in, a 'full' Outlook Add-in (also known as the DCOM Add-in) and a 'light' Outlook Add-in (also known as the HTTP Add-in).  The Add-ins offered slightly different functionality, with some things not being possible in the HTTP Add-in (such as viewing the Enterprise Vault tab on the properties of a folder in Outlook).  The HTTP Add-in was needed though if you wanted to use Outlook 2010.  There were also different versions of the Add-in for different languages.

With Enterprise Vault 10.0.2 all this has changed and has finally been drawn together into one Add-in. Just one! Now all you need to do is install that one MSI file with any supported version of Outlook, in any supported language.

Enterprise Vault - Getting Started


 

It occurred to me just the other day that my blog has a whole variety of information on it relating to Enterprise Vault, but quite a bit of it is at a deep technical level. There is nothing which covers the ‘basics’, nothing to answer ‘Where do I go to get started?’.

I’m going to correct that in a series of blog posts under the ‘Getting Started’ umbrella. They will have a tag assigned to them, so you can find them quickly. Some of them will also go into a bit of technical detail, but for the most part I will (attempt to) keep them light.

 

So try this link, hopefully it'll show up in search results soon :)

https://www-secure.symantec.com/connect/blog-tags/getting-started

SharePoint Backup using NetBackup SharePoint Agent fails with status 2.



The SharePoint backup fails with status 2 mainly because of configuration issues. There is a checklist that we should adhere to when setting up a SharePoint backup.

1. The NetBackup Client Service must be started by a domain account.

2. The account running the backup should have the following privileges:
- "Replace a process level token"
- "Debug programs"
- "Log on as a service"
- "Allow log on locally"

These are set via Administrative Tools > Local Security Policy > Local Policies > User Rights Assignment.
This needs to be done for all servers in the SharePoint farm, including the SQL back end.

3. The account should have local administrator rights on all servers in the SharePoint farm.

4. .NET Framework 3.5 at minimum for SharePoint 2010, and .NET Framework 2.0 for SharePoint 2007.

5. The client in the policy should be the server running the Central Administration service in the SharePoint farm.

6. In the host properties of the master server, go to Windows Clients > SharePoint and add the SharePoint farm admin account there.
This account is used to launch the SPSWrapper.exe process, which talks to SharePoint.

7. If you are running Windows Server 2008, UAC (User Account Control) should be disabled.

8. In Internet Explorer options, disable “Check for publisher’s certificate revocation” and “Check for server certificate revocation”.

You can also refer to the following Symantec tech note:

http://www.symantec.com/business/support/index?page=content&id=TECH146218
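As an illustration of item 2 in the checklist, here is a Python sketch that checks a backup account's granted user rights against the four required privileges. The `Se*` names are the standard Windows privilege constants behind those policy entries; the function and the sample account data are hypothetical:

```python
# The four user rights from item 2, as Windows privilege constants.
REQUIRED_RIGHTS = {
    "SeAssignPrimaryTokenPrivilege",  # "Replace a process level token"
    "SeDebugPrivilege",               # "Debug programs"
    "SeServiceLogonRight",            # "Log on as a service"
    "SeInteractiveLogonRight",        # "Allow log on locally"
}

def missing_rights(granted):
    """Return the required rights the backup account is still missing."""
    return sorted(REQUIRED_RIGHTS - set(granted))

# A hypothetical account that has only two of the four rights so far.
print(missing_rights(["SeDebugPrivilege", "SeServiceLogonRight"]))
# ['SeAssignPrimaryTokenPrivilege', 'SeInteractiveLogonRight']
```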

Distributed Application Restore Mapping for Restore using SharePoint Agent for NetBackup.


Hey Guys,

In this new post I am going to talk about Distributed Application Restore Mapping. This feature allows us to restore content to any server in the SharePoint farm. Traditionally, the backup images are cataloged under the front-end (FE) server. To enable NetBackup to restore to other servers in the farm, we must specify the mapping of front-end and application servers to back-end servers in the Distributed Application Restore Mapping dialog.

There are two columns in that dialog: “Application Host” and “Component Host”. The Application Host column should contain the names of the front-end and application servers, and the Component Host column contains the SQL back-end server names. If SQL is clustered, the virtual name of the cluster is specified in this column instead.

For example, consider a SharePoint farm with 2 application servers (APP1, APP2), 1 front-end server (FE), and 1 SQL back end (BE). The Distributed Application Restore Mapping for this farm would look like:

Application Host          Component Host
APP1                      BE
APP2                      BE
FE                        BE

This is also essential in the case of a redirected restore of a web application to another farm. In that case, the Distributed Application Restore Mapping should contain the mappings for the alternate farm as well.

This can be set on the master server under NetBackup Management > Host Properties > Master Servers > Distributed Application Restore Mapping.
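Conceptually, the dialog is just a lookup table from application host to component host. A minimal sketch using the example farm's server names (the function itself is hypothetical, not part of NetBackup):

```python
# Distributed Application Restore Mapping from the example farm above.
restore_mapping = {
    "APP1": "BE",
    "APP2": "BE",
    "FE": "BE",
}

def component_host(application_host):
    """Resolve the back-end server holding the content for a farm server."""
    return restore_mapping[application_host]

print(component_host("FE"))  # BE
```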

The Challenges of the IT GRC Maturity Curve


Whenever a company embarks upon its journey along the IT GRC maturity curve it will invariably face the same challenges that any other company does.

However, the move from a compliance centric approach to a risk centric approach can be made that much less daunting if this process of evolution is broken down into smaller, more manageable, chunks.

Generally, the appetite for such a change in approach comes from moving away from being driven by external mandates to being driven by the needs of the business. At the lower end of the curve, the focus is on passing audits, often using manual methodologies, i.e. collecting information in spreadsheets, questionnaires or SharePoint. At the higher end, measurement is more focused on performance and risk as part of continuous improvement.

It’s a path I trod myself some years ago, when a previous employer moved to a more risk-focused approach to IT Security.

The first step at the compliance-centric end is the realisation that it is no longer possible to keep up with regulatory and audit requirements while continuing to use the manual tools at your disposal. These invariably consist of checkbox-driven processes and procedures and manually gathered controls. The growing number of mandates and the need to increase the frequency of assessments quickly reach a point where that methodology is prone to error and distraction and is no longer viable, efficient or cost-effective. There is a need to automate.

Automation of this process will require a new tool set. Things to consider as part of an overall solution may include out of the box regulatory content, platform coverage, ability to absorb procedural controls, risk management capabilities, vulnerability assessment, ability to digest 3rd party data, and finally, scheduled reporting and dashboarding capabilities.  Once the product is selected and implemented the first step is to select a suitable out of the box technical standard to measure against and the next stage in the journey is now well underway.

The next challenge to address is prioritization. Selecting an out-of-the-box technical standard allows you to get up and running, gathering vast quantities of data in a very short space of time, but it does have drawbacks. With all the data now being collected effectively and efficiently, there is a need to identify and, more importantly, prioritize the security issues that need to be fixed, to ensure IT resources are focused on the things that matter most.

Part of the same challenge is how Information Security communicates audit findings or remediation requests to IT Ops. There is no point in simply raising a never-ending series of Help Desk tickets or issuing a 300-page ‘paperweight’ of a report. That result will leave you still sitting towards the lower end of the maturity curve, albeit slightly further on than when you started, but still with no clear view of where your risks and priorities lie. However, if you use risk to filter and prioritize the actions of the IT Operations team, then you are clearly moving towards the top end of the curve.

With this in mind, you should now be looking at building a sustainable program that develops the use of the product that has been procured. It should be one that allows you to quickly adapt to new regulations and changes in the threat landscape and, importantly, to refine the technical standards that are used and consequently the data they collect from the targeted environment. This refinement could take the form of customisation of the out-of-the-box technical standards so they are more in line with the business needs or, alternatively, developing an in-house custom technical standard from scratch. The approach chosen will generally be governed by the maturity and size of the company itself.

Financial institutions are a prime example of this. They have developed their own internal policies and technical standards, and have employees whose prime responsibility is to maintain and develop them. In my last year in such a role, an in-house technical standard was developed using our toolset of choice. It was aligned with the paper-based platform standard and could perform a full Security Acceptance Test on a server and deliver extremely accurate results in less than 20 minutes. If there were more than 15-20 findings in the report, it was fairly certain that the server in question had either a) not been built using the Gold Build media or b) had settings changed after it was commissioned. This was a huge difference from the two-day, error-prone process that had existed just a few years previously. It was by no means an overnight change but one that was refined and developed gradually.

By this stage you should be looking to build security into everything you do: approaching it in a systematic, sustainable way rather than creating one-off events, trying to get ahead of the curve and addressing weaknesses proactively. You should also be exploring programs that are able to deal with other aspects, such as vendor risk management, or you may find that the toolset already procured has that capability built in or available as an additional module.

At the top end of the maturity curve you will be able to demonstrate the relevance of security to the broader business. Information Security is no longer siloed but is regarded as relevant and engaged with the rest of the organisation. It is no longer a last-minute thought in a project; instead it becomes fundamental, playing an advisory role that helps the business units invest their money where it makes most sense to protect the key and critical assets while also addressing the risk appetite. Depending on the approach taken during the evaluation and procurement process, you may find that you already have Risk Management functionality available to you.

It is well worth remembering that this is by no means an overnight transformation, but one that can take a number of years to reach maturity. Finally, it should be noted that while many aspire to reach this goal, very few organizations do; most find instead that they address all the requirements somewhere in the top half of the maturity curve.

Cloud Encryption – Who’s Really Responsible?


There's a growing buzz in the industry about who should be responsible for encryption in the cloud from a user perspective. As usual, the technology to do this is not the hard part (crypto is crypto is crypto). It's really more of a privacy and legal issue: privacy from the perspective of preventing others from seeing your stuff in the cloud, and legal from the perspective of who has control over the data that is secured in the cloud.
 
I think we all get the idea of privacy of our data in the cloud. For example, if you put your personal financial data in the cloud, either to be stored or to be used by an application, you want to make sure the data is secure. If it's just storage, you can personally encrypt the data before you store it in the cloud using encryption solutions like PGP. If you're lucky enough to have a cloud provider that encrypts it for you but gives you complete control over the encryption keys, such that the provider doesn't hold any keys, then that's much easier and better than buying your own encryption technology, and likely faster. These are the simple approaches but, in the case of the second example, may not be the easiest to achieve or really understand.
 
Here's where the legal issues start getting a bit interesting. This has to do with who owns the encryption keys (cloud provider or user) and what that ownership gives the owner the right to do with those keys and the data they protect. In the examples above, it's pretty straightforward: the user owns the encryption keys, and there's nothing the cloud provider can do with your data, short of deleting it, archiving it, or locking your account. At the end of the day it's just a jumbled blob of data.
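A toy Python illustration of that point. This is NOT real cryptography (use a vetted library in practice), but it shows why only the key holder can turn the blob back into data:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """One-time-pad-style XOR; encrypting and decrypting are the same step."""
    return bytes(b ^ k for b, k in zip(data, key))

plaintext = b"my personal financial data"
user_key = secrets.token_bytes(len(plaintext))  # the key stays with the user

blob_in_cloud = xor_cipher(plaintext, user_key)  # all the provider ever stores
recovered = xor_cipher(blob_in_cloud, user_key)  # only the key holder can do this

print(recovered == plaintext)  # True
```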

If, however, the cloud provider controls the encryption keys, then the cloud provider has as much control over your data as you do. In fact, the provider has more control, since it can change the encryption keys at any time, potentially view your data without you knowing or, worse, lock your account and remove your access to the data. You now have your personal sensitive data in the cloud, available for misuse or sale to third parties; and if a hacker gets into the cloud provider's systems, they can probably find the keys and do what they want with the data.
 
The other interesting (or confusing) dimension is where law enforcement comes into play. If, for some reason, law enforcement determines it must have access to the data stored in the cloud, and the cloud provider holds the encryption keys and your account password, then the provider could be compelled to give them access to any and all data. Keep in mind, this may have nothing to do with you as an individual; it could be a broader need by law enforcement to search for something in order to discover who is at the other end of whatever case they’re working on. In any case, you may not have control over who’s passing out or providing access to your sensitive data. Now your sensitive data is potentially in the hands of folks outside the cloud provider, and either is or isn’t adequately protected.
 
Clearly, what I've described above is a worst-case scenario, but when it comes to protecting our own personal sensitive data, we all need to consider what the worst case would look like. Over the years we've seen a lot of theoretical security scenarios become reality since the advent of worldwide use of the Internet and the ability to share data. The lesson here is that if you're storing your own personal sensitive data with a cloud provider, make sure that only you own the encryption keys to your data.

Android.Exprespam Potentially Infects Thousands of Devices


Android.Exprespam was discovered at the beginning of January and has only been around for about two weeks, but the scammers seem to be having a lot of success with the malware already. Symantec has acquired some data that has allowed us to get an idea of how successful Exprespam may be in scamming Android users into providing personal data. The data obtained, which is only a portion of the complete data, indicates that the fake market called Android Express’s Play has drawn well over 3,000 visits in a period of one week, from January 13 to January 20.

Based on several sources*, I calculated that the scammers may have stolen between 75,000 and 450,000 pieces of personal information.

Figure 1. Potential amount of stolen information

The scam has only been around for about two weeks so I am sure that this is just the beginning for the scammers and the amount of personal data collected will increase exponentially. As proof of this, we have found yet another domain registered by the creators of Exprespam and they also created another version of their fake market on the new domain. This time, they have decided to not give the market a name or provide the name of the party maintaining the market. At the time of writing, the new market does not appear to be in active use yet and may currently be under construction or on standby but that has not stopped the scammers as a new malware variant is already being hosted on the site.

Figure 2. Various fake app markets used by the Exprespam scammers

As you can see through the series of Exprespam blogs I have written, the scammers are constantly modifying their tactics so that the scam provides a good “return” for them. These updates will not end until the scammers are either caught by the authorities and punished, or cease scamming people, which is unlikely to happen anytime soon. By now, hopefully most readers who have been following this blog series are familiar enough with this scam to avoid downloading and installing this malware.

Android users can stay safe by avoiding links in emails you receive from unknown sources, by downloading apps from well-known and trusted app vendors, and by installing a security app, such as Norton Mobile Security or Symantec Mobile Security, on the device.  For general smartphone and tablet safety tips, please visit our Mobile Security website.

* To estimate how much personal information may have been stolen, I combined the number of visits to Gcogle Play, the original fake market for Exprespam, and the new market. I am guessing the number of visits to Gcogle Play to be 2,000 as this site was live for the same number of days as Android Express’s Play.  I then calculated the number of contacts on average in each compromised device by taking the total number of contact details stolen by the malware, Android.Dougalek (aka the Movie malware) and Android.Ackposts, and then dividing it by the total number of infections (according to media reports, Dougalek stole about 11.8 million pieces of personal data from 90,000 devices, and Ackposts stole about four million pieces of personal data from 18,000 devices). That figure is 150 pieces of personal information per device.

To arrive at a conservative estimate, I assumed that only a small number—one in ten—of visitors may have actually downloaded and installed the malicious app for a total of 500 infections.

Conversely, if I assume that the number of users actually downloading and installing the app after visiting the site is about 3,000, we arrive at a much larger figure. Both calculations are shown in Figure 1.

I would like to note that this is not the number of unique contacts stolen. Furthermore, these numbers are just estimates to give a better understanding of the scale of the scam. As we do not have the complete data, the actual number is more than likely greater than my estimates.
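The arithmetic behind these estimates can be laid out in a few lines of Python. Note that the visitor and install counts are the assumptions stated above, not measured figures:

```python
# Average contact entries harvested per compromised device, derived from
# media-reported figures for two earlier Android threats.
dougalek_items, dougalek_devices = 11_800_000, 90_000   # Android.Dougalek
ackposts_items, ackposts_devices = 4_000_000, 18_000    # Android.Ackposts

per_device = (dougalek_items + ackposts_items) / (dougalek_devices + ackposts_devices)
print(round(per_device))        # ~146, rounded to 150 in the text

PER_DEVICE = 150

# Conservative scenario: one in ten visitors installs the app, ~500 infections.
low_estimate = 500 * PER_DEVICE         # 75,000 pieces of personal data

# Upper scenario: around 3,000 installs.
high_estimate = 3_000 * PER_DEVICE      # 450,000 pieces of personal data
print(low_estimate, high_estimate)
```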


ServiceDesk 7.5 Webinar January 24 - Join Us!


Attend this webinar to learn about the new configuration features and capabilities available in ServiceDesk 7.5 for quick, effective remediation of end-user incidents, system problems, and change management.

Topics / Agenda:

Incident Classification
Service Queues and Routing Rules
SLA Management
Automated Notifications
Rulesets
Best Practices

WebEx details will be sent prior to the webcast event.

Register here: http://bit.ly/VhId6o

Introducing Enterprise Vault Office Mail Apps for Exchange and Office 2013


In previous versions of Exchange OWA, Enterprise Vault has delivered its integration with OWA by dynamically modifying the web content served up by Exchange, an approach that was fraught with issues around setup and configuration, as well as requiring software to be installed on the Exchange server itself.

In EV 10.0.3 we are pleased to be able to announce our integration with Exchange OWA 2013 and Microsoft Office 2013 using the new Office Mail Apps feature. By going down this route we are using integration points recommended by Microsoft, as well as significantly simplifying the configuration of OWA for use with Enterprise Vault.

In addition, delivering the EV integration as an Office Mail App makes it easier for us to support the range of browsers and form factors supported by Microsoft OWA 2013. The initial release does not yet cover the full range of browsers across desktops, tablets and phones; we plan to extend this coverage in due course.

For more detailed information, including videos demonstrating the new EV OWA Mail App functionality and showing how easy it is to deploy, use and troubleshoot, see http://www.symantec.com/docs/HOWTO83364 and the corresponding feature briefing: http://www.symantec.com/docs/DOC6304

1/22 Customer Webcast: Security Insights at Your Finger Tips with SPC Mobile


Please join us for a customer webcast on January 22, 2013 on Symantec Protection Center Mobile/Enterprise, Symantec's next generation security management solution.

To register:

https://symantecevents.verite.com/27995/55102

 

WEBCAST DETAILS

Security Insights at Your Finger Tips with SPC Mobile

Date:
Tuesday, January 22

Time:
10:00 AM PT / 1:00 PM ET

Presented by:
Joe Bertnick, Director of Product Management, SPC
Shishir Agrawal, Sr. Product Marketing Manager, SPC

Abstract

Security professionals are grappling with information overload as the volume of data captured increases in light of more rigorous compliance demands, increased infrastructure complexity, and a challenging threat landscape.

To address these challenges, Symantec recently announced the next generation Security Management Solution, Symantec Protection Center (SPC) Mobile for IT security leaders.

In this webcast, you will learn more about SPC Mobile, an exciting new iPad app for security leaders, and see a live demo of how you can:

  • Have a single, comprehensive view of your security program
  • Clearly communicate IT security status to all business stakeholders
  • Demonstrate the value of your existing IT security investments
  • Show business-relevant metrics for your security solutions
  • Identify which emerging threats might impact your business

 

Thanks

Shishir Agrawal

Sr. Product Marketing Manager, SPC

File System Archiving now supports Windows Server 2012


 

Hopefully you have seen our announcement that Enterprise Vault 10.0.3 is now generally available. This is an important release for us as it brings support for the latest wave of Microsoft products. I’d like to take a little time to talk about the File System Archiving (FSA) support for Windows Server 2012.

Windows Server 2012 brings a wealth of new storage features to the table that will greatly help a Windows or storage administrator. However, one inevitability of life these days is ever-increasing data volume. While Windows Server 2012 will let you scale to greater storage capacities, it does little to actually curtail data growth.

One of the new features of Windows Server 2012 is block-level deduplication for NTFS volumes. This lets you squeeze more files onto your disks, but it really just buys you time before those volumes fill up. Deduplication here works on a volume-by-volume basis (not globally) and, according to Microsoft’s own numbers, will likely only gain you around 30-50% additional capacity for user shares.
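As a rough illustration of what that 30-50% figure means in practice, here is a back-of-the-envelope calculation; the 1 TB volume size is an arbitrary example, not a figure from Microsoft:

```python
def effective_capacity(volume_gb: float, savings_ratio: float) -> float:
    """Logical data that fits on a volume, given the fraction of stored
    bytes that deduplication eliminates (Microsoft cites roughly
    0.30-0.50 for user shares)."""
    return volume_gb / (1.0 - savings_ratio)

# A 1 TB (1,000 GB) user-share volume:
print(round(effective_capacity(1000, 0.30)))   # ~1429 GB of logical data
print(round(effective_capacity(1000, 0.50)))   # 2000 GB of logical data
```

Useful headroom, but a one-time gain rather than a brake on growth.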

I’ve been asked recently whether Windows deduplication would negate the need for File System Archiving. I would strongly argue that while Windows deduplication is nice to have, it does not negate the need for FSA:

  • With the majority of data not being accessed for 6 months or more, why leave it on primary storage? Bringing it into Enterprise Vault will free up far more storage than Windows deduplication and may remove the need to enable Windows deduplication in the first place. It will also make it far less likely that your volumes become full.
  • Enterprise Vault applies retention to files that are archived. This means you do not have the files sitting around for eternity, but that they will be automatically deleted once they have passed their useful working life. Windows has no native retention capabilities.
  • Enterprise Vault provides global deduplication of data. Not only will we deduplicate the same files, irrespective of how many folders, shares and servers we see it on, we will also deduplicate it if we see it as an attachment in an email or stored within a library on SharePoint. Data within Enterprise Vault is also stored compressed. What this all means is that your storage footprint within Enterprise Vault is going to be considerably smaller than just enabling Windows deduplication.
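To make the distinction between per-volume and global deduplication concrete, here is a toy sketch of content-addressed single-instance storage: identical content is stored once regardless of where it was seen. This is purely illustrative and does not reflect Enterprise Vault's actual storage format; the class and the location strings are invented for the example:

```python
import hashlib

class SingleInstanceStore:
    """Toy global deduplication: content is keyed by its SHA-256 digest,
    so identical bytes are stored once no matter how many file shares,
    mailboxes or SharePoint libraries reference them."""

    def __init__(self):
        self._blobs = {}   # digest -> content (stored at most once)
        self._refs = {}    # digest -> list of logical locations

    def archive(self, location: str, content: bytes) -> str:
        digest = hashlib.sha256(content).hexdigest()
        self._blobs.setdefault(digest, content)
        self._refs.setdefault(digest, []).append(location)
        return digest

    def stored_bytes(self) -> int:
        return sum(len(b) for b in self._blobs.values())

store = SingleInstanceStore()
report = b"Q4 figures..."
store.archive(r"\\fs01\shares\finance\report.xlsx", report)
store.archive("mailbox:alice/inbox/attachment", report)
store.archive("sharepoint:/sites/finance/report.xlsx", report)
print(store.stored_bytes())   # content stored once, referenced three times
```

Per-volume deduplication, by contrast, would keep one copy on each of the three stores.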

There are many more benefits of using Enterprise Vault (indexing, eDiscovery readiness, retention folders, a wide choice of archival storage, and so on), but I wanted to call out how I see File System Archiving and its relevance to Windows Server 2012 and deduplication. If you have decided to implement Windows deduplication, the good news is that you can still use File System Archiving against those volumes. We have engineered the product to ensure the two work together.

One last footnote before I sign off: in Enterprise Vault 10.0.2 we introduced support for File System Archiving on Windows Server 2008 R2 Core. In 10.0.3 that support carries through to Windows Server 2012 and also adds the ability to use File Blocking and FSA Reporting on Core servers. Because it is now much easier to switch from the full GUI installation of Windows Server 2012 to Server Core, we anticipate many more customers deploying Core, so this is a welcome addition to our platform support.

To find out more about File System Archiving and Windows Server 2012, then check out the feature briefing that has just been published. 

Symantec Intelligence Report: December 2012


The December Symantec Intelligence Report takes a close look at Symantec's global statistics on the threat landscape. In December, the United States earned the dubious distinction of being the world's largest source of spam (12.7%), phishing scams (24.2%), and virus-laden email attachments (40.9%). It is not unusual for the US to top one or two of these three categories, but sweeping all three is somewhat exceptional.

In other country news, Norway has emerged as the nation most frequently targeted by phishing attacks: in December, 1 in every 81.4 emails in Norway was identified as a phishing attempt. Norway also ranked second as a source of phishing, accounting for 20.2% of phishing attacks worldwide. The reasons a single country spikes like this can vary, but a well-coordinated phishing campaign by attackers is the most likely explanation. Symantec is continuing to monitor this surge and will report on it again next month.

Elsewhere, the sex/dating category dominated global spam volume, accounting for a full 82.6% of all spam. Sex/dating often edges out pharmaceutical spam for first place, but the margin is rarely this wide. Banking was once again the industry most targeted by phishing, drawing 65% of all attacks in December. Finally, more than 80% of all adware was caught by generic detections. This is not especially surprising in itself, since Symantec catches the majority of such programs with generic detections, but the percentage is rarely this high. It suggests that adware authors were not attempting anything new or unique; most likely they were taking a break over the holiday season.

Download this month's Symantec Intelligence Report for the full details.
