SIEM 2.0 - Log Management Evolved


This work is licensed under the Creative Commons Attribution-ShareAlike 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by-sa/4.0/deed.en_US.



    "#$%& '( )'*+&*+,

    "#$%&'( )*+*'(',* -

    *+.'+/+0) -

    %1)2,')) )3',+#2$ +,4 *0"23+& #'512#'(',*) 6

    "&+,+7"#'8',*2$, 6"&+, %74'*'3*2$, 6"&+, 372,8')*29+*2$, :&$9 (+,+9'(',* 3;+&&',9') ',)1#' )'31#2*0 2,*'&&29',3' )0)*'() +#' "&+3'4 2, *;' +""#$"#2+*' )'31#2*0 4$(+2, ?>

    ',)1#' '@@'3*28' +33')) 3$,*#$&) ?>',)1#' %29 4+*+ 2) ;+,4&'4 %0 *#+2,'4 +,+&0)*) ?>',)1#' '("&$0'' 3$((1,23+*2$,) +,4 31)*$('# 3$((1,23+*2$,) #'2,@$#3' *;' ',*'#"#2)' 3$((2*(',*

    *$ 31)*$('# "#$*'3*2$,A'("&$0'' "#28+30A+,4 2,@$#(+*2$, )'31#2*0 ;092',' BC


Problem Statement

Enterprises are experiencing an increasing incidence of cyber-attacks on their infrastructure, applications and data sources. Previous approaches have attempted to categorise these into the external attacker and the insider threat. Due to increasing attacker sophistication, the architectural shift to transacting information across organisational boundaries, and the growth in demands such as BYOD, these definitions are coalescing. When searching for signs of a breach, a security analyst is typically investigating not only the known threat reported by rule-based systems, but increasingly the unknown threat: anomalous behaviours previously hidden in credentialed activity, disguised by large volumes of machine-to-machine and human-to-machine transactions, and only observable using analytics on large data sets. Security Intelligence deployment should be incorporated into the overall enterprise security improvement programme, and care should be taken to ensure that information protection risks and privacy concerns are addressed.

Takeaways

• The current drive around rapid detection focuses heavily on re-introducing SIEM as a key tool in the information security architecture. Effective security incident and event management requires not only strong technological solutions, but also expert analysts to distill events that matter from the slew of reported information.

• The fundamental challenges of log management do not change, and one of these is that this is an expensive process that requires a rare breed of analyst, with a mix of technical security knowledge, awareness of business priorities, and the focus to derive a complex threat pattern from multiple sources. Since SIEM is a definable process (inputs, outputs and KPIs), it is suitable for out-tasking to a specialized company.

• Generation 1.0 SIEM was notorious for failing to deal with a high load of data directed at it, particularly when endpoint systems (such as workstations) were added. Generation 2.0 specifically sets out to address this, doing away with the concept of normalization and instead determinedly embracing the stream of what can decidedly be described as Big Data, consisting of all internal events across the enterprise.


Business Scenario and Typical Requirements

As the complexity of the overall IT architecture grows, incorporating new and existing (plus legacy) applications into its overall integration mix, the amount of transactional information has grown beyond the capability of traditional log recording and SIEM systems. When other security-relevant sources are included, such as infrastructure components, desktops, and even relevant social media sources, HR systems, and physical security systems, the amount of data can quickly overwhelm a traditional SIEM based around relational databases.

While Information Security is decidedly an abstract concept, noticeable more by its absence than its presence (for example during a breach), a mature security organization will have three primary phases to its security posture.

Plan A - Prevention

Prevention starts with a detailed understanding of the threat model that the organization (or business process) is subject to. Given a good threat model, it is then possible to visualize the preventive controls that must be applied. Preventative controls are commonly applied during the architecture, design and implementation phases, although as the threat model changes it is important that it is possible to modify, add or even remove controls. Operational controls include patching, penetration testing, firewalling, whitelisting, endpoint security and the like. Increasingly, the software development lifecycle is coming under scrutiny, with requirements such as code analysis and penetration testing being considered a minimum safe standard. Most important of all are applications made available publicly, including mobile and web: these are becoming prime fraud vectors for a monetised and motivated attack group.

Risk Management can be a thorny and unpleasant task: evaluating each threat in the model and deciding whether there is an effective return on investment in mitigating it. Because of the asymptotic nature of the typical impact graph, actuarial approaches such as annualised loss expectancy are rarely effective in the heat and thunder of an actual breach.
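For context, annualised loss expectancy is conventionally defined (this is standard risk-management practice rather than anything specific to this paper) as the product of the single loss expectancy (SLE) and the annualised rate of occurrence (ARO):

\[
\text{ALE} = \text{SLE} \times \text{ARO}
\]

For a catastrophic breach the SLE is extreme while the ARO is near zero and poorly estimated, so the product is dominated by estimation error, which is why such figures offer little guidance during an actual incident.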

Plan B - Detection

Operationally, it is expected that a publicly launched service will attract users with a wide variety of motivations, some of whom will be malicious. Hence it is important that the design controls are backed up by robust operational processes. In the context of this paper, it is not enough to design and specify an effective SIEM; it must be operated competently and diligently, which is a resource-intensive task requiring skilled staff. The SIEM solution must be able to cope with not only expected threats, but also Black Swan security events. As an example, such events can include a zero-day vulnerability in WebSphere, Cold Fusion, or Java, or even a flaw in a PIN entry terminal. Therefore a key part of a detection strategy is anticipating the failure of technical controls, from which it logically follows that the capability to detect early is an advantage. It must be emphasised that third party notification, whether by a key customer, your bank, or a random security researcher, is a failure of the detection strategy. Having the ability to look at one dashboard


    showing "wierd" events, and correlating antivirus, system event logs, IDS and vulnerability scans

    can massively speed detection, allowing the creation of an intruder's dilemma1and the limitation of

    attacks before they become breaches.

    Its important to note that detection tools need not necessarily involve huge capital expenditure. A

    variety of tools exist in both the open source world, including OSSEC, suricata, Snort, and Syslog-

    NG. A limitation of open source has traditionally been the lack of a dashboard, which can highlight

    important events and trends using visualisations, however frameworks such as ELSA now provide

    this functionality.

    Effective detection requires detailed understanding of application design and behaviours, in order

    to notice behaviours which are anomalous, such as a sudden access to a file-sharing service.

Plan C - Investigation

With this resilient model for operational security management, it is logical to expect that detection will fail, and that an attacker will breach the organisation. A generalised failing amongst inexperienced security managers is to assume that because the prevention and detection stages of the security model have apparently been effective, it is safe to neglect the development of an investigative capability, and even to make bold statements such as "If this ever happens, we'll go to the market and use external resources" - in other words, "We never think it's going to happen to us".

It should be emphasised that running a procurement exercise whilst in the throes of a crisis is extremely challenging, and the temptation to simply take the lowest cost option (or the one recommended by a third party) can be highly damaging. External resources engaged ad hoc will lack a detailed understanding of the enterprise incident and event management strategy, and will rarely have the focus of bringing the organisation out of a state of crisis and back into normal operation, focusing instead on tactical delivery points.

Fortunately for the organisation that has effective preventive controls and strong detection measures, the investigative process is significantly eased and facilitated by these. Edge cases and unusual behaviours will provide key indicators in establishing a timeline of events leading up to (and following) a breach event. Like any disaster readiness plan, a good incident response readiness plan must be dry-tested.

As SIEM evolves into Enterprise Security Intelligence, vendors are touting techniques and tools used in other Big Data applications; it is therefore important to define what this actually means. Big Data is defined as a three-V problem, that is:

• Big Volume: more than can be handled by traditional SQL-based database management technology



• Big Velocity: styled as "drinking from the fire hose"; indicates that the stream of data is constant and rapid

• Big Variety: data comes from a variety of sources, and includes both structured and unstructured elements

It becomes clear from this that one constraint on the success of Big Data in security is the quality of the Big Analytics that can be applied to it. No vendor on the market is making a claim to reduce the demands for strong analytical and statistical skills from the human analysts who are responsible for producing actionable intelligence. The errors of false correlation/causation and of null hypothesis testing (both false positive and false negative) are well known in many fields.

Therefore we can define Enterprise Security Intelligence as the collection of data from all IT systems in the enterprise that could be security relevant, and the application of the security team's knowledge and skill, resulting in risk reduction.

The SANS log management survey (2012) identified the top challenges in log management as:

• Identification of key events from normal background activity
• Correlation of information from multiple sources to meet complex threats
• Lack of analytics capabilities
• Data normalisation at collection

Data sources are many and varied, and may include both internal and external feeds, for example:

• Application data
• DHCP/DNS
• Netflow
• VPN
• Physical Security
• GPS
• AD/LDAP
• Social Media

It is clear from observing the example data sources above that much of the information in the analyst's scope will have at the very least a personally identifiable information (PII) characteristic to it. Potentially, application data could include data subject to specific compliance regimes such as PCI-DSS, Sarbanes-Oxley, Basel, or others.


Log Management Challenges

In 2012, the SANS organisation conducted a survey of its members asking "What are the top three challenges you face in integrating logs with other tools in your organisation's overall information infrastructure?"

What we can derive from this is that there is a common pain felt amongst the user community at the inability to get timely, relevant information. In addition, log managers are coming to grips with the unfortunate reality that having technical solutions in place does not obviate the need for expert analysts: the tools enhance, and do not replace, human expertise.

Another noticeable issue is that of data reduction and normalization. A typical SIEM will only report useful information on a subset of all security-relevant information in an organization. SIEMs largely correlate machine-to-machine data into "known" events, which means that "normal" user and machine data will be filtered out of the information set, and hence unknown threats requiring analytics will be missed.

In essence then, traditional SIEM architectures, whether managed internally or externally, require that the IT manager know what is required for investigation before the need emerges. It is also true that security-relevant data can come from anywhere, not just from sources specified by the vendor. The lack of scalability inherent in the funnel SIEM architectural model restricts visibility by design, and means that the events generated will tell the IT manager what is known, but not what is unknown. In the area of cold case forensic investigations, which includes many fraud investigations and data breaches, the data abstraction means that it is impossible to examine the original data generated at the time, and that all that is left to the investigator are the event reports which failed to alert on a breach incident in the first place.

Looking at the cold case

As mentioned earlier, much digital forensics work can be described as a cold case investigation. When a forensic investigations team is called in, it can be many months after the original data breach. In many cases, log event data is simply unavailable due to failure on the part of IT management to ensure it is kept (which is a violation of several compliance


standards, including PCI DSS²). This not only impedes the investigation, but also, where a regulatory authority is assessing fines, can have a substantially negative impact.

Another challenge to the investigator is that, when viewing a condensed or normalised event log, it is impossible to sensibly interpolate new data into old security events. Thus reinvestigating the breach in the light of new information, such as recently disclosed vulnerabilities, becomes a hard problem.

The emergent mindset in investigative analysis involves a mix of convergent and divergent thinking, combining pattern matching and statistical methods with the traditional method of un-concealing. Baselines of activity are created, and then deviations from these norms, measured in standard deviations, are investigated. Abnormalities in user activity can be investigated using the following combinators:

• Location
• Role
• Data/Asset Type
• Time of Day
• Action Type
• How long the action took
• Data/Asset Criticality

² While researching this article, it emerged that the number of IT managers who are still using PCI DSS as a driver is dramatically reducing. IT expenditure boards widely regard PCI DSS as overly prescriptive and unwieldy; hence it is seen less as a business driver, more as a regulatory constraint that must be managed to minimize business impact.

"Normative statistical analysis is the most important thing you can do."

- Patrick Reidy, CISO, FBI
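The "standard deviations of these norms" approach described above can be made concrete with a small example. The sketch below is illustrative only, not taken from the paper; the metric (daily bytes out) and the 2.5-standard-deviation threshold are assumptions:

```python
# Illustrative sketch: flag an observation that deviates from a user's
# historic baseline by more than 2.5 standard deviations. The metric and
# the threshold are assumptions made for illustration.
from statistics import mean, stdev

def is_anomalous(new_value, baseline, threshold=2.5):
    """baseline: historic measurements for one user, e.g. bytes out per day."""
    if len(baseline) < 2:
        return False  # too little history to establish a norm
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return new_value != mu
    return abs(new_value - mu) / sigma > threshold

history = [120, 130, 125, 118, 127]   # a user's normal daily bytes out (KB)
print(is_anomalous(126, history))     # False: within the established norm
print(is_anomalous(5000, history))    # True: a candidate for investigation
```

In practice each of the combinators listed above (location, role, time of day, and so on) yields its own baseline, and deviations across several combinators at once give the strongest signal.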


Architectural Context

In order to effectively utilise and safeguard a security intelligence programme, it is important to be able to place it not only within the context of a security improvement programme but also within a coherent security architecture. The specifics of security architecture will be based on the requirements of the enterprise and the IT services delivered, and also on the capability maturity of the enterprise at a point in time. As a cautionary note, it is important that a realistic picture of the enterprise's maturity as regards information security is measured and understood. Understating the enterprise maturity leads to a lack of confidence in the ability to deliver and operate, while overstatement leads to embarkation on ambitious projects, with over-reliance on vendors, ultimately leading to abrogation of leadership in the delivery context.

Our data inclusion model will look radically different to one used for traditional SIEM. No up-front normalisation, time-indexed data, analytics and statistics commands, correlation, and pattern analysis will all feature in the model, in somewhat sharp contrast to traditional SIEM event funnelling.

Specific points in the security architecture that will directly support the development and delivery of a security intelligence capability include:

Effective Information Lifecycle Management, in particular data classification. This should answer the following questions, which will identify key information assets:

• What data is available?
• Where is it located?
• What access levels are currently implemented?
• What protection level is implemented, and does it adhere to relevant compliance regulations?

Help (and not hinder) privacy efforts. A security intelligence capability will process sensitive employee-related information; therefore, in addition to the usual access controls and data safeguards, it is important to consider the effect of this capability on employee morale, management decisions, and indeed on brand identity. A security intelligence capability should not be used instead of effective management communications, employee leadership and workplace ethics; instead it should support and enforce strong practices in these areas.

Aid regulatory reporting. Security intelligence should provide actionable reports that improve the enterprise compliance stance.

Operate within multiple IT security domains. Security intelligence should be able to consume and rate information from all IT security domains, and incorporate the trust level of the security


domain as a factor. For example, sentiment on Twitter in the time frame before an attack on the enterprise might also be correlated with marketing and sales campaigns.

Incorporate lessons learned and best practices, both internally and externally. The SANS log management survey referenced above reveals several key requirements that may be expected of a security intelligence programme. In particular, effective security intelligence requires raw data that has not been subject to normalising at the point of collection. The availability of this raw data allows for retrospective analysis of incident data, and for correlation and analysis with previously unavailable information sources.


Proposed Solutions

A security intelligence solution can broadly be seen as an evolution of the SIEM: addressing the issues of integrating various point security products, allowing processing and correlation of data in real time, improving the real-time security posture, and reducing the costs of remediation in the event of a breach by quickly identifying and mitigating breach damage.

Gartner have produced their magic quadrant of SIEM vendors, which may add value to the product research process. It is not the purpose of this paper to provide vendor analysis, although the emergence of visionary product solutions which draw heavily from open source development is of great interest.

A SIEM deployment framework can be defined as follows:

[Diagram: the SIEM deployment cycle - Plan, Deploy, Operate, Evolve]

PLAN

In the planning stage, we can define the data sources (in broad sweep).

An ever-present challenge to any enterprise is the data itself; increasingly, the enterprise runs the risk of being overwhelmed by the sheer volume of data that may provide valuable operational and security information. This struggle is at the heart of the recently articulated big data issue. A similar struggle exists with the drive to provide timely, actionable intelligence from this volume of data, and hence the growth of next generation SIEM, providing security intelligence.

DEPLOY

A prevailing question in IT risk management is the likelihood of a particular security event or class of events. This becomes important when considering the tolerance of the enterprise to events which at first look have an impact that threatens the survival of the business. Security intelligence


by its nature purports to give insights into the actual risk exposure by determining patterns of behaviour behind activities and alerting appropriately. However, determining the impact and likelihood of black swan events (high impact, low likelihood) requires skilled analyst input: due to the nature of such events, statistical information is rarely available.

Data may be acquired from a number of sources both internal and external to the enterprise. Such sources include (but are not limited to) the following:

IT Infrastructure

• Network Devices: logs from routers and switches, information from network access control (NAC) tools, and NetFlow data
• Security Devices: logs from firewalls, IPS, and other security appliances
• Servers: log files from servers in data centres and offices; includes physical, virtual and public cloud instances
• User end-points: device information, network context, access history, records of ownership and of losses
• SCADA (Supervisory Control and Data Acquisition) infrastructure: data about the operation of and access to industrial control systems, their network mapping and access history

Access Data

• Databases: access logs. Tools such as Guardium may be used to monitor and control access to database resources, and the events recorded from such a tool are key to an early detection strategy.
• Other data access information: content use monitoring, data loss prevention, and content filtration systems
• Business Applications: access logs for both on-premise and on-demand applications. Application activity logs are frequently overlooked in the rush to SIEM and can provide key insights into unexpected and potentially unauthorized activity.
• Web access data: includes chokepoint information on web uploads and downloads; feeds from DLP tools and web filtration are key to ensuring that this threat vector is accurately analysed by the SIEM
• Email records: who has been sending what to whom? While content analysis is rightfully frowned upon as a privacy violation, traffic analysis using header information can be carried out in a warrantless (and thereby unrestricted) manner, as sketched below.
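As a concrete illustration of header-only traffic analysis, the sketch below (illustrative only; the (sender, recipient) record format is an assumption) surfaces sender/recipient pairs with no precedent in the historic baseline, the same "affinity" signal used for spear phishing detection later in this paper:

```python
# Illustrative sketch: traffic analysis on mail header metadata only,
# flagging sender/recipient pairs never seen in the historic baseline.
# The (sender, recipient) tuple format is an assumption for illustration.
from collections import Counter

def first_contact_pairs(history, window):
    """Count pairs in `window` that never appear in `history`."""
    known = set(history)
    return Counter(pair for pair in window if pair not in known)

history = [("alice@corp.example", "bob@corp.example"),
           ("alice@corp.example", "carol@corp.example")]
window  = [("alice@corp.example", "bob@corp.example"),
           ("mallory@attacker.example", "alice@corp.example")]
print(first_contact_pairs(history, window))
# Counter({('mallory@attacker.example', 'alice@corp.example'): 1})
```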


Vulnerability information

• Third party feeds: from other IT vulnerability assessment and mitigation systems such as Tenable, Rapid7, Qualys and FireEye. With the rapid growth in automated code analysis, it is conceivable that vulnerabilities discovered by such tooling will feed in here as well.
• Software integrity information: patch state of operating systems, firmware, databases and applications, and lists of known flaws
• Known malware: lists of known malware that may be used as part of more complex attacks. Given the rise in targeted malware and new endpoint attacks (such as those against PIN entry devices), it is important that the "knowns" list is kept as up to date as possible. Indeed, it may be that endpoint protection focuses on whitelisting, reporting and preventing software that does not meet the whitelist criteria.

User Information

• User Information: data from directories that defines authorised users and their assigned groups; this includes information about current and past job roles. When correlated with access data and vulnerability information, this becomes a key resource for identifying actors in an incident timeline.
• Access Rights: current access rights for an individual or class of users
• Privileged Access Rights: records of the temporary or permanent assignment of privilege to named users
• Guest Access Rights: information from network access control systems about areas of networks enabled for guest access
• Third Party Access Rights: records of outside organisations and users that have been authorised to access infrastructure and applications
• Machine access rights: not all access is by humans; software applications and devices are also regularly assigned access rights, for example to carry out automated system administration

Other Data

• Change control systems: lists of approved system administration activity, highlighting unexpected activity
• Location Data: IP and cellular geolocation indicating where access requests are originating
• Regulatory/standards-based information: as an example, ISO 27001, which many enterprises have adopted as a baseline
• Industry bodies: providing advice to their members on known complex attacks and how to coordinate defence against them


• Social media feeds: identifying increased levels of sentiment targeting an enterprise. Note that sentiment may be positive or negative, and still be a powerful indicator of threat increase.
• Weather: unusual weather conditions in a certain area may account for observed large-scale changes in user activity
• Time: accurate coordination is not possible without good timekeeping; an accurate source of time is needed across different systems and may be added to records to keep them pertinent over long periods. NTP serves this purpose well; however, sufficient access controls must be applied in the configuration.³

³ See Team Cymru's NTP templates at https://www.team-cymru.org/ReadingRoom/Templates/secure-ntp-template.html


OPERATE

As can be seen, the apparent sensitivity of the various data feeds described above varies from very low to very high. However, in aggregate, and when combined into a descriptive timeline by a trained analyst, there is a clear requirement for strong information protection. Using the classical Confidentiality-Integrity-Availability model, the requirements can be simply stated as follows:

Table 1 - Data Assurance requirements

Data item                   | Confidentiality | Integrity | Availability
Discrete data record        | Varies          | Varies    | Varies
Aggregate dataset           | High            | High      | Medium
Analyst reports             | High            | High      | Medium
Incident resolution reports | High            | High      | Medium

The reasoning behind this is that once aggregated, much of the data will form a user-centric timeline, detailing the activity of human actors through the IT systems in the enterprise. This will include customer activity and employee actions, as well as malfeasance on the part of external attackers and disgruntled insiders. The need for privacy (that is, confidentiality) of this data is therefore high, and is governed by various regulations (for example, the UK Data Protection Act and, in the case of US medical information, the Health Insurance Portability and Accountability Act).

One of the prime goals for a security intelligence programme is risk reduction, accomplished through actions instigated by security analyst reports. Some actions will be relatively low-impact (for example, installing malware tools on BYOD laptops at no charge); others may involve significant incident response intervention and liaison with law enforcement, at far higher cost.

Several solutions exist to the challenge of safeguarding security intelligence source data, resultant analysis, and archived information. The enterprise can make an explicit choice, based on risk analysis of its environment, not to safeguard privacy or implement specific information assurance controls. This may be a sign of the relatively stable and low-risk nature of the business streams the enterprise is engaged in, or perhaps a sign of the relative immaturity of the enterprise with regard to security awareness. It is probable that an enterprise at this low level of maturity will not fully realise the investment in Enterprise Security Intelligence, and thus it should be placed further along the security improvement programme timeline.

Having made a conscious choice to engage in deployment of Enterprise Security Intelligence, and having satisfied architectural constraints and pre-conditions, the architect can then make suitable choices as to the storage of information assets relevant to the new system. These will include both


input and output assets as described above, and will include diligent role-based access controls for analysts, incident responders, IT administrators, risk managers, and IT executive management, all of whom will have different informational needs from the system. Protection of data in transit (stream-based cryptographic systems) is unlikely to generate new requirements as a result of deploying a Big Data system as described; however, it should be recognised that existing data assets that become inputs to the system may have intrinsic protection requirements as a result of enterprise data classification rules or over-arching regulatory frameworks. It is expected that once data is aggregated (or collected), appropriate measures are used to protect it in storage.

Since Big Data Analytics requires pure raw data to be preserved, the most appropriate protection mechanism is a cryptographic one, as opposed to one using tokenisation or truncation. It must be emphasised that while cryptographic techniques are relatively easy to apply, the first-time enterprise adopter in the field of key management frequently faces operational challenges, and best practice guidelines such as ISO 11568 (for financial services) and ISO 11770 (as a more general model) are recommended. It should also be noted that a security intelligence package can use access requests to itself as an input item, thus providing a measure of useful self-protection.
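A minimal sketch of the cryptographic protection described above follows; it is illustrative only, uses the third-party Python "cryptography" package, and deliberately leaves out key management, the hard part addressed by ISO 11568 and ISO 11770:

```python
# Illustrative sketch: symmetric encryption of raw log records at rest,
# preserving the pure raw data for later analytics while protecting it in
# storage. Requires the third-party package:  pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, obtain from a key manager or HSM
fernet = Fernet(key)

raw_record = b"2013-06-01T12:00:03Z src=10.0.0.5 user=jsmith action=login"
stored = fernet.encrypt(raw_record)           # ciphertext written to storage
assert fernet.decrypt(stored) == raw_record   # analysts recover the raw data
```

Unlike tokenisation or truncation, decryption returns the record exactly as collected, which is what retrospective Big Data analysis requires.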

Another decision point when deploying Big Data Analytics is how to communicate the change to stakeholders. Both employee and customer data are in scope of enterprise security intelligence programmes, and there is a risk of significant negative sentiment, with the accompanying risk of brand damage and lowered employee morale, should this be seen as intrusive on privacy. Many enterprises considering the adoption of this technology will be operating in a tightly regulated and controlled environment such as financial services, and thus the change in management style and working conditions will be minor. Many online services from retail to consumer finance make no secret of the extensive tracking of customer activity to drive business processes; however, the all-inclusive use of Big Data analytics specifically to detect and limit breach damage and other attacks may create concern. It is recommended that deployment of Enterprise Security Intelligence is proactively championed by executive management as part of the enterprise commitment to customer safety and privacy, and to assist the employees of the enterprise in defending against bad actors, and that it is made clear that the controls over stored data will enhance employee privacy, rather than lessen it.


EVOLVE

Traditional SIEM

SIEM is without doubt a key component in the modern security manager's toolbox, and can provide indicators of operational and security events. Consultants are frequently asked "What good is SIEM to me?", and so it is worthwhile examining what typical threats are detected. A sample is shown below, together with example sources and watch points.

Threat                         | Phase            | Source              | SIEM Search                   | Why
Spear Phishing                 | Infiltration     | Mail logs           | Affinity of sender            | A spear phishing sender address is unlikely to have communicated previously with the organisation's mail servers
Bad Mail Links                 | Infiltration     | Mail logs           | Domain affinity               | The URL is unlikely to have appeared previously in the organisation's web logs; attackers can be fingerprinted
Low/slow exfiltration          | Exfiltration     | Proxy/Firewall logs | Average bytes per GET         | Small amounts of data leaving in many sessions over time
Form based exfiltration        | Exfiltration     | Proxy logs          | Transaction: POST without GET | Large amounts of data leaving in few sessions; POST without GET implies an automated process
HTTP Command and Control (CnC) | Exfiltration/CnC | Proxy logs          | Long URL without a referrer   | Botnets commonly embed CnC messages in the URL
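As an illustration of the "POST without GET" watch point in the table above, the sketch below (illustrative only; the (client, method, host) record format is an assumption) lists clients posting data to hosts they never fetched pages from:

```python
# Illustrative sketch: find clients issuing POST requests to hosts they
# never fetched with GET - a marker of automated, form-based exfiltration.
# The (client, method, host) record format is an assumption.
from collections import defaultdict

def post_without_get(records):
    gets, posts = defaultdict(set), defaultdict(set)
    for client, method, host in records:
        if method == "GET":
            gets[client].add(host)
        elif method == "POST":
            posts[client].add(host)
    # hosts each client POSTed to without ever having GETted from them
    return {c: hosts - gets[c] for c, hosts in posts.items() if hosts - gets[c]}

log = [("10.0.0.5", "GET",  "intranet.corp.example"),
       ("10.0.0.5", "POST", "intranet.corp.example"),
       ("10.0.0.9", "POST", "dropzone.example")]      # no preceding GET
print(post_without_get(log))   # {'10.0.0.9': {'dropzone.example'}}
```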

All these threats are well understood, and indeed it could be argued that in the modern IT environment, pre-emptive and protective controls are put in place to guard against them. However, some other threats that are still being used to exploit vulnerable IT environments are detailed below, together with the kind of statistical approach that SIEM 2.0 enables.

SIEM 2.0 - Enterprise Security Intelligence

When examining what SIEM has now become, it is also worthwhile to examine the threats that the modern enterprise is experiencing which have previously been below the visibility of the SIEM, and hence ignored.

Threat                 | Phase            | Source                | Statistical Search          | Why
SQL injection          | Infiltration     | Web logs              | len(raw) > +2.5 stddev      | Hackers put SQL commands in the URL; the URL length is therefore standard deviations higher than normal
Password brute forcing | Infiltration     | Authentication logs   | short delta_time            | Automated password guessing tools enter credentials much faster than humanly possible
DNS Exfiltration       | Exfiltration     | DNS and Firewall logs | count > +2.5 stddev         | Hackers exfiltrate data in DNS packets; standard deviations more DNS requests from a single IP
Web Crawling           | Reconnaissance   | Web/FTP logs          | count(src_ip) > +2.5 stddev | Web crawlers copying the web site for comments, passwords and email addresses will be the source IP behind a number of page requests standard deviations higher than normal
Port Knocking          | Exfiltration/CnC | Firewall logs         | count(deny) by ip           | The threat does an inside-out port scan to identify exfiltration points
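The first row of the table can be made concrete. The sketch below (illustrative only; the baseline lengths and threshold are assumptions in the spirit of the "len(raw) +2.5 stddev" search) flags request URLs whose length sits more than 2.5 standard deviations above the baseline mean:

```python
# Illustrative sketch: the "URL length is standard deviations higher than
# normal" search from the table above. Baseline lengths would in practice
# come from known-good web logs; the values here are assumptions.
from statistics import mean, stdev

def long_url_outliers(urls, baseline_lengths, threshold=2.5):
    mu, sigma = mean(baseline_lengths), stdev(baseline_lengths)
    return [u for u in urls if sigma and (len(u) - mu) / sigma > threshold]

baseline = [11, 6, 14, 6, 9, 12, 8, 15, 10, 7]   # lengths of normal URLs
requests = ["/search?q=news",
            "/item?id=1%20UNION%20SELECT%20user,pass%20FROM%20users--"]
print(long_url_outliers(requests, baseline))
# ['/item?id=1%20UNION%20SELECT%20user,pass%20FROM%20users--']
```

The same mean-plus-stddev pattern applies directly to the DNS exfiltration and web crawling rows, with request counts per source IP in place of URL length.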

As can be seen, the use of statistical methods on data sources that would previously have been considered too large for a traditional SIEM setup can yield key information on attackers. Using this threat intelligence wisely allows effective use of incident response resources, including law enforcement, and minimises the breach window between active penetration and detection.

SIEM 2.0 is likely to be a distributed implementation, with data sources being calculated and correlated at many discrete points in the enterprise architecture. The amount of event data is massive, and storing it all at a central point creates significant O(n²) scalability challenges in both network and storage infrastructure. Rather, it is preferred that SIEM 2.0 probe points contain enough intelligence to perform limited correlation and send correlation events back to a central point, while retaining sufficient storage to allow more detailed analysis to be performed.


Recommendations

When architecting, designing and implementing Enterprise Security Intelligence, the following recommendations should be considered:

Ensure infrastructure will cope with increased load

By its nature, large amounts of data will be transmitted to and stored at a central point. This will place higher load on networking and server/storage components, possibly requiring increased use of direct attached storage (DAS), dedicated network topology, and higher throughput components such as routers and firewalls.

Ensure data is classified appropriately

Data will be gathered from diverse sources and re-purposed to provide actionable intelligence. In order to devise appropriate safeguards, this data must be classified effectively. Care must be taken to consider the aggregate effect of new datasets gained by combining previously disparate and apparently unrelated data sets.

Ensure security intelligence systems are placed in the appropriate security domain

Since all information within the enterprise can be considered potentially within the scope of the security intelligence process, the systems themselves should be placed in the IT security management domain, with restrictions and technical controls applied in line with enterprise security policies.

Ensure effective access controls

The information carried within a security intelligence system is deemed highly sensitive. Therefore care should be taken to ensure that access is granted on a least-privilege basis, with separation of duties wherever possible. Role-based access controls must be defined for IT executive management, incident responders, IT system administrators, risk managers, and auditors, and for any other roles with a legitimate business need to derive output from the security intelligence system.

Ensure Big Data is handled by trained analysts

Much of the analytical process is based around the use of statistical methods to determine probability. Given vendors' relatively new presence in this market, it is assumed that the requirement for trained data scientists to provide informed input and analysis will hold. Security intelligence tools will need to make finely balanced decisions to avoid blocking valid but unusual actions by bona fide actors. Therefore, it is important that inexperienced staff not introduce statistical errors and flaws. This in itself may well be a key business justification for investigating the outsourcing of the SIEM process to a solution provider.


Ensure employee communications and customer communications reinforce the enterprise commitment to customer protection, employee privacy, and information security hygiene

As with any technical measure intended to improve the enterprise information security posture, Enterprise Security Intelligence should be deployed as an adjunct to improvements in process and awareness. Given the wide-ranging scope of a Big Data security intelligence tool, it is strongly recommended that the corresponding communication, education and awareness programme be equally wide-ranging.