LAB IC13 Integrating DeepSight Intelligence into 3rd Party Solutions Hands-On Lab
Description: You’ve installed Snort, Splunk, or ArcSight to collect log data. Now what? Session attendees will learn how DeepSight DataFeeds can be integrated into their existing log management solution to identify which threats are relevant and to help prioritize mitigation activities, enabling you to increase the value of your existing investment.
Attendees should be familiar with 3rd party products, data integration processes and technologies such as SOAP.
At the end of this lab, you should be able to:
Understand how to access DeepSight data feeds
Understand the types of 3rd party products DeepSight integrates with and how the integration takes place
Automate DeepSight data feeds with Microsoft Integration Services
Use Symantec integration technologies, such as Symantec Workflow, to assist with integration
Understand how to perform integration with other products, such as Bind9 and MS DNS Services, using the Domain/URL Reputation DataFeed
Notes: A presentation will introduce this lab session and discuss key concepts.
Due to licensing restrictions, the 3rd party products will be demonstrated to the class by the instructor.
The lab will be directed and provide you with step-by-step walkthroughs of key features.
Feel free to follow the lab using the instructions on the following pages. You can optionally perform this lab at your own pace.
Thank you for coming to our lab session.
This LAB is broken into three main sections: working with UNIX and a DNS Sinkhole, then working with Windows and a DNS Sinkhole, and finally a look at using other Symantec tools, such as Workflow, that can assist with integration into an environment. The concept here is to show multiple methods of obtaining data while minimizing the time needed to perform a full development project using standard development tools such as Microsoft C#.
The LAB Instructor will also present numerous topics that will assist you in understanding the other 3rd party products and how they can assist you in your environment.
Using DeepSight to Perform a DNS Sinkhole with MS DNS
In this lab, because of the extra steps involved in setting up processes that integrate with MS SQL, we have prepared only selected steps for you to follow, enough to give you an understanding of how this process works.
Update File for DNS Processing
Ensure you are in the command prompt.
Change directory to C:\Data\DSIntProj
From the prompt type notepad AddZone.vbs and press <enter>
Modify the line with ??????.xml to domainblacklist.txt as indicated below:
Creating Command Script for DNS Processing
From the command prompt, type the following:
Notepad AddZones.cmd
Update the command script with the following lines:
@echo off
echo Processing domainlist....
cscript //NoLogo //U addZone.vbs
echo Processing domainlist complete...
Later this script will be tied into the process we are building here.
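The body of AddZone.vbs is not reproduced in this guide, but its job is to add one sinkhole zone per blacklisted domain to the Microsoft DNS server. As a rough illustration of that step, the sketch below builds one dnscmd /ZoneAdd command per domain in the blacklist file; the zone file name and command layout here are assumptions for illustration, not taken from the lab files.

```python
# Sketch (assumed, not the actual AddZone.vbs): turn each blacklisted
# domain into a dnscmd command that creates a primary zone backed by a
# shared sinkhole zone file.
def build_zone_commands(blacklist_lines, zone_file="malware.zone"):
    commands = []
    for line in blacklist_lines:
        domain = line.strip()
        if not domain:
            continue  # skip blank lines in the blacklist
        commands.append("dnscmd /ZoneAdd %s /Primary /file %s" % (domain, zone_file))
    return commands

if __name__ == "__main__":
    for cmd in build_zone_commands(["badsite.example", "evil.example"]):
        print(cmd)
```

Each generated command could then be executed by the same AddZones.cmd wrapper shown above.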
Create the MS SQL Agent Job for the DTS Package
From your computer Desktop click on Start->SQL Server Management Studio.
Click on Connect to continue logging into the SQL Server.
First, let's take a quick look at the data that was imported into the StagingDB. To do this, drill down to Databases->StagingDB->Tables->dbo.ImportReady, then right click on the table ImportReady and choose Select Top 1000 Rows.
You will see a screen open up on your right with the query command and then the output of the data.
Drill down to the Jobs under the SQL Server Agent.
Right click on the Jobs folder and select New Job…
Give the Job a meaningful name like DeepSight Integration Reputation Feed.
Click on Steps.
Click on the New button located on the bottom middle of the screen.
Modify the Step name to Run DTS Script and then select the Type as Operating system (CmdExec).
Type the following into the Command line area on the screen:
"C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\DTExec.exe"
/FILE "C:\Users\Administrator\Documents\Visual Studio
2008\Projects\DSIntProj\DSIntProj\bin\DSProcess.dtsx" /MAXCONCURRENT "
-1 " /CHECKPOINTING OFF
Click on Advanced and modify the On success action to Quit the job reporting success. Then click the OK button bottom right.
Click on Schedules on the left of the screen.
Then click on New found at the bottom of the screen to setup a New Job Schedule.
Give the Job a name.
Set the frequency and then click OK
You will see a schedule like the following created:
Click OK again to return to the Jobs folder.
Now the Job can be tested by simply right clicking and selecting Start Job at Step…
You will see the Job start executing; it will take some time to finish as it processes the data.
If you execute the command within a command window, you will see the following output; notice each task is operated on, with a % Complete indicated.
This concludes the automation steps required to bring the data from the datafeeds to your system.
Building the DNS SinkHole for Microsoft DNS Server
Start the MS SQL Server Management Studio and access the ImportReady table under the StagingDB database. Right click and run the Select Top 1000 Rows.
You’ll see the following SQL.
Modify this SQL to the following:
SELECT Distinct
[domain]
FROM [StagingDB].[dbo].[ImportReady]
WHERE [reputation] >= 7
and [domain] not like '%.%.%.1%'
and [domain] not like '%.%.%.2%'
and [domain] not like '%.%.%.3%'
and [domain] not like '%.%.%.4%'
and [domain] not like '%.%.%.5%'
and [domain] not like '%.%.%.6%'
and [domain] not like '%.%.%.7%'
and [domain] not like '%.%.%.8%'
and [domain] not like '%.%.%.9%'
and [domain] not like '%.%.%.0%'
Execute the Query to test that it returns a result; it should look like this.
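The ten NOT LIKE clauses are a crude filter for dropping values that are dotted-quad IP addresses rather than domain names: together they exclude any value containing three dots followed somewhere later by a digit. A minimal Python sketch of the same rule (the function name and threshold parameter are illustrative, not part of the lab):

```python
import re

# Equivalent of the ten "not like '%.%.%.<digit>%'" clauses: reject any
# value that contains three dots with a digit somewhere after the third.
_IP_LIKE = re.compile(r'\..*\..*\..*[0-9]')

def keep_domain(domain, reputation, threshold=7):
    """Mirror the WHERE clause: reputation >= 7 and not IP-address-like."""
    return reputation >= threshold and not _IP_LIKE.search(domain)
```

Note that, like the SQL version, this is an approximation: it would also reject a legitimate four-label domain whose last label contains a digit.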
Now we need to create a simple package based on the above query to export to a CSV file. To do this we use MS SQL Server Management Studio to generate a DTS package that we will then call from our project. To start, right click on StagingDB and select Export Data under the Tasks menu.
Click Next
Set the following to the local database:
Select Flat File Destination and type in C:\DSIntProj\domainblacklist.csv
Select to Write a query
Place the query in the form:
SELECT Distinct
[domain]
FROM [StagingDB].[dbo].[ImportReady]
WHERE [reputation] >= 7
and [domain] not like '%.%.%.1%'
and [domain] not like '%.%.%.2%'
and [domain] not like '%.%.%.3%'
and [domain] not like '%.%.%.4%'
and [domain] not like '%.%.%.5%'
and [domain] not like '%.%.%.6%'
and [domain] not like '%.%.%.7%'
and [domain] not like '%.%.%.8%'
and [domain] not like '%.%.%.9%'
and [domain] not like '%.%.%.0%'
Click Next
Select to Save SSIS Package to the File System. Then click Next.
Type CreateBlacklist and notice that a file of the same name will be created within the documents directory. Then click Next.
Click Finish
Look for a successful creation of the domainblacklist.csv file.
Note that the dtsx file was created in the documents directory during the process.
Locate the domainblacklist.csv file in C:\Data\DSIntProj directory.
Now that we have the package automated, we need to go back into the designer and make a couple of updates. We have automated our content to be placed directly into an MS SQL table, which gives us automation tools that we can use to extract the data. Open the SQL Server Business Intelligence Development Studio and then open your project DSIntProj.
Drag the Execute Package Task onto the work area and double click on the task.
You can update the name and description.
Click on Package and then select File System and then <New Connection>
Browse to the Documents directory and select CreateDomainBlackList.dtsx. Then click OK
Check the package by right clicking and then selecting Execute Task
Exit back to the Designer and close the tab for the CreateDomain process.
Link up all the processes like the following.
You now have a process that will take the feed and deliver a text file listing every site in the Domain/URL feed with a reputation of 7 or above. Finally, let's write and run the script needed to update the DNS Server and complete the MS DNS Sinkhole scenario.
First drag another Execute Process Task and place it on the design page, then double click on it.
Fill in the general details.
Update the details by adding AddZones.cmd, setting the working directory to C:\Data\DSIntProj, and indicating that the window be Hidden.
Join your tasks to complete the process.
Try running the process, watching that all pieces are created. Start up the DNS Manager to view the items that were updated. Please note that processing this data could take 5 to 10 minutes depending on the speed of your system.
Using the Symantec Management Platform to Pull Data and Run Reports
Here we will see how to integrate the data that was created by the DeepSight feed into an existing Symantec product, SMP 7.1 (Altiris Symantec Management Platform). Keep in mind that these simple concepts can be extended into Symantec Workflow as well, which provides numerous ways for you to integrate this data into other products that provide API or database calls.
First start up the Symantec Management Console.
Now startup the SQL Server Management Studio.
Locate the Symantec_CMDB database, right click on it, and select Import Data.
Select Next to continue.
Fill out the remote server name NS71 and the database StagingDB.
Select the local database (.) and the database Symantec_CMDB.
Select to Copy the data.
Leave the displayed default settings.
Leave the displayed default settings and click on Next. Please note that you could also save this as an SSIS package. If you remember, in the previous section we ran an SSIS package from another SSIS package; this is a method for creating a simple package that could run on a schedule from this MS SQL Server.
You should see something like the following.
After you execute, you will see the processing screen. This may take a little while to process; you shouldn't see any errors during this phase.
Now switch to the SMP Console and select All Reports.
You should be at the Reports screen. We are going to create a simple report to demonstrate the integration with SMP and the ease with which this can be achieved.
Select to create a new SQL Report.
Delete the SQL Code.
Type the following simple SQL code:
Use Symantec_CMDB
Select * from dbo.ImportReady
Give the report a Name.
Save Changes.
You have just created a simple report within the Symantec Management Platform from the DeepSight datafeed.
Please note that all tasks shown within this LAB can be completely automated.
Thank you for your participation.
The following section is an example of performing the DNS Sinkhole process using Bind9. Please note that this is an advanced process and requires you to understand how to use vi and how to restart services and manipulate files within the Unix environment.
Please modify the VMWare settings appropriately so that this VM can access the Internet.
PLEASE NOTE: We do not expect people to be able to complete the following portion of this LAB during class because of time constraints; it is provided for informational purposes so that you can use and test it in your environment.
Using DeepSight to perform a DNS Sinkhole with Bind9
As the Unix environment here is a server environment, there is no GUI; as such, the first half of this guide will walk through the process, showing the scripts and methods you need to call within the Unix environment to perform these functions.
Prepare the configuration file and pull down Reputation DataFeed Information:
Build the following configuration files:
File 1: GetDataFeeds:
<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
<soap:Body>
<GetDataFeeds xmlns="https://intelligence.symantec.com/">
<loginName>USERNAME</loginName>
<password>PASSWORD</password>
</GetDataFeeds>
</soap:Body>
</soap:Envelope>
File 2: GetBaseline:
<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
<soap:Body>
<GetLatestBaseline xmlns="https://intelligence.symantec.com/">
<loginName>[YOUR USERNAME]</loginName>
<password>[YOUR PASSWORD]</password>
<feedIdentifier>[FEED IDENTIFIER]</feedIdentifier>
</GetLatestBaseline>
</soap:Body>
</soap:Envelope>
Pull down the configuration files:
curl -vvv -H"POST /CustomerDataFeeds.asmx HTTP/1.1" -H"Host: localhost" -H"Content-Type: text/xml; charset=utf-8" -H"SOAPAction:https://intelligence.symantec.com/GetDataFeeds" -d@GetDataFeeds https://intelligence.symantec.com/CustomerDataFeeds.asmx > GetDataFeeds.out
View the Results:
cat GetDataFeeds.out
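If you prefer scripting the pull rather than shelling out to curl, the same SOAP POST can be sketched in Python. The endpoint, content type, and SOAPAction header are taken from the curl command above; the function only builds the request, so you can inspect it before sending it with urllib.request.urlopen. This is an illustrative sketch, not part of the lab files.

```python
import urllib.request

# Endpoint from the curl command above.
ENDPOINT = "https://intelligence.symantec.com/CustomerDataFeeds.asmx"

def build_soap_request(envelope_bytes, soap_action):
    """Build (but do not send) the SOAP POST that curl issues above."""
    return urllib.request.Request(
        ENDPOINT,
        data=envelope_bytes,  # contents of GetDataFeeds / GetLatestBaseline
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": soap_action,
        },
    )
```

To actually perform the call, read the envelope file into bytes, pass it in with the matching SOAPAction, and write the response body to the corresponding .out file.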
Get the Latest Version of the Domain/URL Reputation Feed and Build the Watchlist:
Pull down the latest version of the Domain/URL Reputation Feed:
curl -vvv -H"POST /CustomerDataFeeds.asmx HTTP/1.1" -H"Host: localhost" -H"Content-Type: text/xml; charset=utf-8" -H"SOAPAction:https://intelligence.symantec.com/GetLatestBaseline" -d@GetLatestBaseline https://intelligence.symantec.com/CustomerDataFeeds.asmx > GetLatestBaseLine.out
Verify that the Domain/URL Reputation DataFeed pulled down correctly:
cat GetLatestBaseLine.out
Extract and decode the Domain/URL BaseLine:
cat GetLatestBaseLine.out | sed 's/.*<Contents>\(.*\)<\/Contents>.*/\1/' | base64 -d - > black_list.zip
Unpack the zip file:
unzip black_list.zip
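The sed and base64 pipeline above can also be sketched in Python; this version returns the decoded bytes, which you would write to black_list.zip to match the shell behavior. The function name is illustrative.

```python
import base64
import re

def extract_baseline(soap_response_text):
    """Mirror the sed | base64 -d pipeline: pull the base64 payload out of
    the <Contents> element of the SOAP response and decode it."""
    match = re.search(r'<Contents>(.*)</Contents>', soap_response_text, re.DOTALL)
    if match is None:
        raise ValueError("no <Contents> element in response")
    return base64.b64decode(match.group(1))
```

For example, `open("black_list.zip", "wb").write(extract_baseline(open("GetLatestBaseLine.out").read()))` reproduces the shell version.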
35 of 35
Create the Domain Watchlist:
sed -n -e 's/.*<domain name\=\"\(.*\)" reputation.*/\1/p' [FILE NAME].xml > domain.out
View the Watchlist:
cat domain.out
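The same watchlist extraction can be sketched in Python; the regular expression mirrors the sed expression above, capturing the name attribute of each domain element in the baseline XML.

```python
import re

def extract_domains(xml_text):
    """Mirror the sed expression above: capture the name attribute from
    each <domain name="..." reputation=...> element in the feed XML."""
    return re.findall(r'<domain name="([^"]*)" reputation', xml_text)
```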
Add the Watchlist to BIND and Create the Sinkhole:
Edit named.conf to add a separate entry for the malware zone files:
include "/var/named/bad/reputation_domains.conf";
Create a new zone file called malware.zone:
$TTL 86400
@ 1D IN SOA localhost root (
42 ; serial
24H ; refresh
24H ; retry
24H ; expiry
24H ) ; minimum
IN NS @
* IN A 127.0.0.1
Add in master zone file references around the domain names in the watchlist:
sed 's/^/zone "/;s:$:" IN { type master; file "/var/named/bad/malware.zone";};:' domain.out > reputation_domains.conf
Verify that reputation_domains.conf looks similar to this:
zone "005im9t.dns2.us" IN { type master; file "/var/named/bad/malware.zone";};
zone "00zzqcd.ns1.name" IN { type master; file "/var/named/bad/malware.zone";};
zone "01027846502.kt.io" IN { type master; file "/var/named/bad/malware.zone";};
zone "01030501519.kt.io" IN { type master; file "/var/named/bad/malware.zone";};
zone "01047506307.kt.io" IN { type master; file "/var/named/bad/malware.zone";};
Restart BIND 9 and the Sinkhole is now running:
su -
service named start