Thursday, December 13, 2007

How to Do Database Logging/Monitoring "Right"?

So, people sometimes ask me how to do database logging/auditing/monitoring and log analysis right. The key choice many seem to struggle with for database auditing and monitoring is reviewing database logs vs. sniffing SQL traffic off the wire. Before proceeding, please look for more background on database log management, auditing and monitoring in my database log management papers (a longer, more detailed one and a shorter one). The comparison below summarizes the situation with database monitoring and auditing - now you can make your choice more intelligently (items in bold are the ones I consider key):

Sniff SQL traffic off the wire

  Pros:
  • No database performance impact
  • Awareness of returned content (for SELECTs)
  • Guaranteed role separation
  • Better for DBA monitoring
  • No agents
  • No database configuration changes

  Cons:
  • Extra device needs to be purchased, deployed and managed
  • Doesn't work with encrypted traffic
  • No local access monitoring

Collect and analyze database logs

  Pros:
  • No extra $$$ - use your existing logging tool
  • Can review user activity across log sources, from databases to servers
  • Satisfies the compliance demand for "database log review"
  • Can monitor ALL access to data in the database, even over APIs and locally

  Cons:
  • Performance impact possible (*)
  • Database configuration changes needed
  • Usually not truly "real-time" (polling)

Choose logs if you care about their Pros (especially the key ones); choose sniffing if you care about its Pros and are NOT undermined by its Cons (e.g., lack of support for encrypted traffic).
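To make the "collect and analyze database logs" route more concrete, here is a minimal sketch in Python that periodically polls an audit table and forwards new rows to syslog. The audit_trail table, its columns and the get_connection() helper are hypothetical placeholders for whatever your database and log management tool actually provide; the commented-out polling loop also shows why this approach is usually not truly "real-time".

    # Minimal sketch: poll a database audit trail and forward new entries to syslog.
    # The audit_trail table, its columns and get_connection() are hypothetical --
    # substitute the audit views/tables and driver of your particular database.
    import logging
    import logging.handlers
    import time

    log = logging.getLogger("db-audit")
    log.setLevel(logging.INFO)
    log.addHandler(logging.handlers.SysLogHandler(address=("loghost.example.com", 514)))

    def poll_audit_trail(conn, last_seen):
        """Fetch audit rows newer than last_seen and forward each one to syslog."""
        cur = conn.cursor()
        # Bind-variable style ("?") varies by database driver (qmark, named, numeric...).
        cur.execute("SELECT event_time, db_user, action, object_name "
                    "FROM audit_trail WHERE event_time > ? ORDER BY event_time",
                    (last_seen,))
        for event_time, db_user, action, object_name in cur.fetchall():
            log.info("db_audit time=%s user=%s action=%s object=%s",
                     event_time, db_user, action, object_name)
            last_seen = event_time
        return last_seen

    # Usage sketch -- polling on a timer is exactly why log collection is rarely real-time:
    # conn = get_connection()          # hypothetical; use your database driver here
    # last_seen = some_starting_timestamp
    # while True:
    #     last_seen = poll_audit_trail(conn, last_seen)
    #     time.sleep(60)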

Comments? Additions? Concerns?

(*) Nobody really knows what the performance impact will be in any particular situation: anywhere from 0% to 40% has been observed under various conditions by various people ...



UPDATE: Rich adds his option #3, but I am skeptical since it is not very sexy. Dedicated agents on each database just aren't that exciting...

4 comments:

Anonymous said...

Anton,

First, let me thank you for bringing the topic of database monitoring and protection to everyone's attention. Security professionals are starting to realize that data and information should be protected (e.g. http://www.slideshare.net/anton_chuvakin/interop-2007-keynote-teaser ).

For the sake of full disclosure, I would like to mention that I am responsible for product strategy at a database and application security company (see http://www.imperva.com ) that provides a complete Application Data Security and Compliance solution.

While I do not want to argue with your comments (for example, some network-based solutions, such as ours, can parse encrypted SQL traffic in real time), I do want to add a few comments:

1. DB logging alone is not sufficient, as it lacks application context. For example, most enterprise applications (SAP, Oracle EBS, etc.) use an internal user to query the database. As a result, the logs are filled with the "sapdb" user instead of the user's identity. BTW, some of the companies that provide logging-only or local-agent solutions are now adding agents to be installed on application servers to solve this challenge (one common workaround is sketched in code below).

2. Prevention and selective enforcement. I think you wrote that prevention will never supplant detection. I concur. Detection is a mandatory requirement for prevention. Combining the two in real time typically requires a network-based solution.
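To illustrate (1): a common workaround is to have the application stamp each pooled connection with the real end-user identity before it runs queries, so the native audit trail records that identity rather than the shared service account. A minimal sketch, assuming an Oracle backend (DBMS_SESSION.SET_IDENTIFIER) and a hypothetical connection-pool API:

    # Minimal sketch: tag a pooled database session with the real application user so
    # native audit records show the end user, not the shared "sapdb"-style account.
    # Assumes an Oracle backend; the pool object and its acquire()/release() calls are
    # hypothetical placeholders for whatever connection pool the application uses.

    def run_as_app_user(pool, app_user, sql, params=None):
        """Borrow a pooled connection, stamp it with the end-user identity, run a query."""
        conn = pool.acquire()                  # hypothetical pool API
        cur = conn.cursor()
        try:
            # Oracle records the client identifier in the audit trail (and V$SESSION).
            cur.execute("BEGIN DBMS_SESSION.SET_IDENTIFIER(:ident); END;",
                        {"ident": app_user})
            cur.execute(sql, params or [])
            return cur.fetchall()
        finally:
            # Clear the identifier before the connection goes back into the shared pool.
            cur.execute("BEGIN DBMS_SESSION.CLEAR_IDENTIFIER; END;")
            pool.release(conn)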

Last, I would ask you to think about the hybrid approach: local, low-profile agents to deal with dead spots plus network-based gateways and servers, combined with log management for risk mitigation.

10x, Sharon

Anonymous said...

Anton - thanks for surfacing this topic. It's refreshing to see an unbiased opinion on the logs-versus-sniffer debate. The fact remains: the only way to capture an accurate, defensible, stand-up-in-a-court-of-law representation of "who did what" is to combine part native audit, part log reading, part sniffer, and of course, as Sharon suggested, traceability back to the "named" user. (Lumigent does not require an appserver agent to capture app context.)

Mike Spiers / Compliance/Audit/Security Solutions / Lumigent Technologies

Anonymous said...

Anton,

I'd also like to thank you for bringing this topic to the forefront.

Like Sharon and Mike, I also have a vested interest. I am the Co-Founder of SoftTree Technologies, developers of DB Audit Expert, www.softtree.com.

The reality is that database auditing and compliance reporting is a relatively recent information security requirement, one that has been necessitated by regulatory mandates such as SOX, HIPAA and PCI. Let's not lose sight of the objective. If the interest in the technology is regulatory compliance, the performance impact associated with utilizing a database's native auditing features is negligible. The key is properly configuring the auditing package to capture only the events necessary to report on issues germane to regulatory compliance (one of these checks is sketched in code after the list). Specifically:

1. Recently created, deleted, or modified users and logins
2. Inactive users with active accounts
3. Users with expired passwords
4. Users with non-expiring passwords
5. Users having administrative privileges
6. Recent administrator logins
7. Recent privileged operations
8. Recent granted and revoked privileges
9. Data changes by privileged users
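For instance, check 4 above (users with non-expiring passwords) can be produced with a simple query against the database's own catalog. A minimal sketch in Python, assuming the Oracle data dictionary view DBA_USERS and a hypothetical get_connection() helper:

    # Minimal sketch: report accounts whose passwords never expire (check 4 above).
    # Assumes the Oracle data dictionary view DBA_USERS; other databases expose
    # similar catalog views under different names. get_connection() is hypothetical.

    def users_with_nonexpiring_passwords(conn):
        """Return (username, account_status) for open accounts with no password expiry."""
        cur = conn.cursor()
        cur.execute("SELECT username, account_status FROM dba_users "
                    "WHERE expiry_date IS NULL AND account_status = 'OPEN'")
        return cur.fetchall()

    # conn = get_connection()    # hypothetical; use your database driver of choice
    # for username, status in users_with_nonexpiring_passwords(conn):
    #     print("non-expiring password:", username, status)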

Unfortunately, the choice between server-side and network-based solutions is often predicated upon inaccurate information, vendor hyperbole, and technical bias.

I also agree with Mike Spiers, "the only way to capture an accurate, defensible, standup in a court of law representation of "who did what" - is to combine part native audit - part log reading - part sniffer, and of course ... traceability back to the "named" user." This is why SoftTree has licensed technology to several players in the space, including Tizor Systems. There are compelling reasons to use all three auditing methods.

As for mapping real user names to application context, it is not difficult. Lumigent and SoftTree have been able to do this for a while now.

Ultimately the right choice comes down to organizational objectives and resource constraints. Many companies simply don't have the resources to properly utilize the feature set inherent in products like the Imperva, Tizor or Guardium appliances. Such organizations want cost-effective, easy-to-deploy, install-and-forget solutions to their database auditing and compliance reporting problem. That's where SoftTree shines!

Anton Chuvakin said...

Wow, thanks for the enlightening comments. It looks like everybody favors the "hybrid" approach of looking at both traffic and logs (although with an agent, not agentless remote log collection) and understands the limitations of the other approaches!

Dr Anton Chuvakin