This article applies especially to OLTP (On-Line Transaction Processing) database systems. The same approach can also be used with data warehouse systems in place of a database report tool.
OLTP systems can be brought to a crawl by report processing on the server. Not so good when the main purpose is to do transactions, not reports. Move the report processing to the analyst's computer and save the database server for doing transactions.
The secret is that most analysts already have good report tools that they understand and use every day. Spreadsheets, cheezy personal databases, and even statistical packages like SAS or SPSS are used by analysts all the time.
The only way to do reports from an OLTP database is to give the analyst the data from an SQL script in raw ASCII format. Pretty much a table dump, or an efficient join or two, wrapped in Perl (DBD-DBI) or a shell-wrapped SQL interpreter interface (sqlplus, dbaccess, etc.). Format the data as tab-delimited, CSV, or whatever common format the analyst and his tools can use. Also give the analyst the metadata for the data; an analyst cannot analyze in ignorance.

This makes for an efficient report on the OLTP server, because the analyst who wants the report does the data manipulation. The business analyst will then play with the data: move columns around, aggregate, combine, summarize, graph, whatever. The analyst will also come to understand the source of the data better, and study of the metadata and database schema can only help business processes.
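As a rough sketch of the Perl (DBD-DBI) route, the script below dumps the result of a simple two-table join as tab-delimited text with a header row. The DSN, login, table names, and column names are invented for illustration; swap in whatever matches your own schema and DBD driver.

    #!/usr/bin/perl
    # Minimal sketch: dump a simple join as tab-delimited text with a header
    # row. The DSN, login, table and column names are made up for illustration.
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Oracle:ORCL', 'report_user', 'secret',
                           { RaiseError => 1, AutoCommit => 0 });

    my $sth = $dbh->prepare(q{
        SELECT o.order_id, o.order_date, c.customer_name, o.total_amount
          FROM orders o
          JOIN customers c ON c.customer_id = o.customer_id
         WHERE o.order_date >= SYSDATE - 30
    });
    $sth->execute;

    # Header row so the analyst knows what each column is.
    print join("\t", @{ $sth->{NAME_lc} }), "\n";

    # One tab-delimited line per row; NULLs become empty strings.
    while (my @row = $sth->fetchrow_array) {
        print join("\t", map { defined $_ ? $_ : '' } @row), "\n";
    }

    $sth->finish;
    $dbh->disconnect;

Redirect the output to a file, hand it to the analyst along with a short data dictionary (column names, types, meanings) pulled from the schema, and the OLTP server is done with its part.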
The traditional report method has too much pain, i.e., some eater-breather directs a programmer to total a column in a report and right-justify it on the page. A couple of days later the analyst asks the programmer to left-justify the column because a Vice President can't fit it next to the funny cartoon in his PowerPoint. All of this is done in some crude, arcane report-writing language or in badly designed, over-complex, super-expensive Crystal Reports garbage. What a waste of resources, computer and human. And after all that monkey business the analyst will extract what they want and put it in Excel anyway if they want to do some analysis.
Any business analyst who cannot understand the metadata and manipulate the data they work with should be fired. However, competent analysts are thrilled to play with the data in the tools they use every day instead of waiting for a report to get modified using a mysterious tool on a black box of source data.
Letting the analyst use his own tools keeps the impact of report changes to a minimum. Why? Any change to a report is made by the analyst, because the report comes from the analyst's own tools. And chances are the base data (what I provide) changes far less often than the analysis the analyst does on it.
Fancy-schmancy query tools such as "Business Objects", "Crystal", etc. are not cheap and sometimes not too quick. They do have promise, and anyone with 4-5 extra staff and a large wad of cash can do something with them.
The expensive, resource-sucking database query tools basically do what is described above: move processing to the analyst's computer. Some even let the analyst use other tools to analyze the data and involve the analyst in the metadata. But many of these database tools are complex, with bad interfaces and limitations that make them unusable. A lot of them are targeted at data warehouses, not OLTP, and they can hurt database server performance on production OLTP systems. They look good in a ten-table demo schema. Buyer beware.