Evolution of the importance of Dataprix, according to Alexa


Today we checked our position in the Alexa ranking, and we were pleasantly surprised to find that this March we are ranked at, or even above, portals such as TDWI or BeyeNETWORK, which are undisputed global benchmarks.

Pentaho BI Server 3.5.2 stable


A few minutes ago the stable version of Pentaho BI Server 3.5.2 was published.

You can download it at:

I have not reviewed this version yet, but I did some beta testing earlier and found no major changes in the UI, just some library updates, so I will not elaborate too much.

As soon as I get more information, I will post the relevant comments.


Twitter will migrate from MySQL to Cassandra DB


Cassandra is an open source distributed database, one of the noteworthy projects of the Apache Software Foundation, and it appears to be hitting hard. Being distributed, it enables high availability, fault tolerance and, above all, high scalability without loss of performance.

It is already being used by companies that handle large amounts of data, such as Rackspace, Digg or Facebook, and the list keeps growing.

The latest news, coming from an interview with Ryan King on the MyNoSQL blog, is that Twitter is considering migrating its MySQL server clusters to Cassandra.

Below is a product presentation, which can also be found on the Cassandra Project website.

Update SQL Server table statistics dynamically throughout a database


In Oracle databases there is a data dictionary view that lists all the tables in the database (the 'dba_tables' view), and we can use 'dba_tables' to create maintenance scripts dynamically. In SQL Server we can also create table maintenance scripts by querying the table [dataBase].dbo.sysobjects.

In the example below we use a T-SQL script to update statistics for all tables in a SQL Server database by dynamically querying the data dictionary (using the table dbo.sysobjects). This T-SQL code can be encapsulated in a stored procedure, or in a job executed by the SQL Server Agent, to automatically keep statistics updated on all tables of the dbo schema in a SQL Server database.
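The full script is not reproduced in this excerpt; a minimal sketch of the approach, assuming a cursor over dbo.sysobjects filtered to user tables (xtype = 'U') and tables in the dbo schema, might look like this:

```sql
-- Sketch: update statistics on every user table of the dbo schema
DECLARE @tableName sysname;

DECLARE tableCursor CURSOR FOR
    SELECT name
    FROM dbo.sysobjects
    WHERE xtype = 'U';          -- 'U' = user table

OPEN tableCursor;
FETCH NEXT FROM tableCursor INTO @tableName;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build and run the maintenance command for the current table
    EXEC ('UPDATE STATISTICS dbo.[' + @tableName + ']');
    FETCH NEXT FROM tableCursor INTO @tableName;
END;

CLOSE tableCursor;
DEALLOCATE tableCursor;
```

The same cursor pattern works for other per-table maintenance commands (DBCC, index rebuilds) by changing the string passed to EXEC.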

The first dashboard applications compatible with Apple iPad


As expected, the first BI tools adapted to Apple's new iPad are appearing. Nothing like a dashboard application to exploit the multitouch screen possibilities of this device.

Prelytis is the company that has developed Prelytis LiveDashBoard, the first Business Intelligence software compatible with the new Apple tablet. It is a dashboarding tool, 2.0-oriented, with collaborative features, and remarkable for its efforts in adapting to mobile devices.

SAP joins the 2.0 trend with a BPM product integrated with Google Wave


At SAP TechEd in Vienna, SAP presented a prototype of Gravity, a BPM solution that works on the collaborative environment provided by Google Wave.

This product allows business processes to be designed collaboratively, building on the communication facilities provided by the Google Wave environment. Obviously it works on the web, and it can also be used from mobile devices such as an iPhone.

This video shows a simulated merger of two companies in which many business processes must be redefined at the highest level.





Dataprix changes its logo


As many of you will have noticed, we have changed the Dataprix logo.

Although the symbol of the head with the database in the previous logo reflected fairly well the spirit of Dataprix, and the relationship between data and knowledge, we believed it necessary to create a more 'professional' logo, which will be the basis of the Dataprix corporate image.

Our primary corporate color is still blue, so the change is hardly noticeable on the web. We have changed the symbol to a 3D, D-shaped database, to keep the reference to data. We lose the reference to knowledge that the figure of the blue head gave us, but you can't have everything in life ;).

In addition, we've heard from somewhere that it may be better to simplify the meaning of a symbol to convey a clearer image. For people from less technological areas, who cannot be expected to recognize the symbol of a database, the old logo could also be difficult to identify. On occasion someone has even told me that, on seeing the Dataprix logo, it seemed it might have something to do with a headache, or a Bayer aspirin...

Oracle 10g: Possible optimization in massive data loads

In batch processes that perform a massive data load into the same table, using an INSERT or UPDATE per record within a block, the process can be optimized through the use of bind variables instead of literal parameters (if the client supports them, for example via ODBC).
Recall the steps Oracle takes to process a query:
1) Syntactic validation
2) Semantic validation
3) Optimization
4) Generation of the QEP (Query Execution Plan)
5) Execution of the QEP (Query Execution Plan)
Statements can receive their parameters by value (where salary > 1000) or, once the statement is compiled, through bind variables (where salary > :b1). The advantage of the second option is that Oracle compiles the statement only once and reuses the compiled code for each of the parameter values.
But we must be careful because, in the latter case, Oracle cannot calculate the selectivity of the query and instead applies a default selectivity (associated with each type of operation), which can lead to wrong decisions by the optimizer.
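As a sketch, the two forms might look like this in SQL*Plus (table and column names are illustrative):

```sql
-- Literal value: Oracle hard-parses a distinct statement for each value
SELECT * FROM employees WHERE salary > 1000;

-- Bind variable: the statement is compiled once and the cursor is
-- reused for every value assigned to :b1
VARIABLE b1 NUMBER
EXEC :b1 := 1000
SELECT * FROM employees WHERE salary > :b1;
```

In a batch loop, only the second form avoids re-parsing the statement on every iteration.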

Easily export data from Oracle to flat file


A simple way to export the data of a query, table, etc. from an Oracle database to a flat file is to use the SPOOL command in SQL*Plus. This way we do not need to rely on visual tools, which are not always available or do not always work as we want. We can also use Oracle's formatting functions in the SELECT statement itself, so the data is generated already in the format we need.

If, for example, we want to retrieve some data from all records in a customers table, sorted by registration date, we simply open a SQL*Plus session and run this series of commands:

SQL> SET HEADING OFF
SQL> SPOOL C:\datos_de_clientes.txt
SQL> SELECT 'Cliente ' || CLI_NOMBRE || ', ' || CLI_NIF || '. Fecha alta: ' || TO_CHAR(CLI_FECHAALTA,'YYYY-MM-DD')
  2  FROM CLIENTES ORDER BY CLI_FECHAALTA;
SQL> SPOOL OFF

The first lines hide the headers that contain the field names, which do not concern us because we only want the data. SPOOL directs the query output to the file 'datos_de_clientes.txt' on the C drive of the local machine.

The Time Dimension structure and Loading Procedure for MySQL


This post is based on this one by il-masacratore: Time: Dimension structure and loading script for SQLServer.

As il-masacratore says, there are usually a number of dimensions that are common to all data warehouses. The Time dimension is one of them.
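The full structure and loading procedure are not reproduced in this excerpt; as a rough sketch under assumed names, a Time dimension in MySQL might be declared and loaded along these lines:

```sql
-- Hypothetical Time dimension table; all column names are illustrative
CREATE TABLE dim_time (
    time_id      INT UNSIGNED NOT NULL PRIMARY KEY,  -- e.g. 20100301 (YYYYMMDD)
    full_date    DATE     NOT NULL,
    day_of_month TINYINT  NOT NULL,
    month_num    TINYINT  NOT NULL,
    quarter      TINYINT  NOT NULL,
    year_num     SMALLINT NOT NULL
);

-- Loading procedure: insert one row per day between two dates
DELIMITER //
CREATE PROCEDURE load_dim_time(p_start DATE, p_end DATE)
BEGIN
    DECLARE d DATE DEFAULT p_start;
    WHILE d <= p_end DO
        INSERT INTO dim_time
        VALUES (DATE_FORMAT(d, '%Y%m%d'), d, DAY(d), MONTH(d), QUARTER(d), YEAR(d));
        SET d = DATE_ADD(d, INTERVAL 1 DAY);
    END WHILE;
END //
DELIMITER ;
```

Extra attributes (month name, day of week, holiday flags) can be added as further columns derived from the same date value.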
