Latest blog posts

DB2TOP register and replay

DB2TOP is primarily a real-time monitoring tool, but many DBAs don't know that it can also run in replay mode against captured session information. So, the next time you have a big crisis with your database, you can capture all the data from db2top and do the analysis afterwards.

How do you capture the data? Launch db2top with the -C option to run in collection mode, press N to create a capture file, and use the -m option to gather data for n minutes.
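As a sketch of the two invocations (MYDB and the 10-minute window are placeholder values; by default the collected data goes to a file named after the collection session):

```
# Capture: run db2top against database MYDB in collection mode (-C),
# limiting the collection to 10 minutes (-m 10)
db2top -d MYDB -C -m 10

# Replay: open db2top later against the collected file (-f)
# and navigate the captured screens as if the crisis were live
db2top -d MYDB -f db2top.bin
```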

DB2 with BLU Acceleration for SAP

It is increasingly common to see SAP environments running on DB2.

Attached is a video covering the capabilities of DB2 with BLU Acceleration applied to SAP, along with all the SAP notes to consider.

Pay particular attention to the comparison with SAP HANA: with less complexity and a smaller investment, you get more performance.

Also attached is the IBM Redbook "Architecting and Deploying DB2 with BLU Acceleration".

 

How to Protect Your Data in 2019

The number and variety of threats to your business and personal data are always increasing. In 2019, attackers have become so sophisticated that it can be a struggle to stay ahead of them. As many as a third of organizations cannot protect their data from cybersecurity threats.
Read on to learn about some of the common data security threats, as well as a number of tools and practices that can help you protect your data in the event of a breach.

The Benefits of Software Composition Analysis


Only a few years ago, IT departments were able to handle vulnerabilities manually. Today, disruptive technologies and dynamic approaches to development have changed the digital landscape. Security is no longer the sole responsibility of the IT department. The borders of the security perimeter have blurred, obscuring visibility into digital spaces.

10 Tips for a Successful Cloud Migration


Migrating applications, data, and other workloads to the cloud is now commonplace among businesses of all sizes. A 2018 report on cloud computing trends revealed that 92 percent of companies now use public cloud services.
However, cloud migration remains a complex undertaking with several challenges, and getting it right is important. Read on to find out the challenges of cloud migration and ten tips for building a smart cloud migration plan.

Understanding Database Cloud Migration

Cloud computing is the new normal. Every time you use a service or an application over an internet connection, you are using the cloud. Since most companies now host part or all of their databases in the cloud, the question is how best to do it. Migrating your database to the cloud can help you manage your workload by making your data available and easily scalable. Read on to learn about database cloud migration and tips to do it right.

DB2 Write Suspend

When taking a snapshot from a storage array on a server with a running DB2 instance, there is no guarantee that the snapshot contains a consistent copy of the database.

To take a snapshot and still ensure a consistent copy, DB2 lets you put the database into "write suspend" mode: writes to disk are suspended and changes are kept in buffer pool memory. Queries continue to run, but writes are performed only in memory.
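A minimal sketch of the sequence (SAMPLE is a placeholder database name; the snapshot step itself depends on your storage array's tooling):

```
# Suspend disk writes so the storage snapshot is consistent
db2 connect to SAMPLE
db2 set write suspend for database

# ... trigger the storage-array snapshot here ...

# Resume normal disk writes once the snapshot completes
db2 set write resume for database
```

Keep the suspend window as short as possible, since all committed changes accumulate in memory until writes resume.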

Average disk access read/write time in DB2

Through DB2 we can obtain the average disk access time, in milliseconds, that the instance is experiencing. These times are crucial for detecting an I/O problem with a DB2 instance.

As a rule of thumb, a value around 2-3 ms is good, while more than 10 ms can indicate problems.

Avg ms/write:

select trunc(decimal(sum(pool_write_time)) /
             decimal(sum(pool_data_writes) + sum(pool_index_writes)), 3)
from sysibmadm.snaptbsp

 

Avg ms/read:
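The read-side query mirrors the write query above; a sketch against the same administrative view, assuming the standard snapshot columns for physical reads (pool_read_time, pool_data_p_reads, pool_index_p_reads):

```sql
select trunc(decimal(sum(pool_read_time)) /
             decimal(sum(pool_data_p_reads) + sum(pool_index_p_reads)), 3)
from sysibmadm.snaptbsp
```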

Using Machine Learning/AI to Boost the Supply Chain: 5 Use Cases

This article will discuss how supply chains are being improved through the use of innovative technologies before highlighting five uses of artificial intelligence and machine learning in supply chains.

When you finish reading, you’ll understand why many industry analysts have described A.I. technologies as disruptive innovations that have the potential to alter and improve operations across entire supply chains.

Open Source for Big Data: An Overview

This article will describe the relevance of open source software and big data before describing five interesting and useful open source big data tools and projects.

Big data workloads are those that involve the processing, storage, and analysis of large amounts of unstructured data to derive business value from that data. Traditional computing approaches and data processing software weren’t powerful enough to cope with big data, which typically inundates organizational IT systems on a daily basis.

The widespread adoption of big data analytics workloads over the past few years has been driven, in part, by the open source model, which has made frameworks, database programs, and other tools available to use and modify for those who want to delve into these workloads.

Defining the Difference between Deep Learning, Machine Learning, and Artificial Intelligence

This article will discuss three currently trending technology topics, namely deep learning, machine learning, and artificial intelligence.

According to research, 80 percent of enterprises are actively investing in AI technologies. The machine learning market has a projected value of $8.81 billion by 2022, while the deep learning market size is expected to reach $10.2 billion by 2025.

What is Storage Tiering and How Can it Reduce Storage Costs?

Tiered storage is a way of managing data by assigning it to different types of storage devices/media depending on the current value that the underlying information provides. The efficient management of data recognizes that all information provides an intrinsic value from the time it’s created to the time it becomes obsolete and that this value changes over the information lifecycle.

The typical factor determining the value of information is how frequently you access it; however, policy-based rules can factor in a number of other issues to determine information value. For example, old bank transactions, which might have a low value, could suddenly shift in value depending on special circumstances, such as a tax audit. This article discusses some pros, cons, and best practices for tiered storage.
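As an illustration of a frequency-of-access rule, a policy might be sketched like this (the tier names and day thresholds here are hypothetical, not from any particular product):

```python
from datetime import datetime, timedelta

def assign_tier(last_access, now, hot_days=30, warm_days=365):
    """Hypothetical policy: map data to a storage tier by access recency."""
    age = now - last_access
    if age <= timedelta(days=hot_days):
        return "hot"    # frequently accessed: SSD / fast tier
    if age <= timedelta(days=warm_days):
        return "warm"   # occasionally accessed: HDD tier
    return "cold"       # rarely accessed: archive tier

now = datetime(2019, 6, 1)
print(assign_tier(datetime(2019, 5, 20), now))  # hot
print(assign_tier(datetime(2017, 1, 1), now))   # cold
```

A real policy engine would combine recency with overrides for special circumstances, such as a legal hold or the tax audit mentioned above.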

Software Testing

Software testing is the process of evaluating a software item to detect differences between actual and expected output for a given input. It should start during the development process, as this reduces the chances of returning to the development phase due to unexpected errors or missing functionality.

It involves two steps:

  • Verification: the software behaves as expected.

  • Validation: the software meets the client's requirements.
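A minimal sketch of the input-versus-expected-output idea (add is a hypothetical function under test):

```python
def add(a, b):
    # hypothetical function under test
    return a + b

def test_add():
    # verification: compare actual output against expected output
    # for a given input
    assert add(2, 3) == 5, "unexpected output for input (2, 3)"

test_add()  # raises AssertionError if the behavior deviates
```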

The Importance of Big Data Disaster Recovery

Disaster recovery is a set of processes, techniques, and tools used to swiftly and smoothly recover vital IT infrastructure and data when an unforeseen event causes an outage.

The statistics tell the best story about the importance of disaster recovery—98 percent of organizations reported that a single hour of downtime costs over $100,000, while 81 percent indicated that an hour of downtime costs their business over $300,000.

Explainer videos

An explainer video is a short, engaging marketing video that highlights a company's services and solutions or explains the essential features of a product. Many companies have also started to deploy explainer videos as a key part of internal communication, using them for onboarding, training, and presentations.