DBA

connor_mc_d

NULLs vs NOT NULLs and Performance

When it comes to giving the cost-based optimiser the best possible chance to make the “right” decisions, many DBAs are diligent in keeping statistics up to date, using histograms where appropriate, and creating more indexes (or removing surplus indexes).

However, one often neglected area is that the null-ness of columns also impacts the optimiser’s decisions. NULL and NOT NULL do more than just act as constraints; they also add to (or detract from) the value of indexes on those columns. Here’s an example of how the null-ness of a column impacts optimiser decisions. I have a table T which is a copy of DBA_OBJECTS, indexed on OBJECT_ID.
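A minimal sketch of the kind of demo involved (the table and index setup follow the post; the COUNT(*) queries and plan comments are illustrative, assuming default optimiser behaviour):

create table t as select * from dba_objects;
create index t_ix on t ( object_id );

-- As far as the optimiser knows, OBJECT_ID may contain nulls, and
-- nulls are not stored in a single-column b-tree index, so the
-- index alone cannot answer this query.
select count(*) from t;          -- full table scan

-- Declare that the column can never be null...
alter table t modify object_id not null;

-- ...and the index becomes a legitimate access path.
select count(*) from t;          -- index fast full scan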

connor_mc_d

Getting started…adding an account to use

If you’ve read my previous post about getting started with the Oracle database, then hopefully you now have your very own database installed and running, and you have explored a little with the sample schemas using SQL Developer.  Perhaps now you want to venture out into your own database development, and for that, you will want to create your own user account and create your own tables.  Here’s another video which will guide you through the process.
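As a rough sketch, the steps covered in the video presumably amount to something like the following (the account name, password and quota are placeholders, and the privileges shown are just a sensible starting set):

-- as a privileged user such as SYSTEM
create user demo identified by "SomePassword#123";
grant create session, create table to demo;
alter user demo quota 100m on users;

-- then connect as the new account and start creating objects
connect demo
create table my_first_table ( id number, name varchar2(50) );

Note that on a multitenant database you would typically create the account inside a pluggable database (a common user at the CDB level needs a C## prefix).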

connor_mc_d

Clone a table

Sometimes doing a CREATE TABLE AS SELECT is all we need to copy the data from an existing table.  But what if we want more than that?  What if we really want to clone that table to match the original as closely as possible?  We had a question along these lines on AskTOM today.  A standard CTAS copies the NOT NULL attributes and the data types, but not much else.  We know that Data Pump will take care of it, but that is more complex than a simple CTAS.

So here is a simple routine to wrap the Data Pump calls so that the clone can be achieved with just as simple a command.  A database link pointing back to the same database is all we need.
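The full routine is in the post; as an indication of the moving parts, a cut-down sketch of a table-mode Data Pump import over a loopback link might look like this (the link name, table names and error handling are simplified, and are mine rather than the post’s):

declare
  h     number;
  state varchar2(30);
begin
  -- a table-mode import that pulls everything over a loopback link
  h := dbms_datapump.open(
         operation   => 'IMPORT',
         job_mode    => 'TABLE',
         remote_link => 'LOOPBACK_LINK' );

  -- choose the source table and give the clone its new name
  dbms_datapump.metadata_filter( h, 'NAME_EXPR', q'[IN ('EMP')]' );
  dbms_datapump.metadata_remap( h, 'REMAP_TABLE', 'EMP', 'EMP_CLONE' );

  dbms_datapump.start_job( h );
  dbms_datapump.wait_for_job( h, state );
end;
/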

dbakevlar

The Future of the DBA, #C18LV, Video 1

I’m starting to move towards doing more videos and hope to improve my video skills (and maybe add a dance sequence, ya know, like the hip kids…).  Check out this post and please add comments, ask questions, or just tell me what you think.

Have an awesome Wednesday and no, don’t comment on my consistent need to make a strange face at the beginning of a video…

connor_mc_d

Licensed for Advanced Compression? Don’t forget the network

We often think of Advanced Compression as being exclusively about compressing data “at rest”, i.e., on some sort of storage device.  And don’t get me wrong, if we consider just that part of Advanced Compression, it still covers a myriad of opportunities that could yield benefits for your databases and database applications:

  • Heat maps
  • Automatic Data Optimization
  • XML, JSON and LOB compression (including de-duplication)
  • Compression on backups
  • Compression on Data Pump files
  • Additional compression options on indexes and tables
  • Compressed Flashback Data Archive storage
  • Storage snapshot compression

However, if you are licensed for the option, you can also take advantage of it to compress data on the network.
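For reference, network compression is switched on via sqlnet.ora on the client and the server.  A sketch of the relevant parameters (the values shown are illustrative; check the Net Services reference for your release):

# sqlnet.ora
SQLNET.COMPRESSION = on

# compression schemes this endpoint is willing to negotiate
SQLNET.COMPRESSION_LEVELS = (low, high)

# don't bother compressing payloads smaller than this (bytes)
SQLNET.COMPRESSION_THRESHOLD = 1024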

dbakevlar

The 12 Days of Database Christmas

My brain has a tendency to wake up way before everything else in the house, so I try to keep it occupied as best I’m able without disturbing anyone.  This may explain why so many of my plans are well fleshed out, as I have a tendency to hash them out in the early morning hours as a way of letting the rest of the household sleep.  This morning, on the eve of Christmas, I may have let my brain offer a Database Administrator twist to an old Christmas favorite…

connor_mc_d

iASH – my “infinite ASH” routine

I love Active Session History (ASH) data because a lot of the work I’ve done in my consulting life was “after the fact” diagnosis.  By this I mean that many of us have been in a similar circumstance, where you hear about a problem not while it is occurring, but only when you contact the customer for some other, potentially unrelated, reason.  At which point you will hear that dreaded sentence:

“Yeah, the Order Entry screen was really slow a couple of hours ago…”

And this is where ASH is an awesome resource.  With the ASH data available, there is a good chance you will be able to diagnose the issue without having to make an embarrassing request for the customer to repeat the task so that you can trace the underlying database activity.  Because no-one likes to be the person that says:

“Yeah that performance must have really sucked for you … Hey, let’s do it again!”
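The essence of a routine like iASH is to persist the circular ASH buffer before the samples age out.  A minimal sketch of the idea (the object names are mine, not the post’s; the real routine would deal with duplicates, restarts and housekeeping, and querying ASH requires the Diagnostics Pack licence):

-- a permanent home for the samples
create table ash_history as
  select * from v$active_session_history where 1 = 0;

-- copy across anything newer than what we already hold
create or replace procedure capture_ash as
begin
  insert into ash_history
  select a.*
  from   v$active_session_history a
  where  a.sample_time > ( select coalesce( max(h.sample_time),
                                            timestamp '2000-01-01 00:00:00' )
                           from   ash_history h );
  commit;
end;
/

-- and run it often enough to beat the buffer wrap-around
begin
  dbms_scheduler.create_job(
    job_name        => 'CAPTURE_ASH_JOB',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'CAPTURE_ASH',
    repeat_interval => 'freq=minutely;interval=5',
    enabled         => true );
end;
/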

dbakevlar

Shooting the DBA isn’t a Silver Bullet to the Cloud

We’ve all watched numerous companies see value in bypassing the Database Administrator and other critical IT roles in an effort to get IT to the cloud faster.  It may look incredibly attractive to sales, but the truth of it is, it can be like setting up land mines in your own yard.

Franck Pachot

Am I a DBA 3.0 or just an SQL*DBA?

There are currently a lot of new buzzwords and renamings suggesting that our DBA role is changing, most of them accompanied by a #cloud hashtag. Oracle Technology Network is now called Oracle Developer Community. Larry Ellison has announced a database that does not need to be operated by humans. And people are talking about the death of the DBA, the future of the DBA, DBA 3.0…

connor_mc_d

Interval partitioning just got better

Interval partitioning was a great feature when it arrived in version 11, because we no longer had to worry so much about ensuring partitions were available for new data when it arrived.  Partitions would just be created on the fly as required.  I’m not going to talk about interval partitioning in detail because there’s plenty of good content already out there.  But one key element of interval partitioning is that the intervals have to start from somewhere, which is why you always have to define a table with at least one partition.
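For reference, that mandatory starting partition looks like this (the table and column names are illustrative):

create table sales
( sale_id   number,
  sale_date date )
partition by range ( sale_date )
interval ( numtoyminterval( 1, 'MONTH' ) )
-- the intervals have to start somewhere, hence at least
-- one explicitly defined partition
( partition p_start values less than ( date '2018-01-01' ) );

-- rows beyond the boundary create monthly partitions on the fly
insert into sales values ( 1, date '2018-07-14' );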

 
