Search Results

Search found 1024 results on 41 pages for 'dba'.

Page 24/41 | < Previous Page | 20 21 22 23 24 25 26 27 28 29 30 31  | Next Page >

  • Resetting Your Oracle User Password with SQL Developer

    - by thatjeffsmith
    There’s nothing more annoying than having to email, call, or log a support ticket to have one of your accounts reset. This is no less annoying in the Oracle database. Those pesky security folks have determined that your password should only be valid for X days, and your time is up. Time to reset the password! Except…you can’t log into the database to reset your password. What now? Wait a second, look at this nifty thing I see in SQL Developer: right-click on my connection, and Reset Password is not available! Why not?

    The JDBC Driver Doesn’t Support This Operation. We can’t make this call over the Oracle JDBC layer, because it hasn’t been implemented. However, our primary interface, OCI, does indeed support this. In order to use the Oracle Call Interface (OCI), you need to have an Oracle Client on your machine. The good news is that this is fairly easy to get going. The Instant Client will do. You have two options, the full or ‘Lite’ Instant Client. If you want SQL*Plus and the other client tools, go for the full. If you just want the basic drivers, go for the Lite. Either of these is fine, but mind the bit level and version of Oracle! Make sure you get a 32-bit Instant Client if you run 32-bit SQL Developer, or a 64-bit one if you run 64-bit. Here’s the download link. What, you didn’t believe me? Mind the version of Oracle too! You want to be at the same level as or higher than the database you’re working with. You can use an 11.2.0.3 client with an 11.2.0.1 database, but not a 10gR2 client with an 11gR2 database. Clear as mud?

    Download and Extract. Put it where you want; Program Files is as good a place as any if you have the rights. When you’re done, copy the directory path you extracted the archive to, because we’re going to add it to your Windows PATH environment variable. The easiest way to find this in Windows 7 is to open the Start dialog and type ‘path’. In Windows 8 you’ll cast your spell and wave at your screen until something happens. I recommend you put it up front so we find our DLLs first. Now with that set, let’s start up SQL Developer.

    Check the Connection Context Menu Again. Bingo! What happened there? SQL Developer looks to see if it can find the OCI resources. Guess where it looks? That’s right, the PATH. If it finds what it’s looking for, and confirms the bit level is right, it will activate the Reset Password option. We have a Preference to ‘force’ an OCI/THICK connection that gives you a few other edge-case features, but you do not need to enable this to activate Reset Password. Not necessary, but it won’t hurt anything either. There are a few actual benefits to using OCI-powered connections, but that’s beyond the scope of today’s blog post…to be continued. OK, so we’re ready to go. Now, where was I again? Oh yeah, my password has expired… Right-click on your connection and now choose ‘Reset Password’. You’ll need to know your existing password and select a new one that meets your database’s security standards.

    I Need Another Option, This Ain’t Working! If you have another account in the database, you can use the DBA Panel to reset a user’s password, or of course you can spark up a SQL*Plus session and issue the ALTER USER JEFF IDENTIFIED BY _________; command, but you knew this already, yes?

    I need more help ‘installing’ the Instant Client, help! There are lots and lots of resources out there on this subject. But I also know from personal experience that many of you have problems getting this to ‘work.’ The key things to remember are to download the right bit level AND make sure the client install directory is in your PATH. I know many folks will just ‘install’ the Instant Client directly into one of their ‘bin’-type directories. You can do that if you want, but I prefer the cleaner method. Of course, if you lack admin privs to change the PATH variable, that might be your only option. Or you could do what the original ORA- message indicated and ‘contact your DBA.’
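    For reference, here is a minimal sketch of that SQL*Plus fallback, run from a second account that has the ALTER USER privilege; the user name and new password below are placeholders, not values from the post.

    -- Check whether the account is merely expired, locked, or both (placeholder user name)
    SELECT username, account_status, expiry_date FROM dba_users WHERE username = 'JEFF';

    -- Reset the password and, if needed, unlock the account
    ALTER USER jeff IDENTIFIED BY NewPassword123;
    ALTER USER jeff ACCOUNT UNLOCK;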

    Read the article

  • Create Device to Receive SMS and Parse to Text (SMS Gateway)

    - by Chris Okyen
    I want to use a server as a device to run a script that parses an SMS text in the following way. I. The person types in a specific and special cell phone number (similar to Facebook’s 32556 number used to post on your wall). II. The user types a text message. III. The user sends the text message. IV. The message is sent to some kind of device (the server) or SMS gateway, which receives it. V. That device then parses the text message. I understand that these three questions mix programming and server topics and could reside here or at DBA.SE. How would I make such a cell phone number (described in step I) that routes messages to the device? How do I create the device that receives them? Finally, how do I parse the text message?

    Read the article

  • Redehost Transforms Cloud & Hosting Services with MySQL Enterprise Edition

    - by Mat Keep
    Redehost is one of Brazil's largest cloud computing and web hosting providers, with more than 60,000 customers and 52,000 web sites running on its infrastructure. As the company grew, Redehost needed to automate operations such as system monitoring, making the operations team more proactive in solving problems. Redehost also sought to improve server uptime, robustness, and availability, especially during backup windows, when performance would often dip. To address the needs of the business, Redehost migrated from the community edition of MySQL to MySQL Enterprise Edition, which has delivered a host of benefits:
    - Proactive database management and monitoring using MySQL Enterprise Monitor, enabling Redehost to fulfil customer SLAs. Using the Query Analyzer, Redehost was able to identify slow queries more rapidly, improving customer support.
    - Quadrupled backup speed with MySQL Enterprise Backup, leading to faster data recovery and improved system availability.
    - Reduced DBA overhead by 50% due to the improved support capabilities offered by MySQL Enterprise Edition.
    - Enabled infrastructure consolidation, avoiding unnecessary energy costs and premature hardware acquisition.
    You can learn more from the full Redehost Case Study. Also, take a look at the recently updated MySQL in the Cloud whitepaper for the latest developments that are making it even simpler and more efficient to develop and deploy new services with MySQL in the cloud.

    Read the article

  • PASS Summit 2011 - Part III

    - by Tara Kizer
    Well, we’re about a month past PASS Summit 2011, and yet I haven’t finished blogging my notes! Between work and home life, I haven’t been able to come up for air in a bit.  Now on to my notes…

    On Thursday of the PASS Summit 2011, I attended Klaus Aschenbrenner’s (blog|twitter) “Advanced SQL Server 2008 Troubleshooting”, Joe Webb’s (blog|twitter) “SQL Server Locking & Blocking Made Simple”, Kalen Delaney’s (blog|twitter) “What Happened? Exploring the Plan Cache”, and Paul Randal’s (blog|twitter) “More DBA Mythbusters”.  I think my head grew two times in size from the Thursday sessions.  Just WOW!

    I took a ton of notes in Klaus' session.  He took a deep dive into how to troubleshoot performance problems.  Here is how he goes about solving a performance problem: start by checking the wait stats DMV, then system health, memory issues, and I/O issues.  I normally start with blocking and then hit the wait stats.  Here’s the wait stat query (Paul Randal’s) that I use when working on a performance problem.  He highlighted a few waits to be aware of, such as WRITELOG (indicates an I/O subsystem problem), SOS_SCHEDULER_YIELD (indicates a CPU problem), and PAGEIOLATCH_XX (indicates an I/O subsystem problem or a buffer pool problem).  Regarding memory issues, Klaus recommended that, as a bare minimum, one should set “max server memory (MB)” in sp_configure so that 2 GB or 10% is reserved for the OS (whichever comes first).  This is just a starting point though!  Regarding I/O issues, Klaus talked about disk partition alignment, which can improve SQL I/O performance by up to 100%.  You should use a 64 KB NTFS cluster size; alignment is automatic in Windows 2008 R2.

    Joe’s locking and blocking presentation was a good session to really clear up the fog in my mind about locking.  One takeaway that I had no idea could be done was that you can set a timeout in T-SQL code via LOCK_TIMEOUT.  If you do this via the application, you should trap error 1222.

    Kalen’s session went into execution plans.  The minimum size of a plan is 24 KB.  This adds up fast, especially if you have a lot of plans that don’t get reused much.  You can use sys.dm_exec_cached_plans to check how often a plan is being reused by checking the usecounts column.  She said that we can use DBCC FLUSHPROCINDB to clear out the stored procedure cache for a specific database.  I didn’t know we had this available, so this was great to hear.  This will be less intrusive than running DBCC FREEPROCCACHE when an emergency comes up.  Kalen said one should enable “optimize for ad hoc workloads” if you have an ad hoc workload.  This stores only a 300-byte stub of the first plan, and if it gets run again, it’ll store the whole thing.  This helps with plan cache bloat.  I have a lot of systems that use prepared statements, and Kalen says we can simulate those calls by using sp_executesql.  Cool!

    Paul did a series of posts last year to debunk various myths and misconceptions around SQL Server.  He continues to debunk things via “DBA Mythbusters”.  You can get a PDF of a bunch of these here.  One of the myths he went over is the number of tempdb data files that you should have.  Back in 2000, the recommendation was to have as many tempdb data files as there are CPU cores on your server.  This no longer holds true due to the numerous cores we have on our servers.  Paul says you should start out with 1/4 to 1/2 the number of cores and work your way up from there.  BUT!  Paul likes what Bob Ward (twitter) says on this topic: with 8 or fewer cores, set the number of files equal to the number of cores; with more than 8 cores, start with 8 files and increase in blocks of 4.  One common myth out there is to set your MAXDOP to 1 for an OLTP workload with high CXPACKET waits.  Instead of that, dig deeper first: look for missing indexes and out-of-date statistics, increase the “cost threshold for parallelism” setting, and perhaps set MAXDOP at the query level.  Paul stressed that you should not plan a backup strategy but instead plan a restore strategy.  What are your recoverability requirements?  Once you know that, now plan out your backups.

    As Paul always does, he talked about DBCC CHECKDB.  He said how fabulous it is.  I didn’t want to interrupt the presentation, so after his session had ended, I asked Paul about the need to run DBCC CHECKDB on your mirror systems.  You could have data corruption occur at the mirror and not at the principal server.  If you aren’t checking for data corruption on your mirror systems, you could be failing over to a corrupt database in the case of a disaster or even a planned failover.  You can’t run DBCC CHECKDB against the mirrored database, but you can run it against a snapshot of the mirrored database.
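    As a rough sketch of that last tip, assuming a mirrored database named MirrorDB (the snapshot name, logical file name, and path below are placeholders, not anything from the session):

    -- Create a database snapshot of the mirrored (restoring) database.
    -- NAME must match the logical data file name of MirrorDB.
    CREATE DATABASE MirrorDB_Check ON
        (NAME = MirrorDB_Data, FILENAME = 'D:\Snapshots\MirrorDB_Check.ss')
    AS SNAPSHOT OF MirrorDB;

    -- Run consistency checks against the snapshot, then drop it.
    DBCC CHECKDB ('MirrorDB_Check') WITH NO_INFOMSGS;
    DROP DATABASE MirrorDB_Check;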

    Read the article

  • Mythbusters - SQL Edition

    - by AjarnMark
    I love the Mythbusters television show.  That has to be one of the coolest jobs in the world…it involves investigation, problem solving, science, trial & error, searching for the truth, robotics and remote controls, and in the end, you usually get to blow stuff up.  How great is that?!  I know I’ll never forget the episode where they blew up a cement truck.  That was truly awesome. Well, perhaps not quite made for TV, but pretty cool nonetheless, Paul Randal (@PaulRandal) has been doing some SQL Server myth busting here in the month of April with his DBA Myth a Day series.  It starts with In-Flight Transactions Continue After a Failover.  Check it out!

    Read the article

  • T-SQL Tuesday - IO capacity planning

    - by Michael Zilberstein
    This post is my contribution to Adam Machanic's T-SQL Tuesday #004, hosted this time by Mike Walsh. Being an application DBA, I usually don't take part in discussions about which storage to buy or how to configure it. My interaction with I/O is usually via PerfMon. When somebody calls me asking why everything is suddenly so slow on a database server, the "disk queue length" or "average seconds per transfer" counters provide an overwhelming answer in 60-70% of such cases. Sometimes it can be...(read more)

    Read the article

  • I'm Seeing Red

    - by Grant Fritchey
    Hello World! My move into the world of Red Gate is more and more complete with my shiny, new, red blog. The goal of this blog is not to compete with, or replace, my blog over at ScaryDBA. Instead, this blog is where I can share things I find out about Red Gate products and services. I can talk about the things that we're doing at Red Gate. I can talk about the things I'm doing at Red Gate. In short, this is my Red Gate blog. I'm still the Scary DBA, but over here, I'm painted bright red (and no, I was promised that no pictures were taken of that process). So look for tips and suggestions about Red Gate products, methods to help you do your job better using one of our tools, and anything else I can think of or comment on that supports you and our excellent software.

    Read the article

  • DBaaS Online Forum - Now available on-demand

    - by Javier Puerta
    The Database-as-a-Service Online Forum was originally broadcast on Monday, October 21, 2013, at a time suited to US time zones. All the content of the forum is now available on-demand for customers and partners to watch and listen to. The content is available on demand here. Watch the on-demand forum to hear from analysts and experts on how companies are beginning to transform with Database as a Service, and learn the prescriptive steps your organization can take to design, deploy, and deliver Database as a Service today.
    Agenda
    Keynote: Carl Olofson, Research VP, IDC; Juan Loaiza, Senior Vice President, Oracle Systems Technology; Todd Kimbriel, Director, State of Texas, eGovernment Division; Eric Zonneveld, Oracle Architect, KPN; James Anthony, Technology Director, e-DBA
    Breakout 1: Design DBaaS - Alan Levine, Senior Director, Oracle Enterprise Architects
    Breakout 2: Deploy DBaaS - Michael Timpanaro-Perrotta, Director of Product Management, Oracle
    Breakout 3: Deliver DBaaS - Sudip Datta, Vice President of Product Management, Oracle
    Closing Session: Michelle Malcher, IOUG President; Juan Loaiza, Senior Vice President, Oracle Systems Technology

    Read the article

  • The Basics of Project Management / Software Development

    - by Sam
    It suddenly struck me today that I have never developed any large application or worked with a team of programmers, and so am missing out on a lot, both in terms of technical knowledge and the social, fun part of it. I would like to rectify that; one idea is to start an open source group by training college students (for no charge) and developing some open source application with them. Please give me some basic advice on the whole process of how to (1) plan and (2) manage projects in a team. What new skill sets would you recommend? (I have read Joel on Software and 37signals, and got many insightful tips from them, but I'd like a little more technical knowledge...) Background (freelancer, past 4+ years): computer engineer; graphic/web designer; online marketing; moved on to programming in PHP, Perl, Python; did Oracle DBA OCP training to understand DBs. Current self-assigned title: web application developer.

    Read the article

  • SQL Saturday #220 - Atlanta - Pre-Con Scholarship Winners!

    - by Most Valuable Yak (Rob Volk)
    A few weeks ago, AtlantaMDF offered scholarships for each of our upcoming Pre-conference sessions at SQL Saturday #220. We would like to congratulate the winners!
    David Thomas - SQL Server Security - http://sqlsecurity.eventbrite.com/
    Vince Bible - Surfing the Multicore Wave: Processors, Parallelism, and Performance - http://surfmulticore.eventbrite.com/
    Mostafa Maged - Languages of BI - http://languagesofbi.eventbrite.com/
    Daphne Adams - Practical Self-Service BI with PowerPivot for Excel - http://selfservicebi.eventbrite.com/
    Tim Lawrence - The DBA Skills Upgrade Toolkit - http://dbatoolkit.eventbrite.com/
    Thanks to everyone who applied! And once again we must thank Idera for its generous sponsorship, and the time and effort made by Bobby Dimmick (w|t) and Brian Kelley (w|t) of Midlands PASS in judging all the applicants. Don't forget, there's still time to attend the Pre-Cons on May 17, 2013! Click on the EventBrite links for more details and to register!

    Read the article

  • SQL Azure Data Sync

    - by kaleidoscope
    The Microsoft Sync Framework Power Pack for SQL Azure contains a series of components that improve the experience of synchronizing with SQL Azure. This includes runtime components that optimize performance and simplify the process of synchronizing with the cloud. SQL Azure Data Sync allows developers and DBAs to:
    · Link existing on-premises data stores to SQL Azure.
    · Create new applications in Windows Azure without abandoning existing on-premises applications.
    · Extend on-premises data to remote offices, retail stores and mobile workers via the cloud.
    · Take Windows Azure and SQL Azure based web applications offline to provide an “Outlook like” cached-mode experience.
    The Microsoft Sync Framework Power Pack for SQL Azure is comprised of the following:
    · SqlAzureSyncProvider
    · SQL Azure Offline Visual Studio Plug-In
    · SQL Azure Data Sync Tool for SQL Server
    · New SQL Azure Events
    · Automated Provisioning
    Geeta

    Read the article

  • SQL SERVER Fix: Error 3117: The log or differential backup cannot be restored because no files

    I received the following email from one of my readers. Dear Pinal, I am new to SQL Server and our regular DBA is on vacation. Our production database had some problem and I have just restored the full database backup to the production server. When I try to apply the log backup I am getting the following error. I am sure, [...]
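    For context, the usual shape of that restore sequence looks roughly like the sketch below; the database and backup file names are placeholders, and the key detail is that everything before the final step must be restored WITH NORECOVERY so that log backups can still be applied.

    -- Restore the full backup but leave the database able to accept further backups
    RESTORE DATABASE ProductionDB
        FROM DISK = 'D:\Backups\ProductionDB_full.bak'
        WITH NORECOVERY;

    -- Apply the transaction log backup(s), recovering only on the last one
    RESTORE LOG ProductionDB
        FROM DISK = 'D:\Backups\ProductionDB_log.trn'
        WITH RECOVERY;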

    Read the article

  • We Are SQLFamily

    - by AllenMWhite
    On Monday, Tom LaRock (b | @sqlrockstar) presented his #MemeMonday topic as What #SQLFamily Means To Me. The #sqlfamily hash tag is a relatively new one, but is amazingly appropriate. I've been working with relational databases for almost 20 years, and for most of that time I've been the lone DBA. The only one to set things up, explain how things work, fix the problems, make it go faster, etc., etc., yadda, yadda, yadda. I enjoy being 'the guy', but at the same time it gets hard. What if I'm wrong?...(read more)

    Read the article

  • White Paper on Parallel Processing

    - by Andrew Kelly
    I just ran across what I think is a newly published white paper on parallel processing in SQL Server 2008 R2. The date is October 2010, but this is the first time I have seen it, so I am not 100% sure how new it really is. Maybe you have seen it already, but if not I recommend taking a look. So far I haven’t had time to read it extensively, but from a cursory look it appears to be quite informative, and this is one of the areas least understood by a lot of DBAs. It was authored by Don Pinto...(read more)

    Read the article

  • Extending Database-as-a-Service to Provision Databases with Application Data

    - by Nilesh A
    Oracle Enterprise Manager 12c Database as a Service (DBaaS) empowers Self Service/SSA Users to rapidly spawn databases on demand in the cloud. The configuration and structure of a provisioned database depends on the service template selected by the Self Service user while requesting the database. In EM12c, the DBaaS Self Service/SSA Administrator has the option of hosting various service templates in the service catalog, based on underlying DBCA templates. Many times provisioned databases require production-scale data, whether for UAT, testing or development purposes, and managing DBCA templates with data can be unwieldy. So, we need to populate the database using the post deployment script option, without any additional work for the SSA Users. The SSA Administrator can automate this task in a few easy steps. For details on how to set up the DBaaS Self Service Portal refer to the DBaaS Cookbook. In this article, I will list the steps required to enable EM 12c DBaaS to provision databases with application data in two distinct ways, using: 1) Data Pump, 2) Transportable tablespaces (TTS). The steps listed below are just examples of how to extend EM 12c DBaaS; you can even plug in your own method as part of the post deployment script option.

    Using Data Pump to populate databases
    These are the steps to be followed to extend DBaaS using the Data Pump methodology:
    1. The production DBA should run a Data Pump export on the production database and make the dump file available to all the servers participating in the database zone [sample shown in Fig. 1]:
       -- Full export
       expdp FULL=y DUMPFILE=data_pump_dir:dpfull1%U.dmp, data_pump_dir:dpfull2%U.dmp PARALLEL=4 LOGFILE=data_pump_dir:dpexpfull.log JOB_NAME=dpexpfull
       Figure-1: Full export of database using Data Pump
    2. Create a post deployment SQL script [sample shown in Fig. 2]; this script can either be uploaded into the software library by the SSA Administrator or made available on a shared location accessible from the servers where databases are likely to be provisioned:
       -- Full import
       declare
           h1 NUMBER;
       begin
           -- Creating the directory object where the source database dump is backed up.
           execute immediate 'create directory DEST_LOC as ''/scratch/nagrawal/OracleHomes/oradata/INITCHNG/datafile''';
           -- Running import
           h1 := dbms_datapump.open (operation => 'IMPORT', job_mode => 'FULL', job_name => 'DB_IMPORT10');
           dbms_datapump.set_parallel(handle => h1, degree => 1);
           dbms_datapump.add_file(handle => h1, filename => 'IMP_GRIDDB_FULL.LOG', directory => 'DATA_PUMP_DIR', filetype => 3);
           dbms_datapump.add_file(handle => h1, filename => 'EXP_GRIDDB_FULL_%U.DMP', directory => 'DEST_LOC', filetype => 1);
           dbms_datapump.start_job(handle => h1);
           dbms_datapump.detach(handle => h1);
       end;
       /
       Figure-2: Importing using Data Pump PL/SQL procedures
    3. Using DBCA, create a template for the production database, including all the init.ora parameters, tablespaces, datafiles and their sizes.
    4. The SSA Administrator should customize the “Create Database Deployment Procedure” and provide the DBCA template created in the previous step. In the “Additional Configuration Options” step of the Customize “Create Database Deployment Procedure” flow, provide the name of the SQL script in the Custom Script section and lock the input (shown in Fig. 3). Continue saving the deployment procedure.
       Figure-3: Using the Custom Script option for calling the import SQL
    Now, an SSA user can log in to the Self Service Portal and use the flow to provision a database that will also populate the data using the post deployment step.
    Using Transportable Tablespaces to populate databases
    Copying all user/application tablespaces enables this method of populating databases. These are the required steps to extend DBaaS using transportable tablespaces:
    1. The production DBA needs to create a backup of the tablespaces. Datafiles may need conversion [such as from big endian to little endian or vice versa] based on the platforms of the production source and of the destination where DBaaS created the test database. Here is a sample backup script that shows how to find out whether any conversion is required, describes the steps required to convert the datafiles, and backs up the tablespaces (a minimal endian and self-containment check is sketched at the end of this post).
    2. The SSA Administrator should copy the database (tablespace) backup datafiles and export dumps to a backup location accessible from the hosts participating in the database zone(s).
    3. Create a post deployment SQL script; this script can either be uploaded into the software library by the SSA Administrator or made available on a shared location accessible from the servers where databases are likely to be provisioned. Here is a sample post deployment SQL script using transportable tablespaces.
    4. Using DBCA, create a template for the production database; all the init.ora parameters should be included. NOTE: DO NOT choose to bring tablespace data into this template, as the tablespaces will be created by the post deployment script.
    5. The SSA Administrator should customize the “Create Database Deployment Procedure” and provide the DBCA template created in the previous step. In the “Additional Configuration Options” step of the flow, provide the name of the SQL script in the Custom Script section and lock the input. Continue saving the deployment procedure.
    Now, an SSA user can log in to the Self Service Portal and use the flow to provision a database that will also populate the data using the post deployment step.

    More Information:
    Database-as-a-Service on Exadata Cloud
    Podcast on Database as a Service using Oracle Enterprise Manager 12c
    Oracle Enterprise Manager 12c Installation and Administration guide, Cloud Administration guide
    DBaaS Cookbook
    Screenwatch: Private Database Cloud: Set Up the Cloud Self-Service Portal
    Screenwatch: Private Database Cloud: Use the Cloud Self-Service Portal
    Stay Connected: Twitter | Facebook | YouTube | LinkedIn | Newsletter
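    As a hedged illustration of the checks referred to in step 1 of the transportable tablespace flow (the tablespace name USERS is a placeholder, not part of the original article):

    -- Check the endian format of the current platform; compare source and destination results
    SELECT d.platform_name, tp.endian_format
      FROM v$database d
      JOIN v$transportable_platform tp
        ON tp.platform_name = d.platform_name;

    -- Verify the tablespace set is self-contained before exporting it
    EXEC DBMS_TTS.TRANSPORT_SET_CHECK('USERS', TRUE);
    SELECT * FROM transport_set_violations;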

    Read the article

  • Tales of a corrupt SQL log

    Warning: I'm a simple dev, not an all-powerful DBA with godly powers. This morning, one of my sites was down and DNN reported a problem with the database.  A quick series of tests revealed that the culprit was a corrupted log file. Easy fix, I said; I have daily backups, so it's just a matter of restoring a good copy of the database and log files.  Well, I found out that's not exactly true.  You see, for this database, I have daily file backups, and these are not database backups created...

    Read the article

  • Oracle12c Is Here: New Features for Developers

    - by Carsten Czarski
    The wait is over. Oracle12c Release 1 is available for download. Oracle12c brings a range of new features for SQL, PL/SQL and APEX developers; SQL Pattern Matching, Identity Columns and Code Based Security are just three examples. In our current Community Tip we present 12 new features for developers; find out how you can develop even faster and more efficiently with Oracle12c:
    - Automatic sequences and identity columns (a small example follows below)
    - SQL and PL/SQL: extensions and improvements
    - PL/SQL: privileges, roles and more
    - Oracle Multitenant and APEX
    - SQL Pattern Matching
    - When is a row valid: Valid Time Temporal
    Over at the DBA Community, our colleagues have a corresponding overview of the new features of interest to administrators and database operations.
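    As a small taste of the identity column feature mentioned above (the table and column names are made up purely for illustration):

    -- Oracle 12c identity column: the database generates the key automatically
    CREATE TABLE employees_demo (
      emp_id    NUMBER GENERATED ALWAYS AS IDENTITY,
      emp_name  VARCHAR2(100) NOT NULL
    );

    -- No need to supply emp_id; it is generated for each row
    INSERT INTO employees_demo (emp_name) VALUES ('Scott');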

    Read the article

  • SQL SERVER - Finding Out What Changed in a Deleted Database - Notes from the Field #041

    - by Pinal Dave
    [Note from Pinal]: This is the 41st episode of the Notes from the Field series. The real world is full of challenges. When we are reading theory or a book, we sometimes do not realize how the real world works, and that is why we have this series, Notes from the Field, which is extremely popular with developers and DBAs. Let us talk about the interesting problem of how to figure out what has changed in a DELETED database. Well, you may think I am just throwing words around, but in reality this kind of problem makes our DBAs' lives interesting, and in this blog post we have an amazing story from Brian Kelley about that very subject. In this episode of the Notes from the Field series, database expert Brian Kelley explains how to find out what has changed in a deleted database. Read the experience of Brian in his own words.

    Sometimes, one of the hardest questions to answer is, “What changed?” A similar question is, “Did anything change other than what we expected to change?”

    The First Place to Check - Schema Changes History Report: Pinal has recently written on the Schema Changes History report and its requirement for the Default Trace to be enabled. This is always the first place I look when I am trying to answer these questions. There are a couple of obvious limitations with the Schema Changes History report. First, while it reports what changed, when it changed, and who changed it, other than the base DDL operation (CREATE, ALTER, DELETE), it does not present what the changes actually were. This is not something covered by the default trace. Second, the default trace has a fixed size. When it hits that size, the changes begin to overwrite. As a result, if you wait too long, especially on a busy database server, you may find your changes rolled off.

    But the Database Has Been Deleted! Pinal cited another issue, and that’s the inability to run the Schema Changes History report if the database has been dropped. Thankfully, all is not lost. One thing to remember is that the Schema Changes History report is ultimately driven by the Default Trace. As you may have guessed, it’s a trace, like any other database trace. And the Default Trace does write to disk. The trace files are written to the defined LOG directory for that SQL Server instance and have a prefix of log_. Therefore, you can read the trace files like any other. Tip: Copy the files to a working directory. Otherwise, you may occasionally receive a file-in-use error. With the Default Trace files, if you ask the question early enough, you can see the information for a deleted database just the same as any other database.

    Testing with a Deleted Database: Here’s a short script that will create a database, create a schema, create an object, and then drop the database. Without the database, you can’t do a standard Schema Changes History report.
    CREATE DATABASE DeleteMe;
    GO
    USE DeleteMe;
    GO
    CREATE SCHEMA Test AUTHORIZATION dbo;
    GO
    CREATE TABLE Test.Foo (FooID INT);
    GO
    USE MASTER;
    GO
    DROP DATABASE DeleteMe;
    GO
    This sets up the perfect situation where we can’t retrieve the information using the Schema Changes History report but where it’s still available.

    Finding the Information: I’ve sorted the columns so I can see the Event Subclass, the Start Time, the Database Name, the Object Name, and the Object Type at the front, but otherwise, I’m just looking at the trace files using SQL Profiler. As you can see, the information is definitely there. Therefore, even in the case of a dropped/deleted database, you can still determine who did what and when.
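    If you prefer T-SQL over Profiler, a minimal sketch along the same lines; the trace file path is a placeholder you would replace with the value returned by the first query:

    -- Find the current default trace file, then read it (and its rollover files)
    SELECT path FROM sys.traces WHERE is_default = 1;

    SELECT te.name AS event_name, t.StartTime, t.DatabaseName, t.ObjectName, t.LoginName
      FROM sys.fn_trace_gettable('C:\SQLLog\log_123.trc', DEFAULT) AS t
      JOIN sys.trace_events te ON t.EventClass = te.trace_event_id
     WHERE t.DatabaseName = 'DeleteMe';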
    You can even determine who dropped the database (loginame is captured). The key is to get the default trace files in a timely manner in order to extract the information.

    If you want to get started with performance tuning and database security with the help of experts, read more over at Fix Your SQL Server.

    Reference: Pinal Dave (http://blog.sqlauthority.com). Filed under: Notes from the Field, PostADay, SQL, SQL Authority, SQL Query, SQL Security, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • Why Choose Oracle University?

    - by Shen Chen
    Get specialized training and certification from expert Oracle instructors, with a 100% Satisfaction Guarantee. 5 Reasons to Enroll in Oracle Certification or Training:
    1. Career Growth: our curriculum is developed to teach you skills that directly align with real-life IT jobs.
    2. Salary Advancement: Certification Magazine found that Oracle Certified Professionals earn higher salaries when compared to other DBA or developer professionals.
    3. Expert Instruction: learn Oracle from the same team involved in Oracle product development.
    4. Flexible Learning Options: train from anywhere, at any time. Pick the format that matches your learning style and schedule.
    5. 100% Student Satisfaction Guarantee: if you’re not completely satisfied with your training or certification experience, you can re-take the same class for free.

    Read the article

  • Problem installing Oracle 10g Express edition

    - by abhi
    I have installed Oracle 10g Express Edition on Ubuntu 10.10, and it didn't show or ask for my password during installation. When I click on "Start Database" it gives the warning "Operation failed, abhi is not a member of 'dba' group," and when I click on "Run SQL command line" it shows /usr/lib/oracle/xe/app/oracle/product/10.2.0/server/bin/nls_lang.sh: 114: [[: not found. When I click on "Go To Database Home Page", it opens Mozilla's "Ubuntu Google search page". Can anyone help me get my Oracle installation to work, or tell me how to remove it? It is also not getting removed; I tried all the commands that Ubuntu suggests. Please help me because I don't want to format my Ubuntu.

    Read the article

  • Other SCOM users at SQLSaturday #65 Vancouver?

    - by merrillaldrich
    After a little hair-graying fun around passport renewal and family logistics, it looks like I'll be at the Vancouver SQLSaturday! I am pumped. (I was entirely convinced they would call it "SQLSaturd'eh?" and I'm frankly a little disappointed about the name... :-) I'm on the tail end of a three-month deployment of System Center Operations Manager with the SQL management pack - if you are a DBA and SCOM user, too, I'd love to meet you and talk shop at the SQLSaturday event. Please drop me a line...(read more)

    Read the article

  • Intermittent Copy/Paste Problem in RDP

    - by Tara Kizer
    If you use RDP to remotely connect to your servers, you've probably encountered a clipboard issue where copy/paste stops working.  A quick Google search on the problem indicates you can easily fix the problem by logging out/logging back in or killing/restarting rdpclip.exe on the remote server.  Here's an article which covers this topic. But what do you do when copy/paste is intermittent?  It works one second, stops working for 5-30 seconds, and then on its own starts working again.  This is what’s occurring in our new non-production environment.  The DBA team is setting up 16 new physical servers and 5 new virtual machines.  I haven’t found a server where this ISN’T happening.  This intermittent copy/paste issue is driving me crazy!

    Read the article

  • Existing Instance, Shiny New Disks

    - by merrillaldrich
    Migrating an Instance of SQL Server to New Disks. I get to do something pretty entertaining this week: migrate SQL instances on a 2008 cluster from one disk array to another! Zut alors! I am so excited I can hardly contain myself, so let’s get started. (Only a DBA could love this stuff, am I right? I know.) Anyway, here’s one method of many to migrate your data. Assumption: this is a host-based migration, which just means I’m using the Windows file system to push the data from one set of SAN disks...(read more)
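    For flavor, here is one hedged sketch of the file-relocation piece of such a migration; this is just one of the many methods the post alludes to, and the database name, logical file names, and paths are placeholders:

    -- Take the database offline, point SQL Server at the new locations, move the files, bring it back
    ALTER DATABASE MyDB SET OFFLINE;

    -- NAME is the logical file name; FILENAME is the new physical path on the new array
    ALTER DATABASE MyDB MODIFY FILE (NAME = MyDB_Data, FILENAME = 'N:\NewArray\MyDB_Data.mdf');
    ALTER DATABASE MyDB MODIFY FILE (NAME = MyDB_Log,  FILENAME = 'O:\NewArray\MyDB_Log.ldf');

    -- Copy the physical files to the new disks outside SQL Server, then:
    ALTER DATABASE MyDB SET ONLINE;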

    Read the article

  • LINQ Lycanthropy: Transformations into LINQ

    LINQ is one of the few technologies that you can start to use without a lot of preliminary learning. Also, it lends itself to learning by trying out examples. With Michael Sorens' help, you can watch as your conventional C# code changes to ravenous LINQ before your very eyes.

    Read the article
