Search Results

Search found 2483 results on 100 pages for 'dave burton'.

Page 17/100 | < Previous Page | 13 14 15 16 17 18 19 20 21 22 23 24  | Next Page >

  • MySQL – Export the Resultset to CSV file

    - by Pinal Dave
    In SQL Server, you can use the BCP command to export a result set to a CSV file. In MySQL, too, you can export data from a table or result set as a CSV file in several ways. Here are two methods. Method 1: Use MySQL Workbench. If you are using MySQL Workbench as a querying tool, you can make use of its Export option in the result window. Run the following code in Workbench SELECT db_names FROM mysql_testing; The result will be shown in the result window. There is an option called “File”. Click on it and it will open a window prompting you to save the result set (screenshot attached to show how the file option can be used). Choose the directory and type in the name of the file. Method 2: Use the OUTFILE clause. You can do the export using a query with the INTO OUTFILE clause, as shown below SELECT db_names FROM mysql_testing INTO OUTFILE 'C:/testing.csv' FIELDS ENCLOSED BY '"' TERMINATED BY ';' ESCAPED BY '"' LINES TERMINATED BY '\r\n'; After the execution of the above code, you can find a file named testing.csv on the C: drive of the server. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: MySQL, PostADay, SQL, SQL Authority, SQL Query, SQL Tips and Tricks, T SQL Tagged: CSV
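
    A hedged sketch of Method 2 follows, with the FIELDS sub-clauses written in the order the MySQL manual documents them (TERMINATED BY, ENCLOSED BY, ESCAPED BY); the table name and path are placeholders, and the file is written on the server's file system, not the client's.

      -- Export a result set to a CSV file on the server (placeholder table and path).
      SELECT db_names
      FROM mysql_testing
      INTO OUTFILE 'C:/testing.csv'
        FIELDS TERMINATED BY ';' ENCLOSED BY '"' ESCAPED BY '"'
        LINES TERMINATED BY '\r\n';

      -- If the server rejects the target path, this shows the directory (if any)
      -- that the secure_file_priv setting restricts OUTFILE writes to.
      SHOW VARIABLES LIKE 'secure_file_priv';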

    Read the article

  • MySQL – Grouping by Multiple Columns to Single Column as A String

    - by Pinal Dave
    In the post titled SQL SERVER – Grouping by Multiple Columns to Single Column as A String we saw how to combine values from multiple rows into comma-separated values in a single row, grouped by another column, by using the FOR XML clause. In this post we will see how to produce the same result using the GROUP_CONCAT function in MySQL. Let us create the following table and data. CREATE TABLE TestTable (ID INT, Col VARCHAR(4)); INSERT INTO TestTable (ID, Col) SELECT 1, 'A' UNION ALL SELECT 1, 'B' UNION ALL SELECT 1, 'C' UNION ALL SELECT 2, 'A' UNION ALL SELECT 2, 'B' UNION ALL SELECT 2, 'C' UNION ALL SELECT 2, 'D' UNION ALL SELECT 2, 'E'; Now, to generate CSV values of the column Col for each ID, use the following code SELECT ID, GROUP_CONCAT(col) AS CSV FROM TestTable GROUP BY ID; The result is ID CSV 1 A,B,C 2 A,B,C,D,E You can also change the delimiter. For example, if you want a pipe symbol (|) instead of a comma, use the following SELECT ID, REPLACE(GROUP_CONCAT(col),',','|') AS CSV FROM TestTable GROUP BY ID; The result is ID CSV 1 A|B|C 2 A|B|C|D|E MySQL makes this very simple with its support for the GROUP_CONCAT function. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
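
    A small follow-up sketch using the same TestTable: GROUP_CONCAT accepts an ORDER BY and a SEPARATOR clause directly, so the REPLACE step above is optional if all you need is a different delimiter.

      -- Pipe-delimited, ordered list per ID without the REPLACE wrapper.
      SELECT ID,
             GROUP_CONCAT(col ORDER BY col SEPARATOR '|') AS CSV
      FROM TestTable
      GROUP BY ID;
      -- ID  CSV
      -- 1   A|B|C
      -- 2   A|B|C|D|E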

    Read the article

  • SQL SERVER – What is the Maximum Relational Database Size Supported by Single Instance?

    - by Pinal Dave
    I often get asked the following question: “How much data can SQL Server handle?” Every single time I get this question, I ask the following question back: “How much data can your storage system handle?” The reason I ask this question back is that, in reality, for enterprise systems the limitation of storage is no longer an issue. As a matter of fact, most databases nowadays are limited by the size of the storage system. SQL Server is an enterprise system and a very mature product. Even so, if you still want to know the actual limit, here is the answer. SQL Server 2008 R2, 2012 and 2014 have a maximum capacity of 524 PB (petabytes) in the Enterprise, BI and Standard editions. SQL Server Express has a limitation of 10 GB due to its nature. I guess, now when you look at my question, it will make sense that it all depends on the size of your storage system. I personally believe that at this point in time 524 PB is quite a huge amount of data, but who knows; when we read this blog post again in 10 years, we may all wonder what I was thinking. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
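
    If you want to see how far your own databases are from that limit, a quick sketch (not from the article) is to sum the file sizes recorded in sys.master_files, which stores each file's size in 8 KB pages.

      -- Current size of every database on the instance, in MB.
      SELECT DB_NAME(database_id) AS DatabaseName,
             SUM(CAST(size AS BIGINT)) * 8 / 1024 AS SizeMB
      FROM sys.master_files
      GROUP BY database_id
      ORDER BY SizeMB DESC;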

    Read the article

  • SQL – Download FREE Book – Data Access for Highly Scalable Solutions: Using SQL, NoSQL, and Polyglot Persistence

    - by Pinal Dave
    Recently I was preparing for Big Data and I ended up with a very interesting read for everybody. It was created by Microsoft and it is indeed a fantastic read, in my opinion. It took me some time to read the entire book, but it was worth it, as it tries to answer two very interesting questions related to NoSQL. Here is the abstract from the book: Organizations seeking to use a NoSQL database are therefore faced with a twofold challenge: • Which NoSQL database(s) best meet(s) the needs of the organization? • How does an organization integrate a NoSQL database into its solutions? As I kept reading the book, I found it very interesting and informative. I suggest, if you have time this weekend, download the book and read it. This guide focuses on the most common types of NoSQL database currently available, describes the situations for which they are most suited, and shows examples of how you might incorporate them into a business application. The guide summarizes the experiences of a fictitious organization named Adventure Works, which implemented a solution that comprised an assortment of different databases. Download Data Access for Highly Scalable Solutions: Using SQL, NoSQL, and Polyglot Persistence While we are talking about Big Data and NoSQL, do not forget to check out my blog tomorrow, as I am going to talk about the same subject and it will be very interesting. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, NoSQL, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • SQL SERVER – SELECT INTO with FileGroup or Partition is Not Possible

    - by Pinal Dave
    The other day, I received an email from a user, and for the first time in a long while I had to check the answer online before answering the question. Here is the question - I want to create a new table based on an old table, but when I execute the following script it gives me an error. Is there anything I am missing in my syntax? SELECT *  INTO NewTableName ON MyFileGroup FROM MyOldTableName I faintly remembered that this was not possible in earlier versions of SQL Server, but I was not sure if this feature had been added in recent versions. I quickly tried a few syntaxes, referred to the online documentation, and learned that it is still not possible in the latest version of SQL Server. The alternative is to change the default filegroup with the following script, so that any new table is created on it. Though, I do not like changing the default filegroup for new tables: it is possible that, while the default filegroup is changed, some other table created behind the scenes by an automated system or a colleague will also end up on the new filegroup. ALTER DATABASE DatabaseName MODIFY FILEGROUP NameofFileGroup DEFAULT The reason this feature is not supported is that SELECT INTO is a minimally logged operation. I seriously hope that some day in the future this feature gets added. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: Filegroup
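
    If you would rather not touch the database default at all, a common workaround is to create the table on the desired filegroup first and then load it with INSERT ... SELECT. The sketch below uses a hypothetical column list; match it to MyOldTableName.

      -- Create the new table explicitly on the target filegroup, then copy the rows.
      CREATE TABLE NewTableName
      (
          ID   INT,
          Name VARCHAR(100)   -- hypothetical columns; use the real table definition
      ) ON MyFileGroup;

      INSERT INTO NewTableName (ID, Name)
      SELECT ID, Name
      FROM MyOldTableName;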

    Read the article

  • Ubuntu doesn't boot due to GRUB problems

    - by Dave
    Users out there, I came here with a spark of hope that you could help me. I want to get rid of my old WinXP, because game support for it seems to be slowly expiring now... So I took a second drive, just an old empty one I had at hand (ATA-Maxtor 90648D3), unplugged the other drive with WinXP so that it couldn't be harmed, and started the installation of Ubuntu 12.04. Everything went as it was supposed to, until the end: a normal shutdown after a successful installation process. But when I tried to boot my new Ubuntu from the HDD, it said: error: out of disk. grub rescue> So, what to do now? I already tried a lot of things in the terminal, e.g. update-grub as mentioned on http://opensource-sidh.blogspot.de/2011/06/recover-grub-live-ubuntu-cd-pendrive.html. Everything worked, it didn't complain about missing data or anything, but at the end of the day it still wasn't able to boot! The next step was to change the etc/default/grub file so that it would load the ATA drivers first, so that there would be no problem with my drive. But even this didn't seem to have any effect; I'm still stuck with Ubuntu in Live-CD mode... If there is anybody out there who can help me, I would be very glad. Thanks for any support, Dave P.S.: I even tried to fix it with boot-repair, a small tool for Ubuntu, and it created a file with data that could probably help you to help me. You can find it at http://paste.ubuntu.com/1428022/

    Read the article

  • SQL – What is the latest Version of NuoDB? – A Quick Contest to Get Amazon Gift Cards

    - by Pinal Dave
    We had a great contest earlier last week - What ACID stands in the Database? – Contest to Win 24 Amazon Gift Cards and Joes 2 Pros 2012 Kit. It received quite a few responses. Just like any other contest, not everyone was a winner. The kind folks at NuoDB decided to give another chance to everyone who did not win the last contest. This means that if you missed taking part in the earlier contest, or if you took part and did not win, you still have one more chance to win an Amazon Gift Card. Here is the quick contest: you just have to go and download NuoDB. The first 10 people who download NuoDB will each get a USD 10 card. Everyone else will be entered into a lucky draw for Amazon Gift Cards of USD 50. Winners will be announced in the next 24 hours. Bonus Round: If you have entered the contest above, you can also enter to win the latest Beginning SSRS Joes 2 Pros book. You just have to leave a comment over here with your experience with NuoDB and what the latest version of the product is. Here are a few of the blog posts I wrote earlier on that subject: Part 1 – Install NuoDB in 90 Seconds Part 2 – Manage NuoDB Installation Part 3 – Explore NuoDB Database Part 4 – Migrate from SQL Server to NuoDB Part 5 - NuoDB and Third Party Explorer – SQuirreL SQL Client, SQL Workbench/J and DbVisualizer Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • MySQL – Introduction to CONCAT and CONCAT_WS functions

    - by Pinal Dave
    MySQL supports two concatenation functions: CONCAT and CONCAT_WS. The CONCAT function simply concatenates all the argument values as they are SELECT CONCAT('Television','Mobile','Furniture'); The above code returns the following TelevisionMobileFurniture If you want to concatenate them with a comma, either you need to specify the comma at the end of each value, or pass the comma as an argument along with the values SELECT CONCAT('Television,','Mobile,','Furniture'); SELECT CONCAT('Television',',','Mobile',',','Furniture'); Both of the above return the following Television,Mobile,Furniture However, you can avoid the extra work by using the CONCAT_WS function. It stands for Concatenate With Separator. It is very similar to the CONCAT function, but accepts the separator as the first argument. SELECT CONCAT_WS(',','Television','Mobile','Furniture'); The result is Television,Mobile,Furniture If you want a pipe symbol as the separator, you can use SELECT CONCAT_WS('|','Television','Mobile','Furniture'); The result is Television|Mobile|Furniture So CONCAT_WS is very flexible in concatenating values along with a separator. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: MySQL, PostADay, SQL, SQL Authority, SQL Query, SQL Tips and Tricks, T SQL
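
    One more difference worth knowing (MySQL behavior): CONCAT returns NULL if any argument is NULL, while CONCAT_WS simply skips NULL arguments. A small sketch:

      SELECT CONCAT('Television', NULL, 'Furniture');           -- returns NULL
      SELECT CONCAT_WS(',', 'Television', NULL, 'Furniture');   -- returns 'Television,Furniture'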

    Read the article

  • Fan always on and overheating problems on HP G62

    - by Dave
    I have an HP G62 with an i5 processor, 4 GB of RAM and an ATI HD 5470. I started using Ubuntu 4 months ago, first 11.10 and then 12.04. Everything worked perfectly until some weeks ago, when my CPU fan went "crazy" - by which I mean always on, noisy, and overheating problems; I'm sure that if I put a glass of water on it, it would boil in a few minutes (and I mean it). Could there be a bug in the latest kernel or in another official Ubuntu update? And by the way, the CPU fan is as clean as a newborn baby. I used Jupiter and all that stuff (lm-sensors, psensor...) - same problem. Don't ask me to install the ATI drivers, because they work like sh#t (the interface is laggy, as the HD 5470 isn't supported by Ubuntu) and I can't even open ATI Catalyst (I get an error when I try to open it); more than this, I have always used the open source drivers and, as I said, they worked perfectly (3D, effects). Right now I'm back on Windows and everything is working perfectly, but my main problem is that I got used to Ubuntu and it feels weird to use Windows again. I want my nice and almost perfect Ubuntu back. Thanks, Dave

    Read the article

  • PDFtk Password Protection Help

    - by Dave W.
    I am using Ubuntu 11.10 and am looking for a solution to password protect a bunch of pdf files in a directory in batch. I came across PDFtk and it looks like it might do what I need, but I've reviewed the command line PDFtk examples and can't figure out if there is a way to do it in batch without having to individually specify the output file name for every file. I'm hoping a command-line guru can take a look at the PDFtk syntax and tell me if there is some trick / command that will allow me to password protect a directory of pdf files (e.g., *.pdf) and overwrite the existing files using the same name, or consistently rename the individual output files without having to specify each output name individually. Here's a link to the PDFtk command line examples page: http://www.pdflabs.com/tools/pdftk-the-pdf-toolkit/ Thanks for your help. I think I've answered my own question. Here's a bash script that appears to do the trick. I'd welcome help evaluating why the code I've commented out doesn't work... #!/bin/bash # Created by Dave, 2012-02-23 # This script uses PDFtk to password protect every PDF file # in the directory specified. The script creates a directory named "protected_[DATE]" # to hold the password protected version of the files. # # I'm using the "user_pw" parameter, # which means no one will be able to open or view the file without # the password. # # PDFtk must be installed for this script to work. # # Usage: ./protect_with_pdftk.bsh [FILE(S)] # [FILE(S)] can use wildcard expansion (e.g., *.pdf) # This part isn't working.... ignore. The goal is to avoid errors if the # directory to be created already exists by only attempting to create # it if it doesn't exists # #TARGET_DIR="protected_$(date +%F)" #if [ -d "$TARGET_DIR" ] #then #echo # echo "$TARGET_DIR directory exists!" #else #echo # echo "$TARGET_DIR directory does not exist!" #fi # mkdir protected_$(date +%F) for i in *pdf ; do pdftk "$i" output "./protected_$(date +%F)/$i" user_pw [PASSWORD]; done echo "Complete. Output is in the directory: ./protected_$(date +%F)"

    Read the article

  • SQL Authority News – Download Microsoft SQL Server 2014 Feature Pack and Microsoft SQL Server Developer’s Edition

    - by Pinal Dave
    Yesterday I attended the SQL Server Community Launch in Bangalore and presented on Performing an Effective Presentation. It was a fun presentation and people received it very well. No matter what subject I present on, I always end up talking about SQL. Here are two of the questions I received during the event. Q1) I want to install SQL Server on my development server; where can we get it for free or at an economical price (I do not have MSDN)? A1) If you are not going to use your server in a production environment, you can just get SQL Server Developer’s Edition, and you can read more about it over here. Here is another favorite question which I keep on receiving during events. Q2) I already have SQL Server installed on my machine; which feature packs should I install and where can I get them from? A2) Just download and install the Microsoft SQL Server 2014 Feature Pack. Here is the link for downloading it. The Microsoft SQL Server 2014 Feature Pack is a collection of stand-alone packages which provide additional value for Microsoft SQL Server. It includes tools and components for Microsoft SQL Server 2014 and add-on providers for Microsoft SQL Server 2014. Here is the list of components this product contains: Microsoft SQL Server Backup to Windows Azure Tool Microsoft SQL Server Cloud Adapter Microsoft Kerberos Configuration Manager for Microsoft SQL Server Microsoft SQL Server 2014 Semantic Language Statistics Microsoft SQL Server Data-Tier Application Framework Microsoft SQL Server 2014 Transact-SQL Language Service Microsoft Windows PowerShell Extensions for Microsoft SQL Server 2014 Microsoft SQL Server 2014 Shared Management Objects Microsoft Command Line Utilities 11 for Microsoft SQL Server Microsoft ODBC Driver 11 for Microsoft SQL Server – Windows Microsoft JDBC Driver 4.0 for Microsoft SQL Server Microsoft Drivers 3.0 for PHP for Microsoft SQL Server Microsoft SQL Server 2014 Transact-SQL ScriptDom Microsoft SQL Server 2014 Transact-SQL Compiler Service Microsoft System CLR Types for Microsoft SQL Server 2014 Microsoft SQL Server 2014 Remote Blob Store SQL RBS codeplex samples page SQL Server Remote Blob Store blogs Microsoft SQL Server Service Broker External Activator for Microsoft SQL Server 2014 Microsoft OData Source for Microsoft SQL Server 2014 Microsoft Balanced Data Distributor for Microsoft SQL Server 2014 Microsoft Change Data Capture Designer and Service for Oracle by Attunity for Microsoft SQL Server 2014 Microsoft SQL Server 2014 Master Data Service Add-in for Microsoft Excel Microsoft SQL Server StreamInsight Microsoft Connector for SAP BW for Microsoft SQL Server 2014 Microsoft SQL Server Migration Assistant Microsoft SQL Server 2014 Upgrade Advisor Microsoft OLEDB Provider for DB2 v5.0 for Microsoft SQL Server 2014 Microsoft SQL Server 2014 PowerPivot for Microsoft SharePoint 2013 Microsoft SQL Server 2014 ADOMD.NET Microsoft Analysis Services OLE DB Provider for Microsoft SQL Server 2014 Microsoft SQL Server 2014 Analysis Management Objects Microsoft SQL Server Report Builder for Microsoft SQL Server 2014 Microsoft SQL Server 2014 Reporting Services Add-in for Microsoft SharePoint Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL

    Read the article

  • Developer’s Life – Every Developer is a Captain America

    - by Pinal Dave
    Captain America was first created as a comic book character in the 1940’s as a way to boost morale during World War II.  Aimed at a children’s audience, his legacy faded away when the war ended.  However, he has recently had a major reboot to become a popular movie character that deals with modern issues. When Captain America was first written, there was no such thing as a developer, programmer or a computer (the way we think of them, anyway).  Despite these limitations, I think there are still a lot of ways that modern Captain America is like modern developers. So how are developers like Captain America? Well, read on for my list of reasons. Take on Big Projects Captain America isn’t afraid to take on big projects – and takes responsibility when the project is co-opted by the evil organization HYDRA.  Developers may not have super villains out there corrupting their work, but they know to keep on top of their projects and own what they do. Elderly Wisdom Steve Rogers, Captain America’s alter ego, was frozen in ice for decades, and brought back to life to solve problems. Developers can learn from this by respecting the opinions of their elders – technology is an ever-changing market, but the old-timers still have a few tricks up their sleeves! Don’t be Afraid of Change Don’t be afraid of change.  Captain America woke up to find the world he was accustomed to is now completely different.  He might have even felt his skills were no longer necessary.  He, and developers, know that everyone has their place in a team, though.  If you try your best, you will make it work. Fight Your Own Battle Sometimes you have to make it on your own.  Captain America is an integral part of the Avengers, but in his own movies, the other superheroes aren’t around to back him up.  Developers, too, must learn to work both within and without a team. Solid Integrity One of Captain America’s greatest qualities is his integrity.  His determination to do what is right, keep his word, and act honestly earns him mockery from some of the less-savory characters – even “good guys” like Iron Man.  Developers, and everyone else, need to develop the strength of character to keep their integrity.  No matter your walk of life, there will be tempting obstacles.  Think of Captain America, and say “no.” There is a lot for all of us to learn from Captain America, to take away in our own lives, and admire in those who display it – I am specifically thinking of developers.  If you are enjoying this series as much as I am, please let me know who else you would like to see featured. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: Developer, Superhero

    Read the article

  • MySQL – How to Find mysqld.exe with Command Prompt – Fix: ‘mysql’ is not recognized as an internal or external command, operable program or batch file

    - by Pinal Dave
    One of the most popular questions I get after people watch my MySQL courses on Pluralsight is from beginning users who are not able to find where they have installed MySQL Server. The error they receive is as follows when they type the mysqld command at their default command line. ‘mysql‘ is not recognized as an internal or external command, operable program or batch file. This error comes up if a user tries to execute the mysqld command at the default command prompt. The user should execute this command in the directory where the mysqld.exe file exists. If you are using Windows Explorer, you can easily search your drive for mysqld.exe, find the location of the file, and execute the above command there. However, if you want to find the location of the mysqld.exe file with the command prompt, you can follow the directions here. Step 1: Open a command prompt Open a command prompt from Start >> Run >> cmd >> enter. Step 2: Change directory You need to change the default directory to the root directory, hence type the cd\ command at the prompt to change the default directory to C:\. Here we are assuming that you have installed MySQL on your C: drive. If you have installed it on any other drive, change the drive to that letter. Step 3: Search the drive Type the command dir mysqld.exe /s /p at the command prompt. It will search your directories and list the directory where mysqld.exe is located. Step 4: Change directory Now once again change your command prompt location to the folder where your mysqld.exe is located. In my case it is located in the folder C:\Program Files\MySQL\MySQL Server 5.6\bin, hence I will run the following command: cd C:\Program Files\MySQL\MySQL Server 5.6\bin . Step 5: Execute mysqld.exe Now you can once again run mysqld.exe from your command prompt. You can use this method to search for pretty much any file with the help of the command prompt. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: MySQL, PostADay, SQL, SQL Authority, SQL Query, SQL Tips and Tricks, T SQL
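
    If any MySQL client (for example MySQL Workbench) can still connect to the server, an alternative sketch is to ask the server itself where it is installed; this assumes a working client connection, which the steps above do not require.

      -- The server reports its own installation and data directories;
      -- mysqld.exe lives in the bin folder under basedir.
      SHOW VARIABLES LIKE 'basedir';
      SHOW VARIABLES LIKE 'datadir';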

    Read the article

  • Oracle Service Registry 11gR1 Support for Oracle Fusion Middleware/SOA Suite 11g PatchSet 2

    - by Dave Berry
    As you might be aware, a few days back we released Patchset 2 (PS2) for several products in the Oracle Fusion Middleware 11g Release 1 stack including WebLogic Server and SOA Suite. Though there was no patchset released for Oracle Service Registry (OSR) 11g, being an integral part of Fusion Middleware & SOA, OSR 11g R1 ( 11.1.1.2 ) is fully certified with this release. Below is some recommended reading before installing OSR 11g with the new PS2 : OSR 11g R1 & SOA Suite 11g PS2 in a Shared WebLogic Domain If you intend to deploy OSR 11g in the same domain as the SOA Suite 11g, the primary recommendation is to install OSR 11g in its own Managed Server within the same Weblogic Domain as the SOA Suite, as the following diagram depicts : An important pre-requisite for this setup is to apply Patch 9499508, after installation. It basically replaces a registry library - wasp.jar - in the registry application deployed on your server, so as to enable co-deployment of OSR 11g & SOA Suite 11g in the same WLS Domain. The patch fixes a java.lang.LinkageError: loader constraint violation that appears in your OSR system log and is now available for download. The second, equally important, pre-requisite is to modify the setDomainEnv.sh/.cmd file for your WebLogic Domain to conditionally set the CLASSPATH so that the oracle.soa.fabric.jar library is not included in it for the Managed Server(s) hosting OSR 11g. Both these pre-requisites and other OSR 11g Topology Best Practices are covered in detail in the new Knowledge Base article Oracle Service Registry 11g Topology : Best Practices. Architecting an OSR 11g High Availability Setup Typically you would want to create a High Availability (HA) OSR 11g setup, especially on your production system. The following illustrates the recommended topology. The article, Hands-on Guide to Creating an Oracle Service Registry 11g High-Availability Setup on Oracle WebLogic Server 11g on OTN provides step-by-step instructions for creating such an active-active HA setup of multiple OSR 11g nodes with a Load Balancer in an Oracle WebLogic Server cluster environment. Additional Info The OSR Home Page on OTN is the hub for OSR and is regularly updated with latest information, articles, white papers etc. For further reading, this FAQ answers some common questions on OSR. The OSR Certification Matrix lists the Application Servers, Databases, Artifact Storage Tools, Web Browsers, IDEs, etc... that OSR 11g is certified against. If you hit any problems during OSR 11g installation, design time or runtime, the first place to look into is the logs. To find more details about which logs to check when & where, take a look at Where to find Oracle Service Registry Logs? Finally, if you have any questions or problems, there are various ways to reach us - on the SOA Governance forum on OTN, on the Community Forums or by contacting Oracle Support. Yogesh Sontakke and Dave Berry

    Read the article

  • Developer’s Life – Every Developer is a Spiderman

    - by Pinal Dave
    I have to admit, Spiderman is my favorite superhero.  The most recent movie was just released in theaters, so it has been at the front of my mind for some time. Spiderman was my favorite superhero even before the latest movie came out, but of course I took my whole family to see the movie as soon as I could!  Every one of us loved it, including my daughter.  We all left the movie thinking how great it would be to be Spiderman.  So, with that in mind, I started thinking about how we are like Spiderman in our everyday lives, especially developers. Let me list some of the reasons why I think every developer is a Spiderman. We have special powers, just like a superhero.  There is a reason that when there are problems or emergencies, we get called in, just like a superhero!  Our powers might not be the ability to swing through skyscrapers on a web; our powers are our debugging abilities, but there are still similarities! Spiderman never gives up.  He might not be the strongest superhero, and while the ability to shoot webs from your wrists is a pretty cool power, it’s not as impressive as being able to fly, or be invisible, or turn into a hulking green monster.  Developers are also human.  We have cool abilities, but our true strength lies in our willingness to work hard, find solutions, and go above and beyond to solve problems. Spiderman and developers have “spidey sense.”  This is sort of a joke in the comics and movies as well – that Spiderman can just tell when something is about to go wrong, or when a villain is just around the corner.  Developers also have a spidey sense about when a server is about to crash (usually at midnight on a Saturday). Spiderman makes a great superhero because he doesn’t look like one.  Clark Kent is probably fooling no one, hiding his superhero persona behind glasses.  But Peter Parker actually does blend in.  Great developers also blend in.  When they do their job right, no one knows they were there at all. “With great power comes great responsibility.”  There is a joke about developers (sometimes we even tell the jokes) about how if they are unhappy, the server or databases might mysteriously develop problems.  The truth is, very few developers would do something to harm a company’s computer system – they take their job very seriously.  It is a big responsibility. These are just a few of the reasons why I love Spiderman, why I love being a developer, and why I think developers are the greatest.  Let me know other reasons you love Spiderman and developers, or if you can shoot webs from your wrists – I might have a job for you. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • SQL SERVER – CTE can be Updated

    - by Pinal Dave
    Today I received a fantastic email from Matthew Spieth, a SQL Server expert from Ohio. He recently had a great conversation with his colleagues in the office and wanted to make sure that everybody who reads this blog knows about this little feature, which is commonly confused. We will start our story with Matthew’s own statement: “Users often confuse CTEs with temp tables, but technically they are different; CTEs are like views and they can be updated just like views.“ A very true statement from Matthew. I totally agree with what he is saying. Just like him, I have come across situations often enough where developers think a CTE is like a temp table. When you update a temp table, the change remains in the scope of the temp table and does not propagate to the table from which the temp table was built. However, this is not the case with a CTE: when you update a CTE, it updates the underlying table, just like a view does. Here is a working example, built by Matthew, to illustrate this behavior. Check the value in the base table first. USE AdventureWorks2012; -- Check the value in the base table SELECT Color FROM [Production].[Product] WHERE ProductNumber = 'CA-6738'; Now let us build a CTE with the same data. ;WITH CTEUpd(ProductID, Name, ProductNumber, Color) AS( SELECT ProductID, Name, ProductNumber, Color FROM [Production].[Product] WHERE ProductNumber = 'CA-6738') Now let us update the CTE with the following code. -- Update CTE UPDATE CTEUpd SET Color = 'Rainbow'; Now let us check the base table on which the CTE was built. -- Check - The value in the base table is updated SELECT Color FROM [Production].[Product] WHERE ProductNumber = 'CA-6738'; That’s it! You can update a CTE and it will update the base table. Here is the script which you should execute all together. USE AdventureWorks2012; -- Check the value in the base table SELECT Color FROM [Production].[Product] WHERE ProductNumber = 'CA-6738'; -- Build CTE ;WITH CTEUpd(ProductID, Name, ProductNumber, Color) AS( SELECT ProductID, Name, ProductNumber, Color FROM [Production].[Product] WHERE ProductNumber = 'CA-6738') -- Update CTE UPDATE CTEUpd SET Color = 'Rainbow'; -- Check - The value in the base table is updated SELECT Color FROM [Production].[Product] WHERE ProductNumber = 'CA-6738'; If you are aware of such a scenario, do let me know and I will post it on my blog with due credit to you. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL View, T SQL Tagged: CTE
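
    One caveat, sketched below with hypothetical tables (not from Matthew's example): an UPDATE through a CTE follows the same rules as an updatable view, so when the CTE joins more than one table, the columns you modify must all come from a single underlying table.

      -- Hypothetical OrdersA / OrdersB tables, for illustration only.
      ;WITH JoinedCTE AS (
          SELECT a.Status, b.Note
          FROM OrdersA a
          INNER JOIN OrdersB b ON b.OrderID = a.OrderID
      )
      UPDATE JoinedCTE
      SET Status = 'Closed';                    -- allowed: only OrdersA columns change
      -- SET Status = 'Closed', Note = 'Done';  -- would fail: columns from two base tables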

    Read the article

  • SQL SERVER – Implementing IF … THEN in SQL SERVER with CASE Statements

    - by Pinal Dave
    Here is the question I received the other day in email. “I have business logic in my .NET code and we use lots of IF … ELSE logic in our code. I want to move the logic to a Stored Procedure. How do I convert the IF…ELSE logic to T-SQL? Please help.” I have received this question a few times before. As data grows, the performance problems grow as well. Here is how you can convert IF…ELSE logic into a CASE statement in SQL Server. Here are a few examples: Example 1: If your logic is as follows: IF -1 < 1 THEN ‘TRUE’ ELSE ‘FALSE’ You can just use a CASE statement as follows: -- SQL Server 2008 and earlier version solution SELECT CASE WHEN -1 < 1 THEN 'TRUE' ELSE 'FALSE' END AS Result GO -- SQL Server 2012 solution SELECT IIF ( -1 < 1, 'TRUE', 'FALSE' ) AS Result; GO If you are interested in how IIF works in SQL Server 2012, read the blog post I wrote about it earlier this year. Well, in our example the condition we used is pretty simple, but in the real world the logic can be very complex. Let us see two different methods of writing a CASE statement when the logic is based on a column of a table. Example 2: If your logic is as follows: IF BusinessEntityID < 10 THEN FirstName ELSE IF BusinessEntityID > 10 THEN PersonType FROM Person.Person p You can convert it to T-SQL as follows: SELECT CASE WHEN BusinessEntityID < 10 THEN FirstName WHEN BusinessEntityID > 10 THEN PersonType END AS Col, BusinessEntityID, Title, PersonType FROM Person.Person p However, if your logic is based on multiple columns and the conditions are complicated, you can follow example 3. Example 3: If your logic is as follows: IF BusinessEntityID < 10 THEN FirstName ELSE IF BusinessEntityID > 10 AND Title IS NOT NULL THEN PersonType ELSE IF Title = 'Mr.' THEN 'Mister' ELSE 'No Idea' FROM Person.Person p You can convert it to T-SQL as follows: SELECT CASE WHEN BusinessEntityID < 10 THEN FirstName WHEN BusinessEntityID > 10 AND Title IS NOT NULL THEN PersonType WHEN Title = 'Mr.' THEN 'Mister' ELSE 'No Idea' END AS Col, BusinessEntityID, Title, PersonType FROM Person.Person p I hope this is enough to convert IF…ELSE logic to a CASE statement in SQL Server. Let me know if you need further information about the same. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Function, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
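
    One detail worth remembering when translating IF...ELSE chains: a searched CASE returns the result of the first WHEN that evaluates to true, checked top to bottom, so the order of the conditions matters. A small sketch against the same Person.Person table:

      -- For a row with BusinessEntityID > 10 AND Title = 'Mr.' this returns PersonType,
      -- never 'Mister', because that WHEN is listed first.
      SELECT CASE
                 WHEN BusinessEntityID > 10 AND Title IS NOT NULL THEN PersonType
                 WHEN Title = 'Mr.' THEN 'Mister'
                 ELSE 'No Idea'
             END AS Col
      FROM Person.Person p;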

    Read the article

  • SQL – Building a High Traffic, Profitable Blog – A Unique Gift on Author’s Birthday

    - by Pinal Dave
    Every July 30th, I like to do something new. It is my birthday and I like to give gifts to everyone on this day. Last year, at this time, I had written the article A Year Older and 3 SQL Server Books and 3 Video Courses – 33. I had written a total of 3 books by that time and had published a total of 3 Pluralsight courses. When I look back at the year, I feel that I gave it my best. Since last July 30th, I have written 6 more books and 5 more video courses. The total is now 9 books and 8 video courses. It seems that I have been producing one new book or course every month since last July. Building a High Traffic, Profitable Blog Out of my 8 courses, my favorite is my latest course at Pluralsight. This course is about how to build a high traffic blog and monetize it. I have been blogging for over 7 years and there have been many hurdles and roadblocks, but I have never stopped blogging for a single day. There have been many instances when I felt I should just hit delete and remove my entire blog from the web, but fortunately I had the courage to stand by my decisions. Well, in the end, I kept on fighting through the difficult times and kept on blogging. Every day there was a lesson to learn and every day there was an issue to resolve. I never gave up and kept on building new content. Today, after 7 years, when I look back there are many stories to tell. It was impossible to write down all the stories, so I decided to build a course based on my experience. In this course, I share all the best tricks to build a high traffic, profitable blog. When we talk about profit, people often talk about money, but the reality is that profit is a much bigger word than money. There are many different ways one can profit from their own blog. In this course, I discuss all the different ways you can profit by building a high traffic blog. I believe this course is for everybody who aspires to build a website or blog that gives them a profit. Here are the major topics covered in this course: Introduction Techniques to Engage Blog Readers Social Media – Social Sharing and Social Networking Search Engine Optimization (SEO) Monetizing a Blog Frequently Asked Questions Checklists Personally, I believe this is the best gift I can give to all of you, my friends: build a successful high traffic blog and monetize it. Here is the list of all of my video courses and here is the list of all of my books. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: About Me, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: Blogging

    Read the article

  • SQL SERVER – SQL in Sixty Seconds – Last Three Episodes – Need Your Opinion

    - by Pinal Dave
    I have been blogging for almost 7 years and building video content for around 2 years. After spending so much time on blogging and creating videos, I have a quite good idea of what people would like to read and what people like to watch. However, there is one thing I have been constantly struggling with for almost a year, and I would like to get your opinion about it. Though this may look very simple to you, it is very crucial to me, and I would like to know your opinion about it. I have been building a video almost every week for my SQL in Sixty Seconds series and it has been quite popular. So far on my YouTube Channel there are over 2600 subscribers and over 250K views. Here is my problem – there are about 50+ videos in the SQL in Sixty Seconds series, but not every video is popular. There are a few videos which are extremely popular and there are videos which are absolutely struggling to get even a single view. I have not yet figured out what people would love to watch on this channel. I notice lots of people watching various videos but hardly anyone leaving comments or suggestions. At the end of the blog posts associated with SQL in Sixty Seconds, I always ask which video people would love to watch, but I get a very low response there too. What I wonder is why there is such low engagement from viewers/readers on the video blog posts, whereas the channel is a success and lots of people are watching the videos. What do you think I should change in my videos to increase the engagement? Here are my last three videos from the SQL in Sixty Seconds channel and I would like to know your feedback. Remove Cached Login from SSMS Connect Dialog – SQL in Sixty Seconds #049 RESEED Identity Column in Database Table – SQL in Sixty Seconds #051 Puzzle SET ANSI_NULLS and Resultset – SQL in Sixty Seconds #052  The feedback I like the most will for sure get a special surprise gift from me. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology, Video

    Read the article

  • SQL SERVER – SQL Server High Availability Options – Notes from the Field #032

    - by Pinal Dave
    [Notes from Pinal]: When it comes to High Availability or Disaster Recovery, I often see people getting confused. There are so many options available that when users have to select the most optimal solution for their organization, they are often confused. Most people know the salient features of the various options, but when they have to pick one single option, they are often not sure which one to use. I like to ask my dear friend Tim all these kinds of complicated questions. He has a skill for making a complex subject very simple and easy to understand. Linchpin People are database coaches and wellness experts for a data driven world. In this episode of the Notes from the Field series, database expert Tim Radney (partner at Linchpin People) explains in very simple words the best High Availability options for your SQL Server.  Working with SQL Server, a common challenge we are faced with is providing the maximum uptime possible.  To meet these demands we have to design a solution to provide High Availability (HA). Microsoft SQL Server, depending on your edition, provides you with several options.  This could be database mirroring, log shipping, failover clusters, availability groups or replication. Each possible solution comes with pros and cons.  No one solution fits all scenarios, so understanding which solution meets which need is important.  As with anything IT related, you need to fully understand your requirements before trying to solve the problem.  When it comes to building an HA solution, you need to understand the risk your organization needs to mitigate the most. I have found that most are concerned about hardware failure and OS failures. Other common concerns are data corruption or storage issues.  For data corruption or storage issues you can mitigate those concerns by having a second copy of the databases. That can be accomplished with database mirroring, log shipping, replication or availability groups with a secondary replica.  Failover clustering and virtualization with shared storage do not provide redundancy of the data. I recently created a chart outlining some pros and cons of each of the technologies, which I posted on my blog. I like to use this chart to help illustrate how each technology provides a certain number of benefits.  Each of these solutions carries with it some level of cost and complexity.  As database professionals we should all be familiar with these technologies so we can make the best possible choice for our organization. If you want me to take a look at your server and its settings, or if your server is facing any issue, we can Fix Your SQL Server. Note: Tim has also written an excellent book on SQL Backup and Recovery, a must have for everyone. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Notes from the Field, PostADay, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: Shrinking Database
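
    Before choosing an option, it helps to inventory what each database currently uses. The sketch below (not from Tim's article) reads the standard catalog views; mirroring_state_desc is NULL for databases where database mirroring is not configured.

      SELECT d.name,
             d.recovery_model_desc,      -- FULL is required for mirroring and availability groups
                                          -- (log shipping needs FULL or BULK_LOGGED)
             m.mirroring_state_desc      -- NULL when database mirroring is not configured
      FROM sys.databases AS d
      LEFT JOIN sys.database_mirroring AS m
             ON m.database_id = d.database_id;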

    Read the article

  • Big Data – Buzz Words: What is NoSQL – Day 5 of 21

    - by Pinal Dave
    In yesterday’s blog post we explored the basic architecture of Big Data . In this article we will take a quick look at one of the four most important buzz words which goes around Big Data – NoSQL. What is NoSQL? NoSQL stands for Not Relational SQL or Not Only SQL. Lots of people think that NoSQL means there is No SQL, which is not true – they both sound same but the meaning is totally different. NoSQL does use SQL but it uses more than SQL to achieve its goal. As per Wikipedia’s NoSQL Database Definition – “A NoSQL database provides a mechanism for storage and retrieval of data that uses looser consistency models than traditional relational databases.“ Why use NoSQL? A traditional relation database usually deals with predictable structured data. Whereas as the world has moved forward with unstructured data we often see the limitations of the traditional relational database in dealing with them. For example, nowadays we have data in format of SMS, wave files, photos and video format. It is a bit difficult to manage them by using a traditional relational database. I often see people using BLOB filed to store such a data. BLOB can store the data but when we have to retrieve them or even process them the same BLOB is extremely slow in processing the unstructured data. A NoSQL database is the type of database that can handle unstructured, unorganized and unpredictable data that our business needs it. Along with the support to unstructured data, the other advantage of NoSQL Database is high performance and high availability. Eventual Consistency Additionally to note that NoSQL Database may not provided 100% ACID (Atomicity, Consistency, Isolation, Durability) compliance.  Though, NoSQL Database does not support ACID they provide eventual consistency. That means over the long period of time all updates can be expected to propagate eventually through the system and data will be consistent. Taxonomy Taxonomy is the practice of classification of things or concepts and the principles. The NoSQL taxonomy supports column store, document store, key-value stores, and graph databases. We will discuss the taxonomy in detail in later blog posts. Here are few of the examples of the each of the No SQL Category. Column: Hbase, Cassandra, Accumulo Document: MongoDB, Couchbase, Raven Key-value : Dynamo, Riak, Azure, Redis, Cache, GT.m Graph: Neo4J, Allegro, Virtuoso, Bigdata As of now there are over 150 NoSQL Database and you can read everything about them in this single link. Tomorrow In tomorrow’s blog post we will discuss Buzz Word – Hadoop. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • Silverlight Cream for May 06, 2010 -- #857

    - by Dave Campbell
    In this Issue: Alan Beasley, Josh Twist, Mike Snow(-2-, -3-), John Papa(-2-), David Kelley, and David Anson(-2-). Shoutout: John Papa posted a question: Do You Want be on Silverlight TV? From SilverlightCream.com: ListBox Styling (Part 3 - Additional Templates) in Expression Blend & Silverlight Alan Beasley has part 3 of his ListBox styling tutorial in Expression Blend up... another great tutorial and all the code. Securing Your Silverlight Applications Josh Twist has a nice long post up on Securing your Silverlight apps... definitions, services, various forms of authentication. Silverlight Tip of the Day #13 – Silverlight Mobile Development Mike Snow has Tip of the Day #13 up and is discussing creating Silverlight apps for WP7. Silverlight Tip of the Day #14 – Dynamically Loading a Control from a DLL on a Server Mike Snow's Tip #14 is step-by-step instructions for loading a UserControl from a DLL. Silverlight Tip of the Day #15 – Setting Default Browse in Visual Studio Mike Snow's Tip #15 is actually a Visual Studio tip -- how to set what browser your Silverlight app will launch in. Silverlight TV 24: eBay’s Silverlight 4 Simple Lister Application Here we are with Silverlight TV Thursday again! ... John Papa is interviewing Dave Wolf talking about the eBay Simple Lister app. Digitally Signing a XAP Silverlight John Papa has a post up about Digitally signing a Silverlight XAP. He actually is posting an excerpt from the Silverlight 4 Whitepaper he posted... and he has a link to the Whitepaper so we can all read the whole thing too! Hacking Silverlight Code Browser David Kelley has a very cool code browser up to keep track of all the snippets he uses... and we can too... this is a tremendous resource... thanks David! Simple workarounds for a visual problem when toggling a ContextMenu MenuItem's IsEnabled property directly David Anson dug into a ContextMenu problem reported by a couple readers and found a way to duplicate the problem plus a workaround while you're waiting for the next Toolkit drop. Upgraded my Windows Phone 7 Charting example to go with the April Developer Tools Refresh David Anson also has a post up describing his path from the previous WP7 code to the current upgrading his charting code. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • HP blade server: How many connections can be made between HP's new Gen 8 blades and an interconnect

    - by Dave T
    I am building a virtualized network on an HP C3000 with 460c Gen 8 blades and 2 HP L3 switch interconnects. I was advised to buy a 1Gb 4-port 366M Mezzanine Adapter. That gives me 6 Ethernet connections on each blade. I have been told that you can only make 2 connections from each blade to each interconnect, but since I have two interconnects and 6 ports, I hope someone can tell me whether I can make 3 connections from each server to each interconnect. I am looking for the actual answer - thanks Dave

    Read the article

  • Big Data – Interacting with Hadoop – What is PIG? – What is PIG Latin? – Day 16 of 21

    - by Pinal Dave
    In yesterday’s blog post we learned the importance of HIVE in the Big Data story. In this article we will understand what PIG and PIG Latin are in the Big Data story. Yahoo started working on Pig for their application deployment on Hadoop; Yahoo’s goal was to manage their unstructured data. What is Pig and What is Pig Latin? Pig is a high-level platform for creating MapReduce programs used with Hadoop, and the language we use for this platform is called Pig Latin. Pig was designed to make Hadoop more user-friendly and approachable for power users and non-developers. PIG is an interactive execution environment supporting the Pig Latin language. The Pig Latin language supports loading and processing of input data with a series of transformations to produce the desired results. PIG has two different execution environments 1) Local Mode – In this case all the scripts run on a single machine. 2) Hadoop – In this case all the scripts run on a Hadoop cluster. Pig Latin vs SQL Pig essentially creates a set of map and reduce jobs under the hood. Because of this, users do not have to write, compile and build their own solution for Big Data. Pig is very similar to SQL in many ways. The Pig Latin language provides an abstraction layer over the data. It focuses on the data and not the structure under the hood. Pig Latin is a very powerful language and it can do various operations like loading and storing data, streaming data, filtering data, as well as various data operations related to strings. The major difference between SQL and Pig Latin is that Pig is procedural and SQL is declarative. In simpler words, Pig Latin is very similar to a SQL execution plan, and that makes it much easier for programmers to build various processes. Whereas SQL handles trees naturally, Pig Latin follows a directed acyclic graph (DAG). DAGs are used to model several different kinds of structures in mathematics and computer science. Tomorrow In tomorrow’s blog post we will discuss a very important component of the Big Data ecosystem – Zookeeper. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • Keyboard shortcut to quickly jump to the URL address field in Firefox...

    - by James Burton
    Hi, in Firefox I very often want to quickly enter a new URL in the address field. Therefore it would be very nice to be able to quickly jump to the URL address field with a keyboard shortcut! Today I must move my mouse and place the cursor in that field and also ensure that the current address is selected so I can overwrite it when entering the new URL. Very annoying! I'm sure I'm not the first one to have this need so there is probably a shortcut or an extension that does this already, but I cannot find that information! Thanks in advance, /James

    Read the article
