Search Results

Search found 5375 results on 215 pages for 'sqlserver 2005'.


  • An MCM exam, Rob? Really?

    - by Rob Farley
    I took the SQL 2008 MCM Knowledge exam while in Seattle for the PASS Summit ten days ago. I wasn't planning to do it, but I got persuaded to try. I was meaning to write this post to explain myself before the result came out, but it seems I didn't get typing quickly enough. Those of you who know me will know I'm a big fan of certification, to a point. I've been involved with Microsoft Learning to help create exams. I've kept my certifications current since I first took an exam back in 1998, sitting many in beta, across quite a variety of topics. I've probably become quite good at them – I know I've definitely passed some that I really should've failed. I've also written that I don't think exams are worth studying for. (That's probably not entirely true, but it depends on your motivation. If you're learning, I would encourage you to focus on what you need to know to do your job better. That will help you pass an exam – but the two skills are very different. I can coach someone on how to pass an exam, but that's a different kind of teaching when compared to coaching someone about how to do a job. For example, the real world includes a lot of "it depends", where you develop a feel for what the influencing factors might be. In an exam, it's better to know some of the "Don't use this technology if XYZ is true" concepts.) As for the Microsoft Certified Master certification… I'm not opposed to the idea of having the MCM (or in the future, MCSM) cert. But the barrier to entry feels quite high for me. When it was first introduced, the nearest testing centres to me were in Kuala Lumpur and Manila. Now there's one in Perth, but that's still a big effort. I know there are options in the US – such as one about an hour's drive away from downtown Seattle – but it all just seems too hard. Plus, these exams are more expensive, and all up I wasn't sure I wanted to try them, particularly given that I don't like to study. I used to study for exams. It would drive my wife crazy. I'd have some exam scheduled for some time in the future (like the time I had two booked for two consecutive days at TechEd Australia 2005), and I'd make sure I was ready. Every waking moment would be spent poring over exam material, and it wasn't healthy. I got shaken out of that, though, when I ended up taking four exams in those two days in 2005 and passed them all. I also worked out that if I had a Second Shot available, then failing wasn't a bad thing at all. Even without Second Shot, I'm much more okay about failing. But even just trying an MCM exam is a big effort. I wouldn't want to fail one of them. Plus there's the illusion to maintain. People have told me for a long time that I should just take the MCM exams – that I'd pass no problem. I've never been so sure. It was almost becoming a pride-point. Perhaps I should fail just to demonstrate that I can fail these things. Anyway – boB Taylor (@sqlboBT) persuaded me to try the SQL 2008 MCM Knowledge exam at the PASS Summit. They set up a testing centre in one of the rooms there, so it wasn't out of my way at all. I had to squeeze it in between other commitments, and I certainly didn't have time to even see what was on the syllabus, let alone study. In fact, I was so exhausted from the week that I fell asleep at least once (just for a moment though) during the actual exam. Perhaps the questions need more jokes, I'm not sure.
    I knew if I failed I might disappoint some people, but at least I wouldn't have spent a great deal of effort trying to pass. On the other hand, if I did pass I'd then be under pressure to investigate the MCM Lab exam, which can be taken remotely (and is therefore a much smaller effort to make happen). In some ways, passing could end up just putting a bunch more pressure on me. Oh, and I did.

    Read the article

  • Ardour won't rewind when set as JACK time master

    - by Edward
    Using Ubuntu Studio 12.04, Ardour will not rewind when it is set as the JACK time master. I've read that this could be due to a JACK/Ardour version conflict, but I am not sure what the correct combination should be. The same thing happens with "ardour 2.8.14 (built from revision 13065)" and "ardour 2.8.12 (built from revision 10144)"; the latter is the default installation with Ubuntu Studio 12.04 LTS. "/proc/version" reports:

        Linux version 3.2.0-23-lowlatency-pae (buildd@vernadsky) (gcc version 4.6.3 (Ubuntu/Linaro 4.6.3-1ubuntu4) ) #31-Ubuntu SMP PREEMPT Wed Apr 11 04:07:36 UTC 2012

    and "jackd --version" reports:

        jackdmp 1.9.8
        Copyright 2001-2005 Paul Davis and others.
        Copyright 2004-2011 Grame.
        jackdmp comes with ABSOLUTELY NO WARRANTY
        This is free software, and you are welcome to redistribute it
        under certain conditions; see the file COPYING for details
        jackdmp version 1.9.8 tmpdir /dev/shm protocol 8

    Thanks for any help.

    Read the article

  • Is SEO dead? Not really, says content-marketing agency Lobi: it is "undead"

    Is SEO dead? Not really, says the content-marketing agency Lobi: it is "undead". Founded in 2005, Lobi is one of the first French online content-marketing agencies. For the agency, "SEO (Ed.: Search Engine Optimization) is undead". "This is not a scoop," writes Thierry GILLMAN on the agency's blog. "Every new advance in Google's algorithm goes in the same direction: favouring useful content and penalizing junk content", that is, content supposedly recognizable because it is "over-optimized". A consequence of Google's continuous work on its algorithm? And of the fact that 99% ...

    Read the article

  • MySpace is officially up for sale, but which firm could want to buy it?

    MySpace is officially up for sale, but which firm could want to buy it? Everything comes to an end, and it seems MySpace may soon find that out. The social network, once the leader, was swept aside by the arrival of Facebook, to which the majority of its members migrated. In 2005 it was News Corporation (owned by Rupert Murdoch) that bought the site for a fortune ($580 million!). Since then its relaunch has been a failure and the slide continues. The media group has therefore decided to part with it, and is now looking for a new owner for the site. "With a new focus on content and a new structure, we think...

    Read the article

  • Why do we keep using CSV?

    - by Stephen
    Why do we keep using CSV? I recently shifted to working in the health domain and, despite the wonderful work on data transfer standards, all data transfer is in CSV, both for reporting to external organisations and for data migrations when implementing new systems. Unfortunately, the use of CSV causes the endless repetition of the same stupid errors, with the same waste of developer time (bad escaping, failing to handle null fields, etc.). I know we can do better, and anything between JSON and XML (depending on the instance) would be fine. (Most of the time this is data going from one MS SQL Server 2005 to another!) I feel as if each time I see this happening I am literally watching one developer waste another's time. So why do we keep shafting each other? When will we stop?
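
    The classic failure the poster is describing is naive parsing: splitting each line on commas breaks as soon as a field is quoted, contains an embedded comma, or is empty. A minimal C# sketch of the safer route, using the .NET framework's built-in CSV-aware parser rather than String.Split (the file name and column positions are illustrative):

        // Requires a reference to Microsoft.VisualBasic.dll; the parser works fine from C#.
        using System;
        using Microsoft.VisualBasic.FileIO;

        class CsvDemo
        {
            static void Main()
            {
                using (var parser = new TextFieldParser("export.csv"))
                {
                    parser.TextFieldType = FieldType.Delimited;
                    parser.SetDelimiters(",");
                    parser.HasFieldsEnclosedInQuotes = true; // "Smith, John" stays one field

                    while (!parser.EndOfData)
                    {
                        string[] fields = parser.ReadFields();
                        // Missing values come back as empty strings, not null;
                        // decide explicitly how they map to database NULL.
                        string surname = fields.Length > 1 ? fields[1] : null;
                        Console.WriteLine(surname);
                    }
                }
            }
        }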

    Read the article

  • Cannot set a credential for principal 'sa'

    - by hailey
    I was trying to change the sa password on my development server this morning and got an error: Msg 15535, Level 16, State 1, Line 1: Cannot set a credential for principal 'sa'. It was a little frustrating to get an error for a seemingly simple task, but then again maybe I screwed something up. After doing a couple of searches I found a Microsoft KB article (support.microsoft.com/kb/956177), "You receive an exception in SQL Server 2008 when you try to modify the properties of the SQL Server Administrator account by using SQL Server Management Studio". It was written for SQL 2008, but it worked for my SQL 2005 SP3 server just fine. You have to tick the Map to Credential check box, but you don't have to add any credentials; just click the OK button to complete, and that's it.
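
    The password change itself does not need the Login Properties dialog at all; a plain T-SQL statement avoids the SSMS code path that raises error 15535 (the password below is a placeholder):

        -- Run in a query window as a sysadmin; substitute a real password.
        ALTER LOGIN sa WITH PASSWORD = N'Str0ngPlaceholder!';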

    Read the article

  • No VB6 to VS2010 direct upgrade path

    - by Chris Williams
    From the "is this really news?" department... From looking at the currently available versions of 2010, there is no direct upgrade path from VB6 to VS2010. Anyone still using VB6 and wishing to upgrade to VS2010 has two options:  Use the upgrade tool from an earlier version of VS (like 2005 or 2008) and then run the upgrade in VS2010 to get the rest of the way... or rewrite your code. I'll leave it as an exercise to the reader which is the better option. I'd like to take a moment to point out the obvious: A) If you're still using VB6 at this point, you probably don't care about VS2010 compatibility. B) Running your code through 2 upgrade wizards isn't going to result in anything resembling best practices. C) Bemoaning the lack of support in 2010 for a 12 year old version of an extinct programming language helps nobody. This public service announcement is brought to you by the letter C. Thank you.

    Read the article

  • How to access HTML elements from server side code in an asp.net website

    - by nikolaosk
    In this post I will demonstrate, with a hands-on example, how HTML elements in an .aspx page can be processed exactly like standard ASP.NET server controls; basically, how to make them accessible from server-side code. 1) Launch Visual Studio 2010/2008/2005 (Express editions will work fine). Create a new empty website and choose a suitable name for it. Choose VB as the development language. 2) Add a new item to your site, a web form. Leave the default name. 3) Let's say that we want to change the background...(read more)
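
    The excerpt cuts off just before the key step, which is marking the element with runat="server" (plus an id) so the page compiler exposes it to code-behind as an HtmlGenericControl. A minimal sketch of the idea; the article uses VB, so this C# version and the element id are illustrative:

        using System;
        using System.Web.UI;
        using System.Web.UI.HtmlControls;

        public partial class Demo : Page
        {
            // With a partial page class this field is normally auto-generated
            // from the markup: <div id="contentDiv" runat="server">Hello</div>
            protected HtmlGenericControl contentDiv;

            protected void Page_Load(object sender, EventArgs e)
            {
                contentDiv.Style["background-color"] = "yellow"; // change the background
                contentDiv.InnerText = "Set from server-side code";
            }
        }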

    Read the article

  • What is the best practice with KML files when adding a geositemap?

    - by Floran
    I'm not sure how to deal with KML files, which are now important particularly in reference to the Google Venice update. My site is basically a guide of many company listings (a sort of Yellow Pages), and I want each company listing to have a geolocation associated with it. Which of the options I present below is the way to go?

    OPTION 1: all locations in a single KML file with a reference to that KML file from a geositemap.xml

    MYGEOSITEMAP.xml

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:geo="http://www.google.com/geo/schemas/sitemap/1.0">
          <url>
            <loc>http://www.mysite.com/locations.kml</loc>
            <geo:geo><geo:format>kml</geo:format></geo:geo>
          </url>
        </urlset>

    ALLLOCATIONS.kml

        <kml xmlns="http://www.opengis.net/kml/2.2" xmlns:atom="http://www.w3.org/2005/Atom">
          <Document>
            <name>MyCompany</name>
            <atom:author><atom:name>MyCompany</atom:name></atom:author>
            <atom:link href="http://www.mysite.com/locations/3454/MyCompany" rel="related" />
            <Placemark>
              <name>MyCompany, Kalverstraat 26 Amsterdam 1000AG</name>
              <description><![CDATA[<address><a href="http://www.mysite.com/locations/3454/MyCompany">MyCompany</a><br />Address: Kalverstraat 26, Amsterdam 1000AG <br />Phone: 0646598787</address><p>hello there, im MyCompany</p>]]></description>
              <Point><coordinates>5.420686499999965,51.6298808,0</coordinates></Point>
            </Placemark>
          </Document>
        </kml>
        <kml xmlns="http://www.opengis.net/kml/2.2" xmlns:atom="http://www.w3.org/2005/Atom">
          <Document>
            <name>MyCompany</name>
            <atom:author><atom:name>MyCompany</atom:name></atom:author>
            <atom:link href="http://www.mysite.com/locations/22/companyX" rel="related" />
            <Placemark>
              <name>MyCompany, Rosestreet 45 Amsterdam 1001XF</name>
              <description><![CDATA[<address><a href="http://www.mysite.com/locations/22/companyX">companyX</a><br />Address: Rosestreet 45, Amsterdam 1001XF <br />Phone: 0642195493</address><p>some text about companyX</p>]]></description>
              <Point><coordinates>5.520686499889632,51.6197705,0</coordinates></Point>
            </Placemark>
          </Document>
        </kml>

    OPTION 2: a separate KML file for each location and a reference to each KML file from a geositemap.xml (KML files placed in a \kmlfiles folder)

    MYGEOSITEMAP.xml

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:geo="http://www.google.com/geo/schemas/sitemap/1.0">
          <url>
            <loc>http://www.mysite.com/kmlfiles/3454_MyCompany.kml</loc>
            <geo:geo><geo:format>kml</geo:format></geo:geo>
          </url>
          <url>
            <loc>http://www.mysite.com/kmlfiles/22_companyX.kml</loc>
            <geo:geo><geo:format>kml</geo:format></geo:geo>
          </url>
        </urlset>

    3454_MyCompany.kml

        <kml xmlns="http://www.opengis.net/kml/2.2" xmlns:atom="http://www.w3.org/2005/Atom">
          <Document>
            <name>MyCompany</name>
            <atom:author><atom:name>MyCompany</atom:name></atom:author>
            <atom:link href="http://www.mysite.com/locations/3454/MyCompany" rel="related" />
            <Placemark>
              <name>MyCompany, Kalverstraat 26 Amsterdam 1000AG</name>
              <description><![CDATA[<address><a href="http://www.mysite.com/locations/3454/MyCompany">MyCompany</a><br />Address: Kalverstraat 26, Amsterdam 1000AG <br />Phone: 0646598787</address><p>hello there, im MyCompany</p>]]></description>
              <Point><coordinates>5.420686499999965,51.6298808,0</coordinates></Point>
            </Placemark>
          </Document>
        </kml>

    22_companyX.kml

        <kml xmlns="http://www.opengis.net/kml/2.2" xmlns:atom="http://www.w3.org/2005/Atom">
          <Document>
            <name>companyX</name>
            <atom:author><atom:name>companyX</atom:name></atom:author>
            <atom:link href="http://www.mysite.com/locations/22/companyX" rel="related" />
            <Placemark>
              <name>companyX, Rosestreet 45 Amsterdam 1001XF</name>
              <description><![CDATA[<address><a href="http://www.mysite.com/locations/22/companyX">companyX</a><br />Address: Rosestreet 45, Amsterdam 1001XF <br />Phone: 0642195493</address><p>some text about companyX</p>]]></description>
              <Point><coordinates>5.520686499889632,51.6197705,0</coordinates></Point>
            </Placemark>
          </Document>
        </kml>

    OPTION 3?

    Read the article

  • Feature Activation Error (Cannot Add DLL to GAC)

    - by sbtahir
    Hi, we have created a feature which has two application pages: one to activate the user control, and the other for database configuration. For the database configuration the user has to supply the .mdf and .ldf files to restore the database. For the restoration of the database we have used Microsoft.SqlServer.Replication.dll. The feature is working fine, but when we deploy it on any other machine, at activation time it gives this error: Error: Cannot add the specified assembly to the global assembly cache: Microsoft.SqlServer.Replication.dll. Does anyone know how to solve this? Thanks, SAAD
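
    One way to narrow this down is to take feature activation out of the picture and try putting the assembly into the GAC by hand on the failing machine; gacutil usually prints a more specific reason (the path below is illustrative). Note that Microsoft.SqlServer.Replication.dll also has SQL Server prerequisites, so a missing Replication Management Objects / Shared Management Objects install on the target machine is a plausible underlying cause:

        rem Run from a Visual Studio or Windows SDK command prompt on the target machine
        gacutil /i "C:\Deploy\Microsoft.SqlServer.Replication.dll"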

    Read the article

  • Oracle Hyperion confirmed as a leader in the 2012 Gartner Magic Quadrant

    - by Andrea Cravero
    The 2012 edition of the Gartner Magic Quadrant for Corporate Performance Management Suites confirms Oracle Hyperion's leadership, unbroken since 2005. According to Gartner, "Oracle is a Leader in CPM suites, with one of the most widely distributed solutions in the market. Oracle Hyperion Enterprise Performance Management is recognized by CFOs worldwide. The vendor has a well-established partner channel, with both large and smaller CPM SI specialists. Hyperion skills are also plentiful among the independent consultant community, given the well-established products." "Oracle continues to innovate, bringing incremental improvements across the portfolio as well as new financial close management, disclosure management and predictive planning additions. Furthermore, Oracle has improved integration of Hyperion with the Oracle BI platform, and has improved planning performance, enabling Hyperion Planning to use Oracle Exalytics In-Memory Machine." The full report is available here: Gartner: Magic Quadrant for Corporate Performance Management Suites, 2012. Happy reading!

    Read the article

  • Prologue…

    - by Chris Stewart
    Hi all,                 This is the beginning of my blog. I have been a software developer for going on 11 years using the Microsoft toolset (primarily VB 5, VB 6, VB.net and SQL Server 6, 7, 2000, 2005, 2008). My coding interests are C#, ASP.net, SQL and XNA. Here I will post my musings, things of interest, techie babble and sometimes random gibberish. My hope for this blog is to document my learning experiences with C#, help, encourage and in some way be useful to those who come after me. Thanks for reading!

    Read the article

  • Talend Enterprise Data Integration overperforms on Oracle SPARC T4

    - by Amir Javanshir
    The SPARC T microprocessor, released in 2005 by Sun Microsystems and now continued at Oracle, has a good track record in parallel execution and multi-threaded performance. However, it was less suited to pure single-threaded workloads. The new SPARC T4 processor fills that gap by offering 5x better single-thread performance than previous generations. Following our long-term relationship with Talend, a fast-growing ISV positioned by Gartner in the "Visionaries" quadrant of the "Magic Quadrant for Data Integration Tools", we decided to test some of their integration components with the T4 chip, more precisely on a T4-1 system, in order to verify firsthand whether this new processor stands up to its promises. Several tests were performed, mainly focused on:
    * single-thread performance of the new SPARC T4 processor compared to an older SPARC T2+ processor;
    * overall throughput of the SPARC T4-1 server using multiple threads.

    The tests consisted of reading large amounts of data (tens of gigabytes), processing it, and writing it back to a file or an Oracle 11gR2 database table. They are CPU-, memory- and IO-bound tests. Given the main focus of this project (CPU performance), bottlenecks on the memory and IO sub-systems were removed as much as possible; when possible, the data to process was put into the ZFS filesystem cache, for instance. Also, two external storage devices were directly attached to the servers under test, each one divided into two ZFS pools for read and write operations.

    Multi-thread: testing throughput on the Oracle T4-1
    The tests were performed with different numbers of simultaneous threads (1, 2, 4, 8, 12, 16, 32, 48 and 64) and using different storage devices: Flash, Fibre Channel storage, two striped internal disks, and one single internal disk. All storage devices used ZFS for filesystem and volume management. Each thread read a dedicated 1 GB file containing 12.5M lines with the following structure:

        customerID;FirstName;LastName;StreetAddress;City;State;Zip;Cust_Status;Since_DT;Status_DT
        1;Ronald;Reagan;South Highway;Santa Fe;Montana;98756;A;04-06-2006;09-08-2008
        2;Theodore;Roosevelt;Timberlane Drive;Columbus;Louisiana;75677;A;10-05-2009;27-05-2008
        3;Andrew;Madison;S Rustle St;Santa Fe;Arkansas;75677;A;29-04-2005;09-02-2008
        4;Dwight;Adams;South Roosevelt Drive;Baton Rouge;Vermont;75677;A;15-02-2004;26-01-2007
        [...]

    The results: unsurprisingly, up to 16 threads all files fit in the ZFS cache (a.k.a. the L2ARC), and once the cache is hot there is no performance difference between the underlying storage devices. From 16 threads upwards, however, it is clear that IO becomes a bottleneck, so having a good IO subsystem is key. Single-disk performance collapses, whereas the Sun F5100 and ST6180 arrays allow the T4-1 to scale quite seamlessly. From 32 to 64 threads, the performance is almost constant, with just a slow decline. For the database load tests, only the best IO configuration (using the external storage devices) was used, hosting the Oracle tablespaces and redo log files. Using the Sun Storage F5100 array allows the T4-1 server to scale up to 48 parallel JVM processes before saturating the CPU. The final result is a staggering 646K lines per second inserted into an Oracle table using 48 parallel threads.

    Single-thread: testing single-thread performance
    Seven different tests were performed on both servers.
    Given that only one thread, and thus one file, was read, no IO bottleneck was involved; all data was served from the ZFS cache.
    * Read File → Filter → Write File: read a file, filter the data, and write the filtered data to a new file. The filter is set on the "Status" column: only lines with status "A" are selected. This limits each output file to about 500 MB.
    * Read File → Load Database Table: read a file and insert the rows into a single Oracle table.
    * Average: read a file, compute the average of a numeric column, and write the result to a new file.
    * Division & Square Root: read a file, perform a division and a square root on a numeric column, and write the resulting data to a new file.
    * Oracle DB Dump: dump the content of an Oracle table (12.5M rows) into a CSV file.
    * Transform: read a file, transform it, and write the resulting data to a new file. The transformations applied are: set the address column to upper case and add an extra column at the end, which is the concatenation of two columns.
    * Sort: read a file, sort a numeric and an alphanumeric column, and write the resulting data to a new file.

    The following table presents the final results of the tests. The throughput unit is thousands of lines processed per second (K lines/second); Improvement is the percentage improvement between the T5140 and the T4-1.

        Test                    T4-1 (Time s.)  T5140 (Time s.)  Improvement  T4-1 (Throughput)  T5140 (Throughput)
        Read/Filter/Write       125             806              645%         100                16
        Read/Load Database      195             1111             570%         64                 11
        Average                 96              557              580%         130                22
        Division & Square Root  161             1054             655%         78                 12
        Oracle DB Dump          164             945              576%         76                 13
        Transform               159             1124             707%         79                 11
        Sort                    251             1336             532%         50                 9

    The improvement in single-thread performance is quite dramatic: depending on the test, the T4 is between 5.4 and 7 times faster than the T2+. It seems clear that the SPARC T4 processor has gone a long way toward filling the gap in single-thread performance, without sacrificing multi-threaded capability, as it still shows very impressive scaling on heavy-duty multi-threaded jobs. Finally, as always at Oracle ISV Engineering, we are happy to help our ISV partners test their own applications on our platforms, so don't hesitate to contact us and let's see what the SPARC T4-based systems can do for your application!

    "As described in this benchmark, Talend Enterprise Data Integration has overperformed on T4. I was generally happy to see that the T4 gave scaling opportunities for many scenarios like complex aggregations. Row by row insertion in Oracle DB is faster with more than 650,000 rows per second without using any bulk Oracle capabilities!" Cedric Carbone, Talend CTO.

    Read the article

  • [News] Microsoft announces special offers around VS 2010

    Through Somasegar, Microsoft has just announced the promotions that will accompany the retail distribution of Visual Studio 2010. Anyone buying a Visual Studio 2005 or 2008 license in the Standard edition will be offered a Professional edition of VS 2010, plus a one-year MSDN Essentials subscription. "Today, we're also unveiling an offer for customers who purchase Visual Studio Professional at retail. To help these developers fully realize the power and benefits of a MSDN subscription, I am announcing MSDN Essentials, a one-year trial MSDN subscription that will be included with every retail copy of Visual Studio Professional sold." It is also possible to pre-order VS 2010 ahead of availability ...

    Read the article

  • Apache Maven 3 announced for next quarter: backward compatibility and improved performance for the future version of the project

    Update of 06/04/10: Apache Maven 3 announced for next quarter, bringing backward compatibility and improved performance for the future version of the project. Maven 3 should arrive within the next two to three months. At least, that is what Jason Van Zyl, its creator and CTO of the company Sonatype, has just told the press. Maven 3 will be the first major milestone for the Apache project since 2005 and the release of Maven 2. It will also be the first release of a technology under the aegis of Sonatype, which is offering new commercial support and a new ecosystem around the project. Before the release of Maven 3, a cycle of beta versions...

    Read the article

  • SQL SERVER – Number-Crunching with SQL Server – Exceed the Functionality of Excel

    - by Pinal Dave
    Imagine this. Your users have developed an Excel spreadsheet that extracts data from your SQL Server database, manipulates that data through the use of Excel formulas and, possibly, some VBA code, and is then used to calculate P&L, hedging requirements or even risk numbers. Management comes to you and tells you that they need to get rid of the spreadsheet and that the results of the spreadsheet calculations need to be persisted in the database. SQL Server has a very small set of functions for analyzing data. Excel has hundreds of functions for analyzing data, with many of them focused on specific financial and statistical calculations. Is it even remotely possible that you can use SQL Server to replace the complex calculations being done in a spreadsheet? Westclintech has developed a library of functions that match or exceed the functionality of Excel's functions and contain many functions that are not available in Excel. Their XLeratorDB library contains over 700 functions that can be incorporated into T-SQL statements. XLeratorDB takes advantage of the SQL CLR architecture introduced in SQL Server 2005. SQL CLR permits managed code to be compiled into the database and run alongside built-in SQL Server functions like COUNT or SUM. The Westclintech developers have taken advantage of this architecture to bring robust analytical functions to the database. In our hypothetical spreadsheet, let's assume that our users are using the YIELD function and that the data are extracted from a table in our database called BONDS. We go to column G and see that it contains the following formula:

        =YIELD(TODAY(),B2,C2,D2,100,E2,F2)

    Obviously, SQL Server does not offer a native YIELD function. However, with XLeratorDB we can replicate this calculation in SQL Server with the following statement:

        SELECT *,
            wct.YIELD(CAST(GETDATE() AS date),Maturity,Rate,Price,100,Frequency,Basis) AS YIELD
        FROM BONDS

    This illustrates one of the best features of XLeratorDB: it is so easy to use. Since I knew that the spreadsheet was using the YIELD function, I could use the same function with the same calling structure to do the calculation in SQL Server. I didn't need to know anything at all about the mechanics of calculating the yield on a bond. It was pretty close to cut and paste. In fact, that's one way to construct the SQL: just copy the function call from the cell in the spreadsheet, paste it into SSMS, and change the cell references to column names. I built the SQL for this query by starting with this:

        SELECT *
            ,YIELD(TODAY(),B2,C2,D2,100,E2,F2)
        FROM BONDS

    I then changed the cell references to column names:

        SELECT *
            --,YIELD(TODAY(),B2,C2,D2,100,E2,F2)
            ,YIELD(TODAY(),Maturity,Rate,Price,100,Frequency,Basis)
        FROM BONDS

    Finally, I replicated the TODAY() function using GETDATE() and added the schema name to the function name:

        SELECT *
            --,YIELD(TODAY(),B2,C2,D2,100,E2,F2)
            --,YIELD(TODAY(),Maturity,Rate,Price,100,Frequency,Basis)
            ,wct.YIELD(GETDATE(),Maturity,Rate,Price,100,Frequency,Basis)
        FROM BONDS

    I can then execute the statement and return the yield for every row in BONDS. The XLeratorDB libraries are heavy on financial, statistical, and mathematical functions. Where there is an analog to an Excel function, the XLeratorDB function uses the same naming conventions and calling structure as the Excel function, but there are also hundreds of additional functions for SQL Server that are not found in Excel.
    You can find the functions by opening Object Explorer in SQL Server Management Studio (SSMS) and expanding the Programmability folder under the database where the functions have been installed. The Functions folder expands to show four sub-folders: Table-valued Functions, Scalar-valued Functions, Aggregate Functions, and System Functions. You can expand any of the first three folders to see the XLeratorDB functions. Since wct.YIELD is a scalar function, we open the Scalar-valued Functions folder, scroll down to the wct.YIELD function, and click the plus sign (+) to display the input parameters. The functions are also IntelliSense-enabled, with the input parameters displayed directly in the query tab. The Westclintech website contains documentation for all the functions, including examples that can be copied directly into a query window and executed. There are also more than one hundred articles on the site which go into more detail about how some of the functions work and demonstrate some of the extensive business processes that can be done in SQL Server using XLeratorDB functions and some T-SQL. XLeratorDB is organized into libraries: finance, statistics, math, strings, engineering, and financial options. There is also a windowing library for SQL Server 2005, 2008, and 2012 which provides functions for calculating things like running and moving averages (which were introduced in SQL Server 2012), FIFO inventory calculations, financial ratios and more, without having to use triangular joins. To get started you can download the XLeratorDB 15-day free trial from the Westclintech web site. It is a fully functioning, unrestricted version of the software. If you need more than 15 days to evaluate the software, you can simply download another 15-day free trial. XLeratorDB is an easy and cost-effective way to start adding sophisticated data analysis to your SQL Server database without having to know anything more than T-SQL. Get XLeratorDB Today and Now! Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: Excel
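
    To make the "triangular joins" remark concrete: before windowed aggregates, a running total was typically built with a self-join whose row comparisons grow quadratically with table size. A hedged T-SQL sketch, with hypothetical table and column names:

        -- Pre-2012 pattern: 'triangular' self-join running total (O(n^2) comparisons)
        SELECT a.TradeDate, SUM(b.Amount) AS RunningTotal
        FROM dbo.Trades a
        JOIN dbo.Trades b ON b.TradeDate <= a.TradeDate
        GROUP BY a.TradeDate;

        -- SQL Server 2012 windowed equivalent: a single ordered pass
        SELECT TradeDate,
               SUM(Amount) OVER (ORDER BY TradeDate ROWS UNBOUNDED PRECEDING) AS RunningTotal
        FROM dbo.Trades;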

    Read the article

  • Using Visual Studio as a Task-Focused IDE

    - by Jay Stevens
    Are there patterns, libraries, or any official Microsoft SDK for using Visual Studio as a specifically task-focused IDE? For example, both Revolution R (an IDE for the R language) and SQL 2012 (and I think SQL 2008 and possibly 2005) use Visual Studio as the underlying IDE framework. Is there an officially supported SDK and/or examples/samples for doing this type of thing? I am building a language parser for an existing language, whose only available IDE is INSANELY expensive, using Irony (and eventually I will generate a Language Service as well). Any direct or indirect suggestions/answers are appreciated.

    Read the article

  • Is this overkill? Using MDX queries and cubes instead of SQL stored procedures

    - by Jason Holland
    I am new to Microsoft's SQL Server Analysis Services cubes and MDX queries. Where I work, we have a daily sales table in SQL Server 2005 that already contains an aggregate of sales information per store per day. At this time it contains only 164,000+ rows. We have a sales cube dedicated to this table that about 15 reports are based on. I should also note that we generate reports based on our own fiscal-year criteria: a 13-period year (one "month" equals 28 days, etc.). Is this overkill? At what point is it justified to begin using SSAS cubes/MDX over plain old SQL Server stored procedures? Since I have always used plain old SQL, am I tragically late to the MDX party?
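
    For comparison, this is roughly what one such report reads like in MDX once a cube exists; every cube, dimension, and measure name below is hypothetical:

        -- A minimal MDX query: sales by store for one fiscal period
        SELECT
            { [Measures].[Sales Amount] } ON COLUMNS,
            { [Store].[Store Name].Members } ON ROWS
        FROM [DailySales]
        WHERE ( [Fiscal Period].[Period].&[1] )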

    Read the article

  • R2 and Idera SQL Safe (Freeware Edition)

    - by DavidWimbush
    Good news: the Freeware edition of Idera SQL Safe works on R2. You might not care but I certainly do. Here's why:  In September last year I started using Idera SQL Safe (the Freeware Edition) to get backup compression on my SQL 2005 servers. It seemed like a good idea at the time - it was free and my backups ran much faster and took up much less disk space. I really thought I'd actually scored a free lunch. Until they discontinued the product. I was thinking about what to do when I heard that R2 Standard would include native backup compression so I've just been keeping my fingers crossed since then. So I installed R2 Developer on my laptop, installed SQL Safe and kicked off a restore with it. No problem. Phew! Now I won't have to do a special, non-compressed backup and restore when we migrate.
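
    For anyone making the same migration, the native replacement for SQL Safe's compression on R2 is just a backup option; a hedged sketch (the database name and path are placeholders):

        -- Native backup compression; available in Standard edition from SQL Server 2008 R2
        BACKUP DATABASE MyDatabase
        TO DISK = N'D:\Backups\MyDatabase.bak'
        WITH COMPRESSION;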

    Read the article

  • Best Practices Question on using an ObjectDataSource in asp.net

    - by Lill Lansey
    ASP.NET, C#, VS 2008, SQL Server 2005. I am filling a DataTable in the data access layer with data from a SQL Server stored procedure. Best-practices question: is it OK to pass the DataTable to the business layer and use that DataTable from the business layer for an ObjectDataSource in the presentation layer, or should I transfer the data in the DataTable into a List and use the List for the ObjectDataSource in the presentation layer? If I should transfer the data to a List, should that be done in the data access layer or the business layer? Does it make a difference if the data needs to be edited before being displayed?
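
    If the typed-List route is chosen, the mapping is usually done where the business rules live, so edits happen on real objects before binding. A hedged C# sketch; the Customer type, column names, and the DAL call are all illustrative:

        using System;
        using System.Collections.Generic;
        using System.Data;

        public class Customer
        {
            public int Id { get; set; }
            public string Name { get; set; }
        }

        public class CustomerService
        {
            // Business layer: turn the DAL's DataTable into a typed list that an
            // ObjectDataSource can bind to via SelectMethod="GetCustomers".
            public List<Customer> GetCustomers(DataTable table)
            {
                var customers = new List<Customer>();
                foreach (DataRow row in table.Rows)
                {
                    customers.Add(new Customer
                    {
                        Id = (int)row["CustomerID"],
                        // Map database NULL explicitly instead of letting a cast throw
                        Name = row["Name"] == DBNull.Value ? null : (string)row["Name"]
                    });
                }
                return customers;
            }
        }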

    Read the article

  • SqlMetal, SQL Server 2008 database, table with HierarchyID: DAL .cs file is created only sometimes?

    - by judek.mp
    I have two databases, each with a table containing a HierarchyID field. For one database I can get a DAL .cs file; for the other I cannot. HBus is the database I can get the DAL for:

        SqlMetal /server:.\SQLSERVER2008 /database:HBus /code:HBusDC.cs /views /functions /sprocs /namespace:HBusDC /context:HBusDataContext

    This gives me a file which works, but it excludes the HierarchyID field for the table while including all the other fields for that table. This is OK; I do not mind. The command emits a warning but still produces a file:

        Microsoft (R) Database Mapping Generator 2008 version 1.00.30729
        for Microsoft (R) .NET Framework version 3.5
        Copyright (C) Microsoft Corporation. All rights reserved.

        Warning : SQM1021: Unable to extract column 'OrgNode' of Table 'dbo.HMsg' from SqlServer because the column's DbType is a user-defined type (UDT).
        Warning : SQM1021: Unable to extract column 'OrgNode' of Table 'dbo.vwHMsg' from SqlServer because the column's DbType is a user-defined type (UDT).

    HMsg is a table with a HierarchyID field. I have another database, Elf, that is almost the same, but there I get a warning as well as an error when using SqlMetal, and no DAL .cs file appears on my disc:

        SqlMetal /server:.\SQLSERVER2008 /database:Elf /code:ElfDataContextDal.cs /views /functions /sprocs /namespace:HBusDC /context:HBusDataContext

        Microsoft (R) Database Mapping Generator 2008 version 1.00.30729
        for Microsoft (R) .NET Framework version 3.5
        Copyright (C) Microsoft Corporation. All rights reserved.

        Warning : SQM1021: Unable to extract column 'OrgNode' of Table 'dbo.EntityLink' from SqlServer because the column's DbType is a user-defined type (UDT).
        Error : Requested value 'ELF.SYS.HIERARCHYID' was not found.

    The fields are declared the same way: in the Elf db, OrgNode [HierarchyID] null; in the HBus db, OrgNode [HierarchyID] null. Both databases are in the same instance of SQL Server 2008, so HierarchyID is an inbuilt type; neither db has a HierarchyID UDT. Cheers in advance for any replies.
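
    Not an answer to why the two databases behave differently, but a common way to get SqlMetal past a hierarchyid column is to map a view that exposes it as text instead. A hedged sketch using the table name from the question (the key column is illustrative):

        -- hierarchyid cannot be mapped by SqlMetal; expose it as a string instead
        CREATE VIEW dbo.vwEntityLinkForLinq
        AS
        SELECT LinkId,                       -- illustrative key column
               OrgNode.ToString() AS OrgPath -- canonical '/1/2/' path text
        FROM dbo.EntityLink;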

    Read the article

  • Cooperator Framework

    - by csharp-source.net
    Cooperator Framework is a base class library for high-performance Object Relational Mapping (ORM), and a code generation tool that aids agile application development for Microsoft .NET Framework 2.0/3.0. The main features are:
    * Uses business entities.
    * Fully typed model (data layer and entities).
    * Maintains persistence across the layers by passing specific types (.NET 2.0/3.0 generics).
    * Business objects can bind to controls in Windows Forms and Web Forms, taking advantage of the data binding of Visual Studio 2005.
    * Supports any primary key defined on tables, with no need to modify it or to create a unique field.
    * Uses stored procedures for data access.
    * Supports concurrency.
    * Generates code both for stored procedures and projects, in C# or Visual Basic.
    * Maintains the model in a repository, which can be modified at any stage of the development cycle, regenerating the model on demand.

    Read the article

  • Java EE at JavaOne - A Few Picks from a Very Rich Line-up

    - by Janice J. Heiss
    A rich and diverse set of sessions casts a spotlight on Java EE at this year's JavaOne, ranging from the popular Web Framework Smackdown, to Java EE 6 and Spring, to sessions exploring Java EE 7, and one on the implications of HTML5. Some of the world's best EE architects and developers will be sharing their insight and expertise. If only I could be in ten places at once!

    BOF4149 - Web Framework Smackdown 2012
    Markus Eisele (Principal IT Architect, msg systems ag), Graeme Rocher (Senior Staff Engineer, VMware), James Ward (Developer Evangelist, Heroku), Ed Burns (Consulting Member of Technical Staff, Oracle), Santiago Pericasgeertsen (Software Engineer, Oracle)
    Monday, Oct 1, 8:30 PM - 9:15 PM, Parc 55 - Cyril Magnin II/III
    Much has changed since the first Web framework smackdown, at JavaOne 2005. Or has it? The 2012 edition of this popular panel discussion surveys the current landscape of Web UI frameworks for the Java platform. The 2005 edition featured JSF, Webwork, Struts, Tapestry, and Wicket. The 2012 edition features representatives of the current crop of frameworks, with a special emphasis on frameworks that leverage HTML5 and thin-server architecture. Java Champion Markus Eisele leads the lively discussion with panelists James Ward (Play), Graeme Rocher (Grails), Edward Burns (JSF) and Santiago Pericasgeertsen (Avatar).

    CON6430 - Java EE and Spring Framework Panel Discussion
    Richard Hightower (Developer, InfoQ), Bert Ertman (Fellow, Luminis), Gordon Dickens (Technical Architect, IT101, Inc.), Chris Beams (Senior Technical Staff, VMware), Arun Gupta (Technology Evangelist, Oracle)
    Tuesday, Oct 2, 10:00 AM - 11:00 AM, Parc 55 - Cyril Magnin II/III
    In the age of Java EE 6 and Spring 3, enterprise Java developers have many architectural choices, including Java EE 6 and Spring, but which one is right for your project? Many of us have heard the debate and seen the flame wars; it's a topic with passionate community members, and it's a vibrant debate. If you are looking for some level-headed discussion, grounded in real experience, by developers who have tried both, then come join this discussion. InfoQ's Java editors moderate the discussion, and they are joined by independent consultants and representatives from both Java EE and VMware/SpringSource.

    BOF4213 - Meet the Java EE 7 Specification Leads
    Linda Demichiel (Consulting Member of Technical Staff, Oracle), Bill Shannon (Architect, Oracle)
    Tuesday, Oct 2, 5:30 PM - 6:15 PM, Parc 55 - Cyril Magnin II/III
    This is your chance to meet face-to-face with the engineers who are developing the next version of the Java EE platform. In this session, the specification leads for the leading technologies that are part of the Java EE 7 platform discuss new and upcoming features and answer your questions. Come prepared with your questions, your feedback, and your suggestions for new features in Java EE 7 and beyond.

    CON10656 - JavaEE.Next(): Java EE 7, 8, and Beyond
    Ian Robinson (IBM Distinguished Engineer, IBM), Mark Little (JBoss CTO), Scott Ferguson (Developer, Caucho Technology), Cameron Purdy (VP Development, Oracle)
    Wednesday, Oct 3, 4:30 PM - 5:30 PM, Parc 55 - Cyril Magnin II/III
    In this session, hear from a distinguished panel of industry and open source luminaries regarding where they believe the Java EE community is headed, starting with Java EE 7. The focus of Java EE 7 and 8 is mostly on the cloud, specifically aiming to bring platform-as-a-service (PaaS) providers and application developers together so that portable applications can be deployed on any cloud infrastructure and reap all its benefits in terms of scalability, elasticity, multitenancy, and so on. Most importantly, Java EE will leverage the modularization work in the underlying Java SE platform. Java EE will, of course, also update itself for trends such as HTML5, caching, NoSQL, polyglot programming, map/reduce, JSON, REST, and improvements to existing core APIs.

    CON7001 - HTML5 WebSocket and Java
    Danny Coward (Java, Oracle)
    Wednesday, Oct 3, 4:30 PM - 5:30 PM, Parc 55 - Cyril Magnin I
    The family of HTML5 technologies has pushed the pendulum away from rich client technologies and toward ever-more-capable Web clients running on today's browsers. In particular, WebSocket brings new opportunities for efficient peer-to-peer communication, providing the basis for a new generation of interactive and "live" Web applications. This session examines the efforts under way to support WebSocket in the Java programming model, from its base-level integration in the Java Servlet and Java EE containers to a new, easy-to-use API and toolset that are destined to become part of the standard Java platform.

    Read the article

  • Upgrading SSIS Custom Components for SQL Server 2012

    Having finally got around to upgrading my custom components to SQL Server 2012, I thought I'd share some notes on the process. One of the goals was minimal duplication, so the same code files are used to build the 2008 and 2012 components; I just have a separate project file. The high-level steps are listed below, followed by some more details.
    1. Create a 2012 copy of the project file.
    2. Upgrade the project: just open the new project file in VS2010.
    3. Change the target framework to .NET 4.0.
    4. Set a conditional compilation symbol for DENALI.
    5. Change any conditional code, including the assembly version and UI type name.
    6. Edit the project file to change the referenced assemblies for 2012.

    Change target framework to .NET 4.0
    Open the project properties. On the Application page, change the Target framework to .NET Framework 4.

    Set conditional compilation symbol for DENALI
    Re-open the project properties. On the Build tab, first change the Configuration to All Configurations, then set a Conditional compilation symbol of DENALI.

    Change any conditional code, including assembly version and UI type name
    The value doesn't have to be DENALI; it can actually be anything you like, that is just what I use. It is how I control sections of code that vary between versions. There were several API changes between 2005 and 2008, as well as interface name changes. Whilst we don't have the same issues between 2008 and 2012, I still have some sections of code that do change, such as the assembly attributes.

        #if DENALI
        [assembly: AssemblyDescription("Data Generator Source for SQL Server Integration Services 2012")]
        [assembly: AssemblyCopyright("Copyright © 2012 Konesans Ltd")]
        [assembly: AssemblyVersion("3.0.0.0")]
        #else
        [assembly: AssemblyDescription("Data Generator Source for SQL Server Integration Services 2008")]
        [assembly: AssemblyCopyright("Copyright © 2008 Konesans Ltd")]
        [assembly: AssemblyVersion("2.0.0.0")]
        #endif

    The Visual Studio editor automatically formats the code based on the current compilation symbols; hence in this case the 2008 code is greyed out to indicate it is disabled. As you can see in the previous example, I have distinct assembly version attributes, ensuring I can run the 2008 and 2012 versions of my component side by side. For custom components with a user interface, be sure to update the UITypeName property of the DtsTask or DtsPipelineComponent attributes. As above, I use the conditional compilation symbol to control the code.

        #if DENALI
        [DtsTask
        (
            DisplayName = "File Watcher Task",
            Description = "File Watcher Task",
            IconResource = "Konesans.Dts.Tasks.FileWatcherTask.FileWatcherTask.ico",
            UITypeName = "Konesans.Dts.Tasks.FileWatcherTask.FileWatcherTaskUI,Konesans.Dts.Tasks.FileWatcherTask,Version=3.0.0.0,Culture=Neutral,PublicKeyToken=b2ab4a111192992b",
            TaskContact = "File Watcher Task; Konesans Ltd; Copyright © 2012 Konesans Ltd; http://www.konesans.com"
        )]
        #else
        [DtsTask
        (
            DisplayName = "File Watcher Task",
            Description = "File Watcher Task",
            IconResource = "Konesans.Dts.Tasks.FileWatcherTask.FileWatcherTask.ico",
            UITypeName = "Konesans.Dts.Tasks.FileWatcherTask.FileWatcherTaskUI,Konesans.Dts.Tasks.FileWatcherTask,Version=2.0.0.0,Culture=Neutral,PublicKeyToken=b2ab4a111192992b",
            TaskContact = "File Watcher Task; Konesans Ltd; Copyright © 2004-2008 Konesans Ltd; http://www.konesans.com"
        )]
        #endif
        public sealed class FileWatcherTask: Task, IDTSComponentPersist, IDTSBreakpointSite, IDTSSuspend
        {
            // .. code goes on...
        }

    Shown below is another example I found that needed changing. I borrow one of the MS editors and use it against a custom property, but I need to ensure I reference the correct version of the MS controls assembly. This section of code is actually shared between the 2005, 2008 and 2012 versions of my component, hence it tests for both the DENALI and KATMAI symbols.

        #if DENALI
        const string multiLineUI = "Microsoft.DataTransformationServices.Controls.ModalMultilineStringEditor, Microsoft.DataTransformationServices.Controls, Version=11.0.00.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91";
        #elif KATMAI
        const string multiLineUI = "Microsoft.DataTransformationServices.Controls.ModalMultilineStringEditor, Microsoft.DataTransformationServices.Controls, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91";
        #else
        const string multiLineUI = "Microsoft.DataTransformationServices.Controls.ModalMultilineStringEditor, Microsoft.DataTransformationServices.Controls, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91";
        #endif

        // Create Match Expression parameter
        IDTSCustomPropertyCollection100 propertyCollection = outputColumn.CustomPropertyCollection;
        IDTSCustomProperty100 property = propertyCollection.New();
        property.Name = MatchParams.Name;
        property.Description = MatchParams.Description;
        property.TypeConverter = typeof(MultilineStringConverter).AssemblyQualifiedName;
        property.UITypeEditor = multiLineUI;
        property.Value = MatchParams.DefaultValue;

    Edit project file to change referenced assemblies for 2012
    We now need to edit the project file itself. Open MyComponent2012.csproj in your favourite text editor and perform a couple of find-and-replaces, as listed below:

        Find:    Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91
        Replace: Version=11.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91
        Comment: Changes the assembly reference versions from SQL Server 2008 to SQL Server 2012.

        Find:    Microsoft SQL Server\100\
        Replace: Microsoft SQL Server\110\
        Comment: Changes any assembly reference hint path locations from SQL Server 2008 to SQL Server 2012.

    If you use any Build Events during development, such as copying the component assembly to the DTS folder, or calling GACUTIL to install it into the GAC, you can also change these now. An example of my new post-build event for a pipeline component is shown below, which uses the .NET 4.0 path for GACUTIL. It also uses the 110 folder location instead of 100, but that was covered in the previous find-and-replace.

        "C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\NETFX 4.0 Tools\gacutil.exe" /if "$(TargetPath)"
        copy "$(TargetPath)" "%ProgramFiles%\Microsoft SQL Server\110\DTS\PipelineComponents" /Y

    Read the article

  • [JDBCExceptionReporter] SQL Warning in Hibernate

    - by adisembiring
    Hi all, I'm using Hibernate 3.2.1 with a SQL Server 2000 database. While trying to insert some data using my DAO, this warning occurred:

        java.sql.SQLWarning: [Microsoft][SQLServer 2000 Driver for JDBC]Database changed to BTN_SPP_DB
            at com.microsoft.jdbc.base.BaseWarnings.createSQLWarning(Unknown Source)
            at com.microsoft.jdbc.base.BaseWarnings.get(Unknown Source)
            at com.microsoft.jdbc.base.BaseConnection.getWarnings(Unknown Source)
            at org.hibernate.util.JDBCExceptionReporter.logAndClearWarnings(JDBCExceptionReporter.java:22)
            at org.hibernate.jdbc.ConnectionManager.closeConnection(ConnectionManager.java:443)
            at org.hibernate.jdbc.ConnectionManager.aggressiveRelease(ConnectionManager.java:400)
            at org.hibernate.jdbc.ConnectionManager.afterTransaction(ConnectionManager.java:287)
            at org.hibernate.jdbc.JDBCContext.afterTransactionCompletion(JDBCContext.java:221)
            at org.hibernate.transaction.JDBCTransaction.commit(JDBCTransaction.java:119)
            at co.id.hanoman.btnmw.spp.dao.TagihanDao.save(TagihanDao.java:43)
            at co.id.hanoman.btnmw.spp.dao.TagihanDao.save(TagihanDao.java:1)
            at co.id.hanoman.btnmw.spp.dao.test.TagihanDaoTest.testSave(TagihanDaoTest.java:81)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
            at java.lang.reflect.Method.invoke(Unknown Source)
            at org.junit.internal.runners.TestMethodRunner.executeMethodBody(TestMethodRunner.java:99)
            at org.junit.internal.runners.TestMethodRunner.runUnprotected(TestMethodRunner.java:81)
            at org.junit.internal.runners.BeforeAndAfterRunner.runProtected(BeforeAndAfterRunner.java:34)
            at org.junit.internal.runners.TestMethodRunner.runMethod(TestMethodRunner.java:75)
            at org.junit.internal.runners.TestMethodRunner.run(TestMethodRunner.java:45)
            at org.junit.internal.runners.TestClassMethodsRunner.invokeTestMethod(TestClassMethodsRunner.java:66)
            at org.junit.internal.runners.TestClassMethodsRunner.run(TestClassMethodsRunner.java:35)
            at org.junit.internal.runners.TestClassRunner$1.runUnprotected(TestClassRunner.java:42)
            at org.junit.internal.runners.BeforeAndAfterRunner.runProtected(BeforeAndAfterRunner.java:34)
            at org.junit.internal.runners.TestClassRunner.run(TestClassRunner.java:52)
            at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:45)
            at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
            at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
            at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
            at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
            at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)

    My Hibernate initialization is:

        2010-04-26 22:54:05,203 INFO [Version] Hibernate Annotations 3.3.0.GA
        2010-04-26 22:54:05,234 INFO [Environment] Hibernate 3.2.1
        2010-04-26 22:54:05,234 INFO [Environment] hibernate.properties not found
        2010-04-26 22:54:05,234 INFO [Environment] Bytecode provider name : cglib
        2010-04-26 22:54:05,234 INFO [Environment] using JDK 1.4 java.sql.Timestamp handling
        2010-04-26 22:54:05,343 INFO [Configuration] configuring from resource: /hibernate.cfg.xml
        2010-04-26 22:54:05,343 INFO [Configuration] Configuration resource: /hibernate.cfg.xml
        2010-04-26 22:54:05,406 DEBUG [DTDEntityResolver] trying to resolve system-id [http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd]
        2010-04-26 22:54:05,406 DEBUG [DTDEntityResolver] recognized hibernate namespace; attempting to resolve on classpath under org/hibernate/
        2010-04-26 22:54:05,406 DEBUG [DTDEntityResolver] located [http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd] in classpath
        2010-04-26 22:54:05,453 DEBUG [Configuration] hibernate.dialect=org.hibernate.dialect.SQLServerDialect
        2010-04-26 22:54:05,453 DEBUG [Configuration] hibernate.connection.driver_class=com.microsoft.jdbc.sqlserver.SQLServerDriver
        2010-04-26 22:54:05,453 DEBUG [Configuration] hibernate.connection.url=jdbc:microsoft:sqlserver://12.56.11.65:1433;databaseName=BTN_SPP_DB
        2010-04-26 22:54:05,453 DEBUG [Configuration] hibernate.connection.username=spp
        2010-04-26 22:54:05,453 DEBUG [Configuration] hibernate.connection.password=spp
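
    The warning itself is informational: the driver is reporting that the connection's database context changed to BTN_SPP_DB, and Hibernate's JDBCExceptionReporter surfaces every pending driver warning when it closes a connection. If it is only noise, one hedged option (assuming log4j is the logging backend, as the logger name in the trace suggests) is to raise that category's threshold in log4j.properties:

        # Hide benign driver warnings that Hibernate logs on connection close
        log4j.logger.org.hibernate.util.JDBCExceptionReporter=ERROR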

    Read the article
