Search Results

Search found 6695 results on 268 pages for 'news analysis'.


  • Oracle Buys Compendium - Adds Leading Content Marketing Platform to Oracle Eloqua Marketing Cloud

    - by Richard Lefebvre
    News Facts Oracle today announced that it has acquired Compendium, a cloud-based content marketing provider that helps companies plan, produce and deliver engaging content across multiple channels throughout their customers’ lifecycle. Compendium’s data-driven approach aligns relevant content with customer data and profiles to help companies more effectively attract prospects, engage buyers, accelerate conversion of prospects to opportunities, increase adoption, and drive revenue growth. Compendium’s innovative solution complements Oracle’s industry-leading Eloqua Marketing Cloud, which is part of Oracle’s comprehensive Customer Experience solution. The combination of Oracle Eloqua Marketing Cloud with Compendium is expected to enable modern marketers to align persona-based content to customers’ digital body language to increase “top-of-funnel” customer engagement, improve the quality of sales leads, realize the highest return on their marketing investment, and increase customer loyalty. More information on this announcement can be found at http://www.oracle.com/compendium. Supporting Quotes “As customers increasingly access information through online and mobile channels, the buying process is shifting from sales-driven to marketing-driven. Now, more than ever, marketers are challenged to deliver relevant and engaging content across multiple channels and throughout the customer lifecycle,” said Thomas Kurian, Executive Vice President, Oracle Development. “By adding Compendium’s content marketing platform to Oracle Eloqua Marketing Cloud, customers will be able to capture more prospects, improve the customer experience and drive top-line revenue.” “Oracle Eloqua Marketing Cloud is uniquely positioned to capture a prospect’s digital body language to help companies know each buyer’s demographics, behaviors and influencers,” said Chris Baggott, Compendium CEO. “By combining this buyer profile with Compendium’s data-driven content marketing platform, marketers will be able to deliver the right content to the right individual, across the right channel, at the right time. We are very excited to now be a part of the industry’s most complete marketing cloud solution, giving us a global stage to deliver innovative content marketing solutions.” Supporting Resources About Oracle and Compendium General Presentation Customer and Partner Letter FAQ

    Read the article

  • Always up to date! The newsletters and magazines from Oracle

    - by A&C Redaktion
    You don't have to know everything, but you do have to know where everything can be found! So today we want to give you a brief overview of Oracle's newsletters and publications. First, there are the regularly published magazines: Oracle Magazine, in English, appears every two months. Focus topics of recent issues included apps for Oracle Exadata and the perennial favorite, IT security. The magazine is fully available online and can also be subscribed to. With Profit Magazine, likewise in English, the subtitle already makes clear what it is about: "Technology Powered. Business Driven" - in other words, the interface between technology and entrepreneurship. Especially for up-and-coming partners, it offers valuable, well-prepared information to move your business forward. The Java Magazine also carries its topic in its title. It appears every two months, in digital form only, and in English. Subscribe to it here. Besides the Oracle Partner Blog, which you already know, there is a multilingual blog with EMEA partner news for an entrepreneurial view toward Europe. If you are interested in targeted specialist information, there is one more option: via your Oracle.com user account, partners can put together informative newsletters on all topics according to their interests. This way you can decide individually whether, and how often, you would like to receive info mails from Oracle. The German-language event announcements are also managed there.

    Read the article

  • SQL SERVER – Download PSSDIAG Data Collection Utility

    - by pinaldave
    During an early part of my career as a database consultant – when I was dealing with SQL Server 2000 – I often needed to collect various data related to SQL Server. My favorite tool to collect the data was the PSSDIAG tool. It is a general purpose diagnostic collection utility that Microsoft Product Support Services uses to collect various logs and data files. It collects Performance Monitor logs, SQL Profiler traces, SQL Server blocking script output, Windows Event Logs, and SQLDIAG output. The data collected can be used by the SQL Nexus tool, which helps you troubleshoot SQL Server performance problems. PSSDIAG is a wrapper around other data collection APIs and utilities; the performance impact of running PSSDIAG is generally equal to the impact of the traces that PSSDIAG has been configured to capture. If you are using SQL Server 2000, you need to seriously consider upgrading to SQL Server 2012. Here is the PSSDIAG Data Collection Utility, updated in August 2012. My friend and SQL Server expert Amit Banerjee has written an excellent article on this subject, and I encourage all of you to read it. Note: For SQL Server 2012 there is SQLDiag. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology

    Read the article

  • SQL SERVER – Copy Database – SQL in Sixty Seconds #067

    - by Pinal Dave
    There are multiple reasons why a user may want to make a copy of a database. Sometimes a user wants to copy the database to the same server, and sometimes to a different server. The important point is that DBAs and Developers may want copies of their database for various purposes. I copy my database for backup purposes. However, when we hear “copying a database”, the very first thought which comes to our mind is Backup and Restore or Attach and Detach. Both of these processes have their own advantages and disadvantages. As a matter of fact, those methods are efficient and recommended. However, if you just want to copy your database as it is and do not want to go for the advanced features, you can just use the Copy Database feature of SQL Server. Here are the settings which you can use to copy the database. SQL in Sixty Seconds Video I have attempted to explain the same subject in simple words in the following video. Action Item Here are the blog posts I have previously written on the subject of copying databases. You can read them over here: Copy Database from Instance to Another Instance – Copy Paste in SQL Server Copy Database With Data – Generate T-SQL For Inserting Data From One Table to Another Table Copy Data from One Table to Another Table – SQL in Sixty Seconds #031 – Video Generate Script for Schema and Data – SQL in Sixty Seconds #021 – Video You can subscribe to my YouTube Channel for frequent updates. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Joes 2 Pros, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Book Review, SQLAuthority News, T SQL, Video

    Read the article

  • BI&EPM in Focus - November 2011

    - by Mike.Hallett(at)Oracle-BI&EPM
    Enterprise Performance Management
    - A Thing of Beauty, by Alison Weiss: Avon's enterprise performance management system delivers accurate information and critical insight to managers at every level of the organization.
    - Oracle Crystal Ball Helps Managers Guard Against Volatility, by Alison Weiss.
    - The Insight Game, by Aaron Lazenby: Enterprise performance management can deliver insights crucial to navigating the volatility of the global economy—and that's no game of checkers.
    - KPI vs. the Bottom Line, by Edward Roske: For managers, is tracking the key metrics for their departments enough to ensure success for the entire business? The CEO of Oracle partner interRel shares his opinion.
    - Deep Integration, by Aaron Lazenby: The synthesis of Oracle Hyperion applications and core Oracle technologies can deliver deep benefits to analytics-driven businesses.
    - Oracle Crystal Ball: Oracle's #1 solution for risk management.
    - Follow EPM documentation at Hyperion EPM Info for news about EPM documentation releases and updates (twitter | facebook | LinkedIn).
    - Whitepaper: Integrating XBRL Into Your Financial Reporting Process (Oracle Hyperion Disclosure Management).
    - Customer stories: StealthGas Inc. saves 12 accountant days yearly and validates XBRL-compliant financial filing data in one day; Sherwin-Williams Argentina I.C.S.A. accelerates its budget preparation process by 75%; BBDO Germany GmbH consolidates financial and planning processes for more than 50 agencies.
    Business Intelligence
    - Webcast replay: Oracle Data Mining & BI EE - Predictive Analytics (Part 2).
    - Innovation Award winners (BI/EPM): HealthSouth, State of MD, Clorox Company, Telenor and Dunkin Brands.
    - Leeds Teaching Hospitals National Health Service Trust builds budget reports six times faster and achieves 100% ROI in 12 months with Oracle Business Intelligence.
    - Home Credit Group consolidates reporting and saves time across all business units with Oracle Essbase & OBIEE.
    - Autoglass improves business visibility and services to customers and partners with Oracle Business Intelligence.
    Events
    - Download the Oracle OpenWorld Oct 2011 presentations (select Middleware - BI or Applications - Hyperion).
    - Oracle Business Analytics Summits: learn about the latest trends, best practices, and innovations in business intelligence, analytics applications, and data warehousing.
    - Webcast, Nov 15, 9am PST: Running the Last Mile, Beyond Financial Consolidations - Streamlining the Close and Addressing the SEC's XBRL Mandate.
    - Webcast, Dec 13, 1pm PST: Defining Your Mobile BI Strategy (BICG).
    - New training available: Oracle BI Publisher 11g R1: Fundamentals.
    - Webcast replay: How to Expand the Usage of Analytics in your Organization while Driving Down IT Spend.
    - Webcast replay: Real-Time Decisions (RTD) Updated Use Cases for Ecommerce Personalization in Financial Services & Retail.

    Read the article

  • How to show or direct a business analyst to do data modelling?

    - by AaronLS
    Our business analysts pushed hard to collect data through a spreadsheet. I am the programmer responsible for importing that data. Usually when they push hard for something like this, I never know how well it will work out until a few weeks later, when I have time assigned to work on the task of programming the import of the data. I have tried to do as much as possible along the way: named ranges, data validations, etc. But I usually don't have time to take a detailed look at all the data and compare it to the destination in the database to determine how well it matches up. A lot of times there will be a little table of items that I somehow have to relate to something else in the database, but there are no natural or business keys present that would allow me to do so. I make the best of this, trying to write something that can compare strings and make a best guess, and then go through the effort of creating interfaces for a user to match the imported data to the destination (a sketch of the best-guess matching follows below). I feel like if the business analysts were actually creating a data model, they would be forced to think about these relationships, and would have an appreciation for the need for natural or business keys to be part of the spreadsheet for the purposes of smoothly importing the data. The closest they come to business analysis is a big flat list of fields, and that would be fine if it were like any other data dictionary and included data types and relationships, but it isn't. They are just a bunch of names, with no indication of what type of data they might hold; it is up to me to guess. When I have pushed for more detail, they say that it is just busy work. How can I explain the importance of data modelling? How can I tell them what it is and how to do it? It feels impossible, because they don't have an appreciation for its importance. They do, however, usually have an interest in helping out in whatever way they can; it's just that this in particular has never gotten a motivated response.
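
    A minimal Python sketch of the best-guess string matching mentioned above, using difflib; the item names, keys and cutoff are hypothetical stand-ins, not data from the actual project:

        import difflib

        # Hypothetical labels arriving from the spreadsheet, and hypothetical
        # rows already in the database (name -> surrogate key).
        imported = ["Wdgt, blue", "Gizmo XL", "Steel sprocket"]
        existing = {"Blue Widget": 101, "Gizmo Extra Large": 102, "Sprocket (steel)": 103}

        def best_guess(label, candidates, cutoff=0.5):
            """Return the closest candidate by fuzzy ratio, or None for manual review."""
            hits = difflib.get_close_matches(label, candidates, n=1, cutoff=cutoff)
            return hits[0] if hits else None

        for label in imported:
            print(label, "->", best_guess(label, list(existing)))

    Anything that falls below the cutoff would be queued up for the manual matching interface rather than guessed at.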

    Read the article

  • Tender vs. Requirements vs. Solution Design

    - by Tom Tom
    Conventionally, which of the above documents is deemed to hold the most weight when it comes to system acceptance? I recently had a conversation along these lines: It was argued that the initial requirements / tender documentation should be used to determine system acceptance. It was said that the solution design only serves to describe the way in which the system will solve the problem, not the problem it will solve. Furthermore, it was argued that if requirements are missed during solution design, the requirements should be referenced during system acceptance and that if any requirements were missed then the original tender should be referenced. Conversely, I suggested that - while requirements may be based on the original tender - they supersede it once agreed with the stakeholders. Furthermore, during solution design, analysis is performed to address and refine these initial requirements, translating them into a system capable of meeting the actual requirements. Once signed off by the relevant users, this solution design should absolutely represent the requirements (by virtue of the fact that it's designed upon them) but actually supersedes them as the basis for system acceptance. Is one of the above arguments more valid than the other?

    Read the article

  • How many copies are needed to enlarge an array?

    - by user10326
    I am reading an analysis of dynamic arrays (from Skiena's algorithm manual), i.e. when we have an array structure and, each time we run out of space, we allocate a new array of double the size of the original. It describes the waste that occurs when the array has to be resized. It says that elements (n/2)+1 through n will be moved at most once or not at all. This is clear. Then, by describing that half the elements move once, a quarter of the elements twice, and so on, the total number of movements M is given by:

    M = sum_{i=1}^{lg n} i * n/2^i  <=  n * sum_{i=1}^{infinity} i/2^i  =  2n

    This seems to me to add more copies than actually happen. E.g. if we have the following:

    array of 1 element
    +--+
    |a |
    +--+

    double the array (2 elements)
    +--++--+
    |a ||b |
    +--++--+

    double the array (4 elements)
    +--++--++--++--+
    |a ||b ||c ||c |
    +--++--++--++--+

    double the array (8 elements)
    +--++--++--++--++--++--++--++--+
    |a ||b ||c ||c ||x ||x ||x ||x |
    +--++--++--++--++--++--++--++--+

    double the array (16 elements)
    +--++--++--++--++--++--++--++--++--++--++--++--++--++--++--++--+
    |a ||b ||c ||c ||x ||x ||x ||x ||  ||  ||  ||  ||  ||  ||  ||  |
    +--++--++--++--++--++--++--++--++--++--++--++--++--++--++--++--+

    We have the x elements copied 4 times, the c elements copied 4 times, the b element copied 4 times, and the a element copied 5 times, so the total is 4+4+4+5 = 17 copies/movements. But according to the formula we should have 1*(16/2) + 2*(16/4) + 3*(16/8) + 4*(16/16) = 8+8+6+4 = 26 copies of elements for the enlargement of the array to 16 elements. Is this a mistake, or is the aim of the formula to provide a rough upper-limit approximation? Or am I misunderstanding something here?
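
    One way to sanity-check the counting convention is to simulate the doubling directly. This is a minimal Python sketch (my own illustration, not from the book) that counts one movement every time an existing element is copied into a freshly allocated array:

        def count_copies(final_size):
            """Count element movements while a doubling array grows to final_size."""
            copies = 0
            capacity = 1
            while capacity < final_size:
                copies += capacity   # every current occupant moves to the new array
                capacity *= 2
            return copies

        print(count_copies(16))  # 1 + 2 + 4 + 8 = 15

    For n = 16 a direct count gives 15 movements, which is consistent with reading the closed form above as an upper bound on the number of movements rather than an exact count.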

    Read the article

  • HTG Explains: What Is RSS and How Can I Benefit From Using It?

    - by Jason Fitzpatrick
    If you’re trying to keep up with news and content on multiple web sites, you’re faced with the never-ending task of visiting those sites to check for new content. Read on to learn about RSS and how it can deliver the content right to your digital doorstep. In many ways, content on the internet is beautifully linked together and accessible, but despite the interconnectivity of it all we still frequently find ourselves visiting this site, then that site, then another site, all in an effort to check for updates and get the content we want. That’s not particularly efficient, and there’s a much better way to go about it. Imagine, if you will, a simple hypothetical situation. You’re a fan of a web comic, a few tech sites, an infrequently updated but excellent blog about an obscure music genre you’re a fan of, and you like to keep an eye on announcements from your favorite video game vendor. If you rely on manually visiting all those sites—and, let’s be honest, our hypothetical example has a scant half-dozen sites while the average person would have many, many more—then you’re either going to be wasting a lot of time checking the sites every day for new content or you’re going to be missing out on content as you either forget to visit the sites or find the content after it’s no longer as useful or relevant to you. RSS can break you free from that cycle of either over-checking or under-finding content by delivering the content to you as it is published. Let’s take a look at what RSS is and how it can help.
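
    To make the "delivered to your doorstep" part concrete, here is a minimal Python sketch of the polling a feed reader automates for you; it assumes the third-party feedparser package, and the feed URL is just a placeholder:

        import feedparser  # third-party: pip install feedparser

        FEED_URL = "https://example.com/feed.xml"  # placeholder feed address

        def latest_titles(url, limit=5):
            """Fetch the feed and return the most recent entry titles."""
            feed = feedparser.parse(url)
            return [entry.title for entry in feed.entries[:limit]]

        for title in latest_titles(FEED_URL):
            print(title)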

    Read the article

  • Read Oracle Certification Program's December 2012 E-Magazine now!

    - by Harold Green
    Hello Everyone, The big news in this edition of our Oracle Certification E-Magazine is related to a change in the way that exam results are provided at the end of the test (using our CertView tool). This significant process change for the Oracle program sets the stage for tighter integration of candidate information and exam/certification results. Additionally, it helps give every certification holder access to important tools available in CertView. The new process was implemented in November and so far it is going very well. Much of the success of this new initiative is due to you (following the new process)! We are continuing to work to expand the functionality of CertView to better help you use your certification as a tool to help improve your career. Also in this issue of the E-Magazine, we are announcing several new offerings. We have a new SQL Tuning certification as well as a new Exam Preparation Seminar. We have continued to release new Exam Preparation Seminars and Exam Preparation Seminar Value Packages and we are receiving good feedback. We hope that you will consider employing one of these seminars to help you prepare for your next certification exam. They are now even available on iPad! READ THE DECEMBER 2012 EDITION HERE Thank you and good luck! Paul Sorensen Sr. Director, Global Certification Programs

    Read the article

  • You are invited! Quarterly Partner Sales Update Roadshow

    - by Giuseppe Facchetti
    Starting July this year, Oracle’s A&C, Partner Enablement and Hardware Teams will be organizing quarterly face-to-face sales training events to keep you up to date with Hardware sales news, latest products and solutions announcements, competitive positioning, and sales tools -- all of this with an Oracle-on-Oracle approach. We are pleased to invite you to attend the first Oracle EMEA Hardware Quarterly Partner Sales Update Roadshow, running in 10 different cities across EMEA. The 3-hour, free-of-charge sales session will run in the afternoon in various locations. Learn to:
    - Articulate the Oracle Hardware business value proposition to your customers.
    - Explain Oracle Hardware positioning versus the competition.
    - Understand Oracle Hardware as the best platform to run the complete Oracle-on-Oracle stack, from application to disk.
    Find all the details and register here!

    Read the article

  • Oracle Buys BigMachines - Adds Leading Configure, Price and Quote (CPQ) Cloud to the Oracle Cloud to Enable Smarter Selling

    - by Richard Lefebvre
    News Facts Oracle today announced that it has entered into an agreement to acquire BigMachines, a leading cloud-based Configure, Price and Quote (CPQ) solution provider. BigMachines’ CPQ Cloud accelerates the conversion of sales opportunities into revenue by automating the sales order process with guided selling, dynamic pricing, and an easy-to-use workflow approval process, accessible anywhere, on any device. Companies that use sales automation technology often rely on manual, cumbersome and disconnected processes to convert opportunities into orders. This creates errors, adds costs, delays revenue, and degrades the customer experience. BigMachines’ CPQ Cloud extends sales automation to include the creation of an optimal quote, which enables sales personnel to easily configure and price complex products, select the best options, promotions and deal terms, and include up-sells and renewals, all using automated workflows. In combination with Oracle’s enterprise-grade cloud solutions, including Marketing, Sales, Social, Commerce and Service Clouds, Oracle and BigMachines will create an end-to-end smarter selling cloud solution so sales personnel are more productive, customers are more satisfied, and companies grow revenue faster. More information on this announcement can be found at http://www.oracle.com/bigmachines. Supporting Quotes “The fundamental goals of smarter selling are to provide sales teams with the information, access, and insights they need to maximize revenue opportunities and execute on all phases of the sales cycle,” said Thomas Kurian, Executive Vice President, Oracle Development. “By adding BigMachines’ CPQ Cloud to the Oracle Cloud, companies will be able to drive more revenue and increase customer satisfaction with a seamlessly integrated process across marketing and sales, pricing and quoting, and fulfillment and service.” “BigMachines has developed leading CPQ solutions that serve companies of all sizes across multiple industries,” said David Bonnette, BigMachines’ CEO. “Together with Oracle, we expect to provide a complete cloud solution to manage sales processes and deliver exceptional customer experiences.” Supporting Resources About Oracle and BigMachines General Presentation Customer and Partner Letter FAQ

    Read the article

  • C# performance analysis – how to count CPU cycles?

    - by Lirik
    Is this a valid way to do performance analysis? I want to get nanosecond accuracy and determine the performance of typecasting:

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;

    class PerformanceTest
    {
        static double last = 0.0;
        static List<object> numericGenericData = new List<object>();
        static List<double> numericTypedData = new List<double>();

        static void Main(string[] args)
        {
            double totalWithCasting = 0.0;
            double totalWithoutCasting = 0.0;

            // Fill both lists with the same one million values.
            for (double d = 0.0; d < 1000000.0; ++d)
            {
                numericGenericData.Add(d);
                numericTypedData.Add(d);
            }

            // Note: the same Stopwatch is reused without Reset(), so
            // ElapsedTicks keeps accumulating across both tests and all
            // ten iterations.
            Stopwatch stopwatch = new Stopwatch();
            for (int i = 0; i < 10; ++i)
            {
                stopwatch.Start();
                testWithTypecasting();
                stopwatch.Stop();
                totalWithCasting += stopwatch.ElapsedTicks;

                stopwatch.Start();
                testWithoutTypeCasting();
                stopwatch.Stop();
                totalWithoutCasting += stopwatch.ElapsedTicks;
            }

            Console.WriteLine("Avg with typecasting = {0}", (totalWithCasting / 10));
            Console.WriteLine("Avg without typecasting = {0}", (totalWithoutCasting / 10));
            Console.ReadKey();
        }

        static void testWithTypecasting()
        {
            foreach (object o in numericGenericData)
            {
                last = ((double)o * (double)o) / 200;  // unbox/cast on every element
            }
        }

        static void testWithoutTypeCasting()
        {
            foreach (double d in numericTypedData)
            {
                last = (d * d) / 200;
            }
        }
    }

    The output is: Avg with typecasting = 468872.3 Avg without typecasting = 501157.9 I'm a little suspicious... it looks like there is nearly no impact on the performance. Is casting really that cheap?

    Read the article

  • Sentiment analysis for twitter in python

    - by Ran
    I'm looking for an open source implementation, preferably in Python, of textual sentiment analysis (http://en.wikipedia.org/wiki/Sentiment_analysis). Is anyone familiar with such an open source implementation I can use? I'm writing an application that searches twitter for some search term, say "youtube", and counts "happy" tweets vs. "sad" tweets. I'm using Google's App Engine, so it's in Python. I'd like to be able to classify the returned search results from twitter, and I'd like to do that in Python. I haven't been able to find such a sentiment analyzer so far, specifically not in Python. Preferably it already exists in Python, but if not, hopefully I can translate it to Python. Note: the texts I'm analyzing are VERY short – they are tweets – so ideally this classifier is optimized for such short texts. BTW, twitter does support the ":)" and ":(" operators in search, which aim to do just this, but unfortunately the classification provided by them isn't that great, so I figured I might give this a try myself. Thanks! BTW, an early demo is here, and the code I have so far is here; I'd love to open-source it with any interested developer.
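
    Failing a ready-made classifier, a crude baseline in the spirit of twitter's ":)" / ":(" search operators can be sketched in a few lines of Python; the word lists below are illustrative stand-ins, not a real sentiment lexicon:

        # Illustrative token lists only -- a real lexicon would be far larger.
        HAPPY = {":)", ":-)", ":d", "love", "great", "awesome"}
        SAD = {":(", ":-(", "hate", "awful", "terrible"}

        def classify(tweet):
            """Very rough polarity call for a short text such as a tweet."""
            tokens = tweet.lower().split()
            score = sum(t in HAPPY for t in tokens) - sum(t in SAD for t in tokens)
            return "happy" if score > 0 else "sad" if score < 0 else "neutral"

        print(classify("I love youtube :)"))  # -> happy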

    Read the article

  • Usage of static analysis tools - with Clear Case/Quest

    - by boyd4715
    We are in the process of defining our software development process and wanted to get some feedback from the group about this topic. Our team is spread out - US, Canada and India - and I would like to put into place some simple standard rules that all teams will apply to their code. We make use of ClearCase/ClearQuest and RAD. I have been looking at PMD, CPP, Checkstyle and FindBugs as a start. My thought is to just put these into Ant and have the developers run them manually. I realize that doing this means you have to trust that each developer will do it. The other thought is to add some builders into the IDE, which would run a subset of the rules (to keep the build process light), and then add another set (heavy) when they check in the code. Some other ideas are to make use of something like CruiseControl and have it set up to run these static analysis tools along with the unit tests whenever ClearCase/ClearQuest is idle. I am wondering if others have done this, and if it was successful or can provide lessons learned.

    Read the article

  • exclude dependencies when running sonar analysis

    - by achraf
    I have a test project requiring some heavy jars, which I put in ${M2_HOME}\test\src\main\resources\ and add to the pom.xml using:

        <dependency>
          <groupId>server</groupId>
          <artifactId>server</artifactId>
          <version>1.0</version>
          <scope>system</scope>
          <systemPath>${M2_HOME}\test\src\main\resources\server.jar</systemPath>
        </dependency>
        <dependency>
          <groupId>client</groupId>
          <artifactId>client</artifactId>
          <version>6.0</version>
          <scope>system</scope>
          <systemPath>${M2_HOME}\test\src\main\resources\client.jar</systemPath>
        </dependency>

    I want to know if it is possible to exclude them during Sonar analysis, or, more generally, to just analyze the java sources folder.

    Read the article

  • Syntactical analysis with Flex/Bison part 2

    - by Imran
    Hello, I need help with Lex/Yacc programming. I wrote a compiler that performs syntactical analysis on inputs of many statements. Now I have a specific problem. Given an input, the compiler produces the right output: which statement is used, constant operators, or a jmp instruction with the label it jumps to. Now I have to handle the if statement: first the first command (the one before the else) must be emitted; if the condition of the if holds, execution must then jump to the end, because the command after the else isn't needed; after this jmp, the second command must be emitted. I'll show an example, so maybe you understand what I mean:

    Input        adr.   Output
    if(x==0)     10     if(x==0)
      Wait 5     20     WAIT 5
    else         30     JMP 50
      Wait 1     40     WAIT 1
    end          50     END

    I have an idea: maybe I can use a special if statement like

    IF exp jmp_stmt_end stmt_seq END

    so that when the if statement appears in the input, the compiler recognizes the end of the statement and, like the jmp_stmt in my compiler (you can download the files from http://bitbucket.org/matrix/changed-tiny), jumps only to the end. I hope you understand my problem. Thanks.
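
    For what it's worth, the forward jump in the example is usually handled by backpatching: emit the JMP with a placeholder when the else is reduced, then fill in the target once END's address is known. Here is a minimal Python sketch of the idea (not actual Yacc actions; the instruction strings mirror the example above):

        code = []  # instruction list; the address of slot i is (i + 1) * 10

        def emit(instr):
            """Append an instruction and return its index for later backpatching."""
            code.append(instr)
            return len(code) - 1

        def addr(index):
            return (index + 1) * 10

        emit("if(x==0)")
        emit("WAIT 5")
        jmp = emit("JMP ???")             # target unknown until END is emitted
        emit("WAIT 1")
        end = emit("END")
        code[jmp] = "JMP %d" % addr(end)  # backpatch: becomes JMP 50

        for i, instr in enumerate(code):
            print(addr(i), instr)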

    Read the article

  • visual analysis of web pages in ruby

    - by Clint Miller
    I'm looking to write some code that does visual analysis of web pages, preferably using Ruby. My code will need to be able to determine the top, left, width, height, background color, color, and font size for all the elements in the DOM. Of course, these values can only be calculated once all CSS is applied. So, I don't think that Nokogiri is up for the job. Ultimately, I'm trying to use this data in a VIPS-like (Vision-Based Page Segmentation) algorithm in an attempt to find the main content in downloaded news articles. I've considered using Watir to drive Chrome or Firefox and then extract the data. The problem is that browsers can't be run headless through Watir (I think). Ultimately, this code will be running on an array of Linux servers in a data center. So, the code won't have easy access to an X Server for displaying the browser. I suppose one solution is to use Watir and run a headless X Server on the Linux servers. That's a bit of a pain, but it looks like my best option right now. Does anyone have any better ideas?
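
    Whichever driver ends up being used, the measurement trick is the same: have the browser run JavaScript after CSS has been applied and report each element's computed geometry and style. This rough sketch uses Python's selenium bindings purely as a stand-in for the Ruby/Watir version; the URL is a placeholder, and on the servers it would still need a real or virtual X display (e.g. Xvfb):

        from selenium import webdriver

        driver = webdriver.Firefox()               # needs a display, real or virtual
        driver.get("https://example.com/article")  # placeholder page

        # Collect post-CSS geometry and style for every element in the DOM,
        # which is the raw input a VIPS-style segmentation pass needs.
        elements = driver.execute_script("""
            var out = [];
            var nodes = document.getElementsByTagName('*');
            for (var i = 0; i < nodes.length; i++) {
                var n = nodes[i];
                var r = n.getBoundingClientRect();
                var s = window.getComputedStyle(n);
                out.push({tag: n.tagName, top: r.top, left: r.left,
                          width: r.width, height: r.height,
                          color: s.color, background: s.backgroundColor,
                          fontSize: s.fontSize});
            }
            return out;
        """)
        print(len(elements), "elements measured")
        driver.quit()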

    Read the article

  • Visual Studio 2010 and .NET 4 Released

    - by ScottGu
    The final release of Visual Studio 2010 and .NET 4 is now available. Download and Install Today MSDN subscribers, as well as WebsiteSpark/BizSpark/DreamSpark members, can now download the final releases of Visual Studio 2010 and TFS 2010 through the MSDN subscribers download center.  If you are not an MSDN Subscriber, you can download free 90-day trial editions of Visual Studio 2010.  Or you can download the free Visual Studio express editions of Visual Web Developer 2010, Visual Basic 2010, Visual C# 2010 and Visual C++.  These express editions are available completely for free (and never time out).  If you are looking for an easy way to set up a new machine for web development you can automate installing ASP.NET 4, ASP.NET MVC 2, IIS, SQL Server Express and Visual Web Developer 2010 Express really quickly with the Microsoft Web Platform Installer (just click the install button on the page). What is new with VS 2010 and .NET 4 Today’s release is a big one – and brings with it a ton of new features and capabilities. One of the things we tried hard to focus on with this release was to invest heavily in making existing applications, projects and developer experiences better.  What this means is that you don’t need to read 1000+ page books or spend time learning major new concepts in order to take advantage of the release.  There are literally thousands of improvements (both big and small) that make you more productive and successful without having to learn big new concepts in order to start using them.  Below is just a small sampling of some of the improvements with this release: Visual Studio 2010 IDE  Visual Studio 2010 now supports multiple monitors (enabling much better use of screen real-estate).  It has new code Intellisense support that makes it easier to find and use classes and methods. It has improved code navigation support for searching code-bases and seeing how code is called and used.  It has new code visualization support that allows you to see the relationships across projects and classes within projects, as well as to automatically generate sequence diagrams to chart execution flow.  The editor now includes HTML and JavaScript snippet support as well as improved JavaScript intellisense. The VS 2010 Debugger and Profiling support is now much, much richer and enables new features like Intellitrace (aka Historical Debugging), debugging of Crash/Dump files, and better parallel debugging.  VS 2010’s multi-targeting support is now much richer, and enables you to use VS 2010 to target .NET 2, .NET 3, .NET 3.5 and .NET 4 applications.  And the infamous Add Reference dialog now loads much faster. TFS 2010 is now easy to set up (you can now install the server in under 10 minutes) and enables great source-control, bug/work-item tracking, and continuous integration support.  Testing (both automated and manual) is now much, much richer.  And VS 2010 Premium and Ultimate provide much richer architecture and design tooling support. VB and C# Language Features VB and C# in VS 2010 both contain a bunch of new features and capabilities.  VB adds new support for automatic properties, collection initializers, and implicit line continuation support among many other features.  C# adds support for optional parameters and named arguments, a new dynamic keyword, and co-variance and contra-variance, among many other features. ASP.NET 4 and ASP.NET MVC 2 With ASP.NET 4, Web Forms controls now render clean, semantically correct, and CSS friendly HTML markup. 
Built-in URL routing functionality allows you to expose clean, search-engine-friendly URLs and increase the traffic to your Website.  ViewState within applications can now be more easily controlled and made smaller.  ASP.NET Dynamic Data support has been expanded.  More controls, including rich charting and data controls, are now built into ASP.NET 4 and enable you to build applications even faster.  New starter project templates now make it easier to get going with new projects.  SEO enhancements make it easier to drive traffic to your public facing sites.  And web.config files are now clean and simple. ASP.NET MVC 2 is now built into VS 2010 and ASP.NET 4, and provides a great way to build web sites and applications using a model-view-controller based pattern. ASP.NET MVC 2 adds features to easily enable client and server validation logic, and provides new strongly-typed HTML and UI-scaffolding helper methods.  It also enables more modular/reusable applications.  The new <%: %> syntax in ASP.NET makes it easier to HTML encode output.  Visual Studio 2010 also now includes better tooling support for unit testing and TDD.  In particular, “Consume first intellisense” and “generate from usage” support within VS 2010 make it easier to write your unit tests first, and then drive your implementation from them. Deploying ASP.NET applications gets a lot easier with this release. You can now publish your Websites and applications to a staging or production server from within Visual Studio itself. Visual Studio 2010 makes it easy to transfer all your files, code, configuration, database schema and data in one complete package. VS 2010 also makes it easy to manage separate web.config configuration file settings depending upon whether you are in debug, release, staging or production modes. WPF 4 and Silverlight 4 WPF 4 includes a ton of new improvements and capabilities including more built-in controls, richer graphics features (cached composition, pixel shader 3 support, layout rounding, and animation easing functions), and a much improved text stack (with crisper text rendering, custom dictionary support, and selection and caret brush options).  WPF 4 also includes a bunch of support to enable you to take advantage of new Windows 7 features – including multi-touch and Windows 7 shell integration. Silverlight 4 will launch this week as well.  You can watch my Silverlight 4 launch keynote streamed live Tuesday (April 13th) at 8am Pacific Time.  Silverlight 4 includes a ton of new capabilities – including a bunch for making it possible to build great business applications and out-of-the-browser applications.  I’ll be doing a separate blog post later this week (once it is live on the web) that talks more about its capabilities. Visual Studio 2010 now includes great tooling support for both WPF and Silverlight.  The new VS 2010 WPF and Silverlight designer makes it much easier to build client applications and great line-of-business solutions, as well as integrate and bind with data.  Tooling support for Silverlight 4 with the final release of Visual Studio 2010 will be available when Silverlight 4 releases to the web this week. SharePoint and Azure Visual Studio 2010 now includes built-in support for building SharePoint applications.  You can now create, edit, build, and debug SharePoint applications directly within Visual Studio 2010.  You can also now use SharePoint with TFS 2010. 
Support for creating Azure-hosted applications is also now included with VS 2010 – allowing you to build ASP.NET and WCF based applications and host them within the cloud. Data Access Data access has a lot of improvements coming to it with .NET 4.  Entity Framework 4 includes a ton of new features and capabilities – including support for model-first and POCO development, default support for lazy loading, built-in support for pluralization/singularization of table/property names within the VS 2010 designer, full support for all the LINQ operators, the ability to optionally expose foreign keys on model objects (useful for some stateless web scenarios), disconnected API support to better handle N-Tier and stateless web scenarios, and T4 template customization support within VS 2010 to allow you to customize and automate how code is generated for you by the data designer.  In addition to improvements with the Entity Framework, LINQ to SQL with .NET 4 also includes a bunch of nice improvements.  WCF and Workflow WCF includes a bunch of great new capabilities – including better REST, activation and configuration support.  WCF Data Services (formerly known as Astoria) and WCF RIA Services also now enable you to easily expose and work with data from remote clients. Windows Workflow is now much faster, includes flowchart services, and now makes it easier than before to create custom services.  More details can be found here. CLR and Core .NET Library Improvements .NET 4 includes the new CLR 4 engine – which includes a lot of nice performance and feature improvements.  The CLR 4 engine now runs side-by-side in-process with older versions of the CLR – allowing you to use two different versions of .NET within the same process.  It also includes improved COM interop support.  The .NET 4 base class libraries (BCL) include a bunch of nice additions and refinements.  In particular, the .NET 4 BCL now includes new parallel programming support that makes it much easier to build applications that take advantage of multiple CPUs and cores on a computer.  This work dove-tails nicely with the new VS 2010 parallel debugger (making it much easier to debug parallel applications), as well as the new F# functional language support now included in the VS 2010 IDE.  .NET 4 also now has the Dynamic Language Runtime (DLR) library built-in – which makes it easier to use dynamic language functionality with .NET.  MEF – a really cool library that enables rich extensibility – is also now built into .NET 4 and included as part of the base class libraries.  .NET 4 Client Profile The download size of the .NET 4 redist is now much smaller than it was before (the x86 full .NET 4 package is about 36MB).  We also now have a .NET 4 Client Profile package which is a pure sub-set of the full .NET that can be used to streamline client application installs. C++ VS 2010 includes a bunch of great improvements for C++ development.  This includes better C++ Intellisense support, MSBuild support for projects, improved parallel debugging and profiler support, MFC improvements, and a number of language features and compiler optimizations. My VS 2010 and .NET 4 Blog Series I’ve been cranking away on a blog series the last few months that highlights many of the new VS 2010 and .NET 4 improvements.  The good news is that I have about 20 in-depth posts already written.  The bad news (for me) is that I have about 200 more to go until I’m done!  
I’m going to try and keep adding a few more each week over the next few months to discuss the new improvements and how best to take advantage of them. Below is a list of the already written ones that you can check out today: Clean Web.Config Files Starter Project Templates Multi-targeting Multiple Monitor Support New Code Focused Web Profile Option HTML / ASP.NET / JavaScript Code Snippets Auto-Start ASP.NET Applications URL Routing with ASP.NET 4 Web Forms Searching and Navigating Code in VS 2010 VS 2010 Code Intellisense Improvements WPF 4 Add Reference Dialog Improvements SEO Improvements with ASP.NET 4 Output Cache Extensibility with ASP.NET 4 Built-in Charting Controls for ASP.NET and Windows Forms Cleaner HTML Markup with ASP.NET 4 - Client IDs Optional Parameters and Named Arguments in C# 4 - and a cool scenario with ASP.NET MVC 2 Automatic Properties, Collection Initializers and Implicit Line Continuation Support with VB 2010 New <%: %> Syntax for HTML Encoding Output using ASP.NET 4 JavaScript Intellisense Improvements with VS 2010 Stay tuned to my blog as I post more.  Also check out this page which links to a bunch of great articles and videos done by others. VS 2010 Installation Notes If you have installed a previous version of VS 2010 on your machine (either the beta or the RC) you must first uninstall it before installing the final VS 2010 release.  I also recommend uninstalling .NET 4 betas (including both the client and full .NET 4 installs) as well as the other installs that come with VS 2010 (e.g. ASP.NET MVC 2 preview builds, etc).  The uninstalls of the betas/RCs will clean up all the old state on your machine – after which you can install the final VS 2010 version and should have everything just work (this is what I’ve done on all of my machines and I haven’t had any problems). The VS 2010 and .NET 4 installs add a bunch of new managed assemblies to your machine.  Some of these will be “NGEN’d” to native code during the actual install process (making them run fast).  To avoid adding too much time to VS setup, though, we don’t NGEN all assemblies immediately – and instead will NGEN the rest in the background when your machine is idle.  Until it finishes NGENing the assemblies they will be JIT’d to native code the first time they are used in a process – which for large assemblies can sometimes cause a slight performance hit. If you run into this you can manually force all assemblies to be NGEN’d to native code immediately (and not just wait till the machine is idle) by launching the Visual Studio command line prompt from the Windows Start Menu (Microsoft Visual Studio 2010->Visual Studio Tools->Visual Studio Command Prompt).  Within the command prompt type “Ngen executequeueditems” – this will cause everything to be NGEN’d immediately. How to Buy Visual Studio 2010 You can download and use the free Visual Studio express editions of Visual Web Developer 2010, Visual Basic 2010, Visual C# 2010 and Visual C++.  These express editions are available completely for free (and never time out). You can buy a new copy of VS 2010 Professional that includes a 1 year subscription to MSDN Essentials for $799.  MSDN Essentials includes a developer license of Windows 7 Ultimate, Windows Server 2008 R2 Enterprise, SQL Server 2008 DataCenter R2, and 20 hours of Azure hosting time.  Subscribers also have access to MSDN’s Online Concierge, and Priority Support in MSDN Forums. Upgrade prices from previous releases of Visual Studio are also available.  
Existing Visual Studio 2005/2008 Standard customers can upgrade to Visual Studio 2010 Professional for a special $299 retail price until October.  You can take advantage of this VS Standard->Professional upgrade promotion here. Web developers who build applications for others, and who are either independent developers or who work for companies with fewer than 10 employees, can also optionally take advantage of the Microsoft WebSiteSpark program.  This program gives you three copies of Visual Studio 2010 Professional, 1 copy of Expression Studio, and 4 CPU licenses of both Windows 2008 R2 Web Server and SQL 2008 Web Edition that you can use to both develop and deploy applications with at no cost for 3 years.  At the end of the 3 years there is no obligation to buy anything.  You can sign up for WebSiteSpark today in under 5 minutes – and immediately have access to the products to download. Summary Today’s release is a big one – and has a bunch of improvements for pretty much every developer.  Thank you everyone who provided feedback, suggestions and reported bugs throughout the development process – we couldn’t have delivered it without you.  Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • Cepstral Analysis for pitch detection

    - by Ohmu
    Hi! I'm looking to extract pitches from a sound signal. Someone on IRC just explained to me how taking a double FFT achieves this. Specifically:

    1. take FFT
    2. take log of square of absolute value (can be done with a lookup table)
    3. take another FFT
    4. take absolute value

    I am attempting this using vDSP. I can't understand how I didn't come across this technique earlier; I did a lot of hunting and asking questions, several weeks' worth. More to the point, I can't understand why I didn't think of it. Anyway, it looks as though vDSP has functions to handle all of these tasks. However, I'm wondering about the accuracy of the final result. I have previously used a technique which scours the frequency bins of a single FFT for local maxima. When it encounters one, it uses a cunning technique (the change in phase since the last FFT) to more accurately place the actual peak within the bin. I am worried that this precision will be lost with the technique I'm presenting here. I guess the technique could be used after the second FFT to get the fundamental accurately, but it kind of looks like the information is lost in step 2. As this is a potentially tricky process, could someone with some experience just look over what I'm doing and check it for sanity? Also, I've heard there is an alternative technique involving fitting a quadratic over neighbouring bins. Is this of comparable accuracy? If so, I would favour it, as it doesn't involve remembering bin phases. So, questions: Does this approach make sense? Can it be improved? I'm a bit worried about the log-square component; there seems to be a vDSP function to do exactly that, vDSP_vdbcon; however, there is no indication it precalculates a log-table -- I assume it doesn't, as the FFT function requires an explicit pre-calculation function to be called and passed into it, and this one doesn't. Is there some danger of harmonics being picked up? Is there any cunning way of making vDSP pull out the maxima, biggest first? Can anyone point me towards some research or literature on this technique? And the main question: is it accurate enough? Can the accuracy be improved? I have just been told by an expert that the accuracy IS INDEED not sufficient. Is this the end of the line? Pi PS I get SO annoyed (npi) when I want to create tags, but cannot. :| I have suggested to the maintainers that SO keep track of attempted tags, but I'm sure I was ignored. We need tags for vDSP, the Accelerate framework, and cepstral analysis.
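
    For reference, the four steps map almost one-to-one onto array operations. Here is a minimal numpy sketch of the same pipeline, with numpy standing in for vDSP; the Hann window, the 50-500 Hz search band, and the use of the inverse transform for the second FFT (equivalent up to scaling for a real log-spectrum) are my assumptions:

        import numpy as np

        def cepstral_pitch(signal, rate):
            """Estimate the fundamental via the real cepstrum (double-FFT method)."""
            spectrum = np.fft.rfft(signal * np.hanning(len(signal)))  # step 1: FFT
            log_power = np.log(np.abs(spectrum) ** 2 + 1e-12)         # step 2: log |X|^2
            cepstrum = np.abs(np.fft.irfft(log_power))                # steps 3-4
            lo, hi = int(rate / 500), int(rate / 50)   # assumed 50-500 Hz pitch range
            peak = lo + np.argmax(cepstrum[lo:hi])     # strongest quefrency in range
            return rate / peak

        # Quick check with a synthetic 220 Hz tone plus one harmonic.
        rate = 44100
        t = np.arange(4096) / rate
        tone = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)
        print(round(cepstral_pitch(tone, rate), 1))  # close to 220 Hz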

    Read the article

  • SQL SERVER – Developer Training Kit for SQL Server 2012

    - by pinaldave
    The Developer Training Kit is my favorite part of any product. The reason is very simple: it is the single resource which gives a complete overview of the product in a nutshell. A developer can learn from many places – books, webcasts, tutorials, blogs, etc. However, I have found that developer training kits are the best starting point for any product. Start with them first; see what the new features are, as well as what new message a product is coming out with. Once that is learned, the very next step should be to identify the right learning material to explore the preferred topic. The SQL Server 2012 Developer Training Kit includes technical content including labs, demos and presentations designed to help you learn how to develop SQL Server 2012 database and BI solutions. New and updated content will be released periodically and can be downloaded on demand using the Web Installer. Download the SQL Server 2012 Developer Training Kit Web Installer. This training kit was available earlier this year, but it is never too late to explore it if you have not referred to it earlier. Additionally, if you do not want to download the complete kit altogether, I suggest you refer to the wiki here. This wiki contains all the same presentations and demo notes which the web installer contains. Refer to the SQL Server 2012 Developer Training Kit Wiki. The wiki contains the following modules and details about the Hands-On Labs:
    Module 1: Introduction to SQL Server 2012
    Module 2: Introduction to SQL Server 2012 AlwaysOn
    Module 3: Exploring and Managing SQL Server 2012 Database Engine Improvements
    Module 4: SQL Server 2012 Database Server Programmability
    Module 5: SQL Server 2012 Application Development
    Module 6: SQL Server 2012 Enterprise Information Management
    Module 7: SQL Server 2012 Business Intelligence
    Hands-On Labs: SQL Server 2012 Database Engine
    Hands-On Labs: Visual Studio 2010 and .NET 4.0
    Hands-On Labs: SQL Server 2012 Enterprise Information Management
    Hands-On Labs: SQL Server 2012 Business Intelligence
    Hands-On Labs: Windows Azure and SQL Azure
    As I said, if you have not downloaded this so far, it is never too late to explore it. Trust me, you will learn at least one thing if you just explore the content. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Developer Training, PostADay, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology

    Read the article

  • SQL SERVER – Automated Type Conversion using Expressor Studio

    - by pinaldave
    Recently I had an interesting situation during a consulting project. Let me share with you how I solved the problem using Expressor Studio. Consider a situation in which you need to read a field, such as customer_identifier, from a text file and pass that field into a database table. In the source file’s metadata structure, customer_identifier is described as a string; however, in the target database table, customer_identifier is described as an integer. Legitimately, all the source values for customer_identifier are valid numbers, such as “109380”. To implement this in an ETL application, you probably would have hard-coded a type conversion function call, such as: output.customer_identifier=stringToInteger(input.customer_identifier) That wasn’t so bad, was it? For this instance, programming this hard-coded type conversion function call was relatively easy. However, hard-coding, whether type conversion code or other business rule code, almost always means that the application containing hard-coded fields, function calls, and values is: a) specific to an instance of use; b) difficult to adapt to new situations; and c) lacking in reusable sub-parts. Therefore, in the long run, applications with hard-coded type conversion function calls don’t scale well. In addition, they increase the overall level of effort and degree of difficulty of writing and maintaining ETL applications. To get around the trappings of hard-coding type conversion function calls, developers need access to smarter typing systems. The Expressor Studio product offers exactly this feature, by providing developers with a type conversion automation engine based on type abstraction. The theory behind the engine is quite simple. A user specifies abstract data fields in the engine, and then writes applications against the abstractions (whereas in most ETL software, developers develop applications against the physical model). When a Studio-built application is run, Studio’s engine automatically converts the source type to the abstracted data field’s type and converts the abstracted data field’s type to the target type. The engine can do this because it has a couple of built-in rules for type conversions. So, using the example above, a developer could specify customer_identifier as an abstract data field with a type of integer when using Expressor Studio. Upon reading the string value from the text file, Studio’s type conversion engine automatically converts the source field from the type specified in the source’s metadata structure to the abstract field’s type. At the time of writing the data value to the target database, the engine doesn’t have any work to do because the abstract data type and the target data type are the same. Had they been different, the engine would have automatically provided the conversion. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Database, Pinal Dave, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology Tagged: SSIS
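
    The type-abstraction idea itself is easy to demonstrate outside any particular product. The sketch below is purely illustrative and is not Expressor Studio's actual API: an abstract model declares customer_identifier as an integer, and a small rule table supplies the conversion on the way in:

        # Illustrative sketch of type abstraction -- not Expressor Studio's API.
        CONVERSIONS = {
            (str, int): int,      # e.g. "109380" -> 109380
            (int, str): str,
            (str, float): float,
        }

        def convert(value, target_type):
            """Coerce value to target_type via the registered rule, if needed."""
            if isinstance(value, target_type):
                return value      # types already agree: nothing to do
            return CONVERSIONS[(type(value), target_type)](value)

        ABSTRACT_FIELDS = {"customer_identifier": int}  # abstract model: field -> type

        source_record = {"customer_identifier": "109380"}  # text files deliver strings
        abstract_record = {name: convert(value, ABSTRACT_FIELDS[name])
                           for name, value in source_record.items()}
        print(abstract_record)  # {'customer_identifier': 109380}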

    Read the article

  • SQL Contest – Win USD 300 Worth Gift – Cartoon Contest is Back

    - by pinaldave
    There are two excellent contests, and we have lots of winning to do this year end. 1) Win USD 25 Amazon Gift Cards (10 Units) This is very simple: you just have to download SQL Server DB Optimizer. That's it! There are only two conditions: You must have a valid email address, as the USD 25 Amazon Gift Card will be sent to that address. Download DB Optimizer between today and Dec 8, 2012. Link to Download DB Optimizer. Every day for the next 10 days, one winner will be notified that they have won a USD 25 Amazon Gift Card. 2) Win a Star Wars R2-D2 Inflatable R/C This is the coolest thing to win. I personally want one, but as I am running the contest, I can't participate. You get this cool remote-controlled device – you just have to answer the following cartoon contest. Read the complete story and think about what answer the smart employee would give. There are only two conditions: Leave your answer in the comment area of this blog post (every comment will be hidden till Dec 8, 2012). Please leave your answer in the comment area between today and Dec 8, 2012. Remember, you can participate as many times as you want. Make sure that your answer is correct and creative. The most creative answer will be selected, and the decision of the contest owner will be final. We may have runner-up prizes, but for the moment let us try to win the R2-D2. Here is a cool video of the R2-D2. Now here is the cartoon story; please follow the story and complete the very last cartoon template. Your answer should be correct and creative. However, the ideal answer will not be longer than one or two sentences. Hint: (Hint) Well, leave your answer in the comment area of this blog post. If you do not win the R2-D2, trust me, there are chances you may win a surprise gift from me. The last day to participate in both contests is Dec 8, 2012. We will announce the winners in the week of December 10. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, SQLServer, T SQL, Technology

    Read the article

  • SQL SERVER – Public Training and Private Training – Differences and Similarities

    - by pinaldave
    Earlier this year, I was on the road with SQL Server seminars. I did many SQL Server performance trainings and SQL Server performance consultations throughout the year, but I feel the most rewarding exercise is always the one where the instructor learns something from the students, too. I was just talking to my wife, Nupur – she manages my logistics and administration related activities – and she pointed out that this year I have done 62% consultations and 38% trainings. I was a bit surprised, as I thought the numbers would be reversed. Every time I review the year, I think of the training done at organizations. Well, I cannot argue with reality: I have done more consultations (some would call them projects) than trainings. I told my wife that I enjoy consultations more than training. She promptly asked me a question which was not directly related, but which made me think for a long time and in the end resulted in this blog post. Nupur asked me: what do I enjoy the most, public training or private training? I had a long conversation with her on this subject. I am not going to write a long blog post which can change your life here; this is rather a small post condensing my one-hour discussion into 200 words. Public training is fun because…
    - There are lots of different kinds of attendees
    - There are always vivid questions
    - Lots of questions on questions
    - Less interest in theory and more interest in demos
    - Good opportunity for future business
    Private training is fun because…
    - There is a focused interest
    - One question is discussed deeply because of existing company issues
    - More interest in “how it happened” concepts – under-the-hood operations
    - Good connection with attendees
    - This is also a good opportunity for future business
    Here I will stop my monologue and open up this question to all of you: Question to Attendees – Which one do you enjoy the most: public training or private training? Question to Trainers – What do you enjoy the most: public training or private training? Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Optimization, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, SQL Training, SQLAuthority News, T SQL, Technology

    Read the article
