Search Results

Search found 3383 results on 136 pages for 'telerik reporting'.


  • Print SSRS Report / PDF automatically from SQL Server agent or Windows Service

    - by Jeremy Ramos
    Originally posted on: http://geekswithblogs.net/JeremyRamos/archive/2013/10/22/print-ssrs-report--pdf-from-sql-server-agent-or.aspx

    I have turned the Web upside-down to find a solution to this, looking for the fewest components and least maintenance possible to achieve automated printing of an SSRS report. The reason: we do not have a full software development team to maintain an app, and we have to minimize the support overhead for the support team.

    Here is my setup:
    - SQL Server 2008 R2 on Windows Server 2008 R2
    - PDF-format reports generated by SSRS report subscriptions to a Windows file share
    - Network printer
    - Coloured reports with logo and branding

    I found and tested the following solutions, to no avail:
    - Calling the Adobe Acrobat Reader exe: "C:\Program Files (x86)\Adobe\Reader 11.0\Reader\acroRd32.exe" /n /s /o /h /t "C:\temp\print.pdf" "\\printserver\printername". Pros: very simple option. Cons: Acrobat Reader requires launching its GUI to send a job to a printer, so this option cannot be used when printing from a service.
    - Calling the Adobe Acrobat Reader exe as a process from a .NET console app. Pros: a bit harder than the above, but still a simple solution. Cons: same as above.
    - PowerShell script (Start-Process -FilePath "C:\temp\print.pdf" -Verb Print). Pros: very simple option. Cons: uses the default PDF client in quiet mode to print, but also requires an active session.
    - Foxit Reader. Pros: very simple option. Cons: requires a GUI, same as Adobe Acrobat Reader.
    - Using the Reporting Services Web service to run and stream the report to an image object that is then passed to the printer. Cons: quite complex - this is what we're trying to avoid.

    After pulling my hair out for two days testing and evaluating the above solutions, I ended up learning more about printers (more than ever in my entire life) and how printer drivers work with PostScript. I then bumped into a PostScript interpreter called GhostScript (http://www.ghostscript.com/), and the solution started to get clearer and clearer. I managed to achieve a solution (maybe not the simplest, but efficient enough to meet the least-maintenance, fewest-components goal) in three simple steps:

    1. Install GhostScript (http://www.ghostscript.com/download/) - an open-source PostScript and PDF interpreter. Printing directly using GhostScript only produces greyscale prints with the generic laserjet driver, unless you save the PDF as a BMP image and let the driver interpret the colours from the image.
    2. Install GSView (http://pages.cs.wisc.edu/~ghost/gsview/) - a GhostScript add-on that makes it easier to print directly to a Windows printer. GSPrint automates the PDF -> BMP -> printer driver chain described above.
    3. Run the GSPrint command from a SQL Server Agent job or Windows service: "C:\Program Files\Ghostgum\gsview\gsprint.exe" -color -landscape -all -printer "printername" "C:\temp\print.pdf". The command-line options are documented at http://pages.cs.wisc.edu/~ghost/gsview/gsprint.htm.

    Another lesson learned: since you are calling the script from the service account, it will not necessarily have the printer mapped in its Windows profile (if it even has one). The workaround is to add a local printer as you normally would and then map it to the network printer. Note that you may need to install the printer driver locally on the server.

    So, that's it! There are many ways to achieve a solution. The key thing is how you provide the smartest one!
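    If you do end up wrapping GSPrint in a small .NET console app (handy for surfacing exit codes to a SQL Server Agent CmdExec step), the core of it is just a Process call. This is a minimal sketch under stated assumptions: the gsprint.exe path matches the default install above, and the printer name and PDF path are passed in as arguments rather than hard-coded.

        using System;
        using System.Diagnostics;

        // Hypothetical wrapper: PrintPdf.exe <printerName> <pdfPath>
        class PrintPdf
        {
            static int Main(string[] args)
            {
                if (args.Length < 2)
                {
                    Console.Error.WriteLine("Usage: PrintPdf <printerName> <pdfPath>");
                    return 1;
                }

                var psi = new ProcessStartInfo
                {
                    FileName = @"C:\Program Files\Ghostgum\gsview\gsprint.exe",  // assumed install path
                    Arguments = string.Format("-color -landscape -all -printer \"{0}\" \"{1}\"", args[0], args[1]),
                    UseShellExecute = false,   // no GUI session needed, unlike Acrobat Reader
                    CreateNoWindow = true
                };

                using (var p = Process.Start(psi))
                {
                    p.WaitForExit();
                    return p.ExitCode;         // a non-zero exit code fails the Agent job step
                }
            }
        }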

    Read the article

  • Don't Forget! In-Memory Databases are Hot

    - by andrewbrust
    If you're left scratching your head over SAP's intention to acquire Sybase for almost $6 billion, you're not alone. Despite Sybase's 1990s reign as the supreme database standard in certain sectors (including Wall Street), the company's flagship product has certainly fallen from grace. Why would SAP pay a greater than 50% premium over Sybase's closing price on the day of the announcement just to acquire a relational database that is firmly stuck in maintenance mode?

    Well, there's more to Sybase than the relational database product. Take, for example, its mobile application platform. It hit Gartner's "Leaders' Quadrant" in January of last year, and SAP needs a good mobile play. Beyond the platform itself, Sybase has a slew of mobile services; click this link to look them over.

    There's a second major asset that Sybase has, though, and I wonder if it figured prominently into SAP's bid: Sybase IQ. Sybase IQ is a columnar database. Columnar databases place values from a given database column contiguously, unlike conventional relational databases, which store all of a row's data in close proximity. Storing column values together works well in aggregation reporting scenarios, because the figures to be aggregated can be scanned in one efficient step. It also makes for high rates of compression, because values from a single column tend to be close to each other in magnitude and may contain long sequences of repeating values. Highly compressible databases use much less disk storage and can be largely or wholly loaded into memory, resulting in lightning-fast query performance. For an ERP company like SAP, with its own legacy BI platform (SAP BW) and the entire range of Business Objects and Crystal Reports BI products (which it acquired in 2007), query performance is extremely important.

    And it's a competitive necessity too. QlikTech has built an entire company on a columnar, in-memory BI product (QlikView). So too has startup company Vertica. IBM's TM1 product has been doing in-memory OLAP for years. And guess who else has the in-memory religion? Microsoft does, in the form of its new PowerPivot product. I expect the technology in PowerPivot to become strategic to the full-blown SQL Server Analysis Services product and the entire Microsoft BI stack. I sure don't blame SAP for jumping on the in-memory bandwagon, if indeed the Sybase acquisition is, at least in part, motivated by that.

    It will be interesting to watch and see what SAP does with Sybase's product line-up (assuming the acquisition closes), including the core database, the mobile platform, IQ, and even tools like PowerBuilder. It is also fascinating to watch columnar's encroachment on relational. Perhaps this acquisition will be columnar's tipping point, and people will no longer see it as a fad. Are you listening, Larry Ellison?
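    To make the storage argument concrete, here is a minimal sketch (invented column names and data, no particular vendor's engine) of why column-contiguous storage scans and compresses so well:

        using System;
        using System.Linq;

        class ColumnarSketch
        {
            static void Main()
            {
                // A column store keeps each column as its own contiguous array.
                int[] orderYear  = { 2009, 2009, 2009, 2010, 2010, 2010, 2010 };
                decimal[] amount = { 10m, 12m, 11m, 15m, 14m, 16m, 13m };

                // SUM(amount) is one tight scan of a single array - no skipping
                // over the other columns of each row, as a row store must do.
                Console.WriteLine(amount.Sum());

                // A repetitive, similar-magnitude column run-length encodes to
                // { (2009, 3), (2010, 4) }: two pairs instead of seven integers.
                foreach (var g in orderYear.GroupBy(y => y))
                    Console.WriteLine("{0} x{1}", g.Key, g.Count());
            }
        }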

    Read the article

  • JavaOne in Brazil

    - by janice.heiss(at)oracle.com
    JavaOne in Brazil, currently taking place in São Paulo, is one event I'd love to attend. I once heard "father of Java" James Gosling talk about Java developers throughout the world. He observed that there were good developers everywhere. It was not the case, he said, that the really good developers are in one place and the not-so-good developers are in another. He encountered excellent developers everywhere. Then he paused and said that the craziest developers were definitely the Brazilians. As anyone who knows James would realize, this was meant as high praise. He said the Brazilians would work through the night on projects and were very enthusiastic and spontaneous - qualities that Brazilian culture is known for.

    Brazilian developers are responsible for creating one of the most impressive uses of Java ever - the applications that run the Brazilian health services. Starting from scratch, they created a system that enables an expert doctor in Rio to look at an X-ray of a patient near the Amazon and offer advice. One of the main architects of this was Java Champion Fabiane Nardon, the distinguished Brazilian Java architect and open-source evangelist. As she writes in her blog:

    "In 2003, I was invited to assemble a team and architect a Public Healthcare Information System for the city of São Paulo, the largest in Latin America, with 14 million inhabitants. The resulting software had 2.5 million lines of code and it was created, from specification to production, in only 10 months. At the time, the software was considered the largest J2EE application in the world and was featured in several articles, as this one. As a result, we won the Duke's Choice Award in 2005 during JavaOne, the largest development conference in the world. At the time, Sun Microsystems made a short documentary about our work."

    "In 2007, lightning struck twice and I was again invited to assemble a new team and architect an even larger information system for healthcare. And thus I became CTO and one of the founders of Zilics Healthcare Information Systems."

    "In 2010, I started to research and work on Cloud Computing technology and became leader of the LSI-TEC Cloud Computing group. LSI-TEC is a research laboratory in the University of Sao Paulo, one of the best in Brazil. Thus, I became one of the ghost writers behind the popular Cloud Computing Twitter @the_cloud."

    You can see and hear Nardon in a 4-minute documentary on Java and the Brazilian health care system produced by Sun Microsystems. And you can listen to a September 2010 podcast with Nardon and her fellow Brazilian Java Champion Bruno Souza (known in Brazil as "Java Man") here, at 11:10 minutes into the podcast. Next year, I hope to be reporting from Brazil at JavaOne!

    Read the article

  • Resources such as libraries, engines and frameworks to make a JavaScript-based MMORTS? [closed]

    - by hhh
    I am looking for resources to make an MMORTS with JavaScript on the client side, probably just a simple canvas frontend. The guy in the video here mentions that JavaScript is one of the most misunderstood languages -- and I do believe that. I think one can make quite cool games with it in the future. So I am now proactively looking for resources and perhaps some ideas. My first idea involved Node.js, C and NetBSD/bozohttpd (or the-4-7-chars' *ix-thing with green-logo -thing, move the q here), but I acknowledge my beginner-style approach -- this issue is broad and not for only one person to make into an all-the-time-improved project! So I think it is perfect for the community to tinker with.

    Some games and examples possibly easy to make into an MMORTS:
    - BrowserQuest, here, under MPL 2.0, with its content licensed under CC-BY-SA 3.0 (source here)
    - [proprietary] LoU, here, built with JS/Qooxdoo/C#/Windows Server/IIS/etc. (source)

    MY ANSWER BEGINS HERE, TO BE MOVED BELOW, REQUIRING RE-OPENING. PLEASE VOTE TO OPEN IT -- HELP US TINKER!

    My answer

    Generic:
    - Is there an MMO-related research body?
    - Although about Android, certain things also apply to a JS game: Are there any 2D gaming libraries/frameworks/engines for Android?
    - Why is it so hard to develop an MMO?
    - Browser based MMO Architecture
    - MMO architecture - Highly Scalable with Reporting capabilities
    - What are the Elements of an MMO Game?
    - Is this the right architecture for our MMORPG mobile game?
    - Looking for architectures to develop massive multiplayer game server
    - Information on seamless MMO server architecture

    Game mechanics (search):
    - Question sounding like it is about LoU: What are the different ways to balance an online multiplayer game where users spend different amounts of time online?
    - Building an instance system

    Hosting:
    - Is it possible to make an MMO starting with scalable hosting?
    - Should I keep the login server apart from the game server?
    - MMO techniques, algorithms and resources for keeping bandwidth low?
    - MMO Proxy Server

    JavaScript and client-based things:
    - What do I need to do an MMORTS in JavaScript with a small number of developers?
    - How to update the monsters in my MMO server using Node.js and Socket.IO
    - Are there any good HTML5 MMO design tutorials?

    Networking:
    - Loadbalancing Questions
    - Something about TCP, routers, NAT, etc.: How do I start writing an MMO game server?
    - Who does the AI calculations in an MMO?
    - They need someone more knowledgeable to work with; there are a lot of cases where the same words mean different things.

    Data structures:
    - What data structure should I use for a Diablo/WoW-style talent tree?

    Game engines:
    - Need an engine for MMO mockup

    Helper sites:
    - http://www.gamedev.net/page/index.html

    Read the article

  • AxCMS.net 10 with Microsoft Silverlight 4 and Microsoft Visual Studio 2010

    - by Axinom
    Axinom, a European WCM vendor, today announced the next version of its WCM solution, AxCMS.net 10, which streamlines the processes involved in creating, managing and distributing corporate content on the internet. The new solution helps reduce the ongoing costs of managing and distributing content to large audiences, while at the same time drastically reducing time-to-market and one-time setup costs. http://www.AxCMS.net

    Axinom's WCM portfolio, based on the Microsoft .NET Framework 4, Microsoft Visual Studio 2010 and Microsoft Silverlight 4, allows enterprises to increase process efficiency, reduce operating costs and more effectively manage the delivery of rich media assets on the Web and mobile devices. Axinom solutions are widely used by major European online brands in the IT, telco, retail, media and entertainment industries, such as Siemens, American Express, Microsoft Corp., ZDF, Pro7Sat1 Media, and Deutsche Post.

    Brand New User Interface Built with Silverlight 4: Using Silverlight 4, Axinom's team created a new user interface for AxCMS.net 10 that is optimized for improved usability and speed. WYSIWYG mode, an integrated image editor, extended list views, and detail views of objects substantially accelerate typical editor tasks. Axinom's team worked with the Silverlight Rough Cut Editor for video management and the Silverlight Analytics Framework for extended reporting to complete the wide range of capabilities included in the new release. "Axinom's release of AxCMS.net 10 enables developers to take advantage of the latest features in Silverlight 4," said Brian Goldfarb, director of the developer platform group at Microsoft Corp. "Microsoft is excited about the opportunity this creates for Web developers to streamline the creating, managing and distributing of online corporate content using AxCMS.net 10 and Silverlight."

    Rapid Web Development with Visual Studio 2010: AxCMS.net 10 is extended by additional products that enable developers to get productive quickly and help solve typical customer scenarios. AxCMS.net template projects come with documented source code that helps kick-start projects and teaches best practices in all aspects of Web application development. AxCMS.net overcomes many hard-to-solve technical obstacles out of the box by providing a set of ready-to-use vertical solutions such as corporate Web site, Web shop, Web campaign management, email marketing, multi-channel distribution, management of rich Internet applications, and Web business intelligence.

    Extended Multi-Site Management: AxCMS.net has long supported the management of an unlimited number of Web sites. Version 10 further improves multi-site management and provides features to editors and developers that simplify and accelerate multi-site and multi-language management. The extended publication workflow takes into account additional dependencies of dynamic objects, pages, and documents.

    "Customer requests have evolved from static HTML pages to dynamic Web application content, with rich media assets seamlessly combined across many channels including Web, mobile and IPTV. With the .NET Framework 4 and Silverlight 4, we're on the fast track to making the three-screen strategy a reality for our customers," said Damir Tomicic, CEO of Axinom Group. "Our customers enjoy substantial competitive advantages from using the latest Microsoft technologies. We have a long-standing relationship with Microsoft and are committed to continued development using Microsoft tools and technologies to deliver innovative Web solutions in the future."

    Read the article

  • Next Phase of ECM 11g Now Available - New UCM & URM 11g, & Updated I/PM & IRM 11g

    - by michelle.huff
    We're excited to announce that Oracle Enterprise Content Management Suite 11g is now available! Today, Oracle announced ECM Suite 11g, part of the Fusion Middleware 11gR1 Patchset 2 release, which builds upon the Imaging and Process Management (I/PM) and Information Rights Management (IRM) 11g releases from earlier this year. Universal Content Management (UCM) and Universal Records Management (URM) 11g are now available with many new features and enhancements. All ECM products are localized into 27 languages, use a single repository, a single installer and centralized administration, and all run on the same Fusion Middleware tech stack. Oracle ECM Suite 11g is better integrated to fit the way you work, with extreme performance and extreme scalability.

    Universal Content Management
    - One-click Web content management: brings Web content management authoring, design and presentation capabilities directly into how organizations design sites, portals, and custom Web applications. Simply take in the right amount of WCM to meet your needs - all without having to rewrite the application or port it over to a new technology stack or framework.
    - Greater business user empowerment: next-generation desktop integrations and "smart productivity folders", a new Web site "design mode" for business users, and enhanced rich media support, enabling users to better work with the photography, graphics, videos and podcasts created today, as well as contribute content within Flash files directly from the Web.
    - Advanced manageability with extreme performance and scalability: centralized system monitoring, installation, logging, performance metrics and diagnostics, with new built-in "fast check-in" features and a redesigned component management interface - all running on Fusion Middleware infrastructure.

    Universal Records Management
    - Enhanced user experience: Oracle URM 11g makes records management easier for both business users and records administrators. Simplifications in the end-user experience allow the creation of bookmarks into often-used parts of the file plan, easy copying of categories and dispositions, and integrated folder and records search. The records management dashboard provides a consolidated view into records administrator tasks and system performance.
    - DoD 5015.02 v3: Oracle URM is fully certified against all parts of the US Department of Defense records management standard - baseline, classified, and Freedom of Information and Privacy Act. This enables federal, state and local governments and public agencies, as well as private companies, to maintain regulated compliance.
    - Expanded functionality through Oracle integrations: Oracle URM 11g allows for an expanded set of functionality through integration capabilities with other Oracle products. This includes configurable records definition capabilities directly within a UCM instance. An out-of-the-box integration with Oracle BI Publisher provides easily configured and robust reporting. Additionally, 11g offers an out-of-the-box Oracle Secure Enterprise Search integration, enabling real-time full-text discovery across disparate systems in an organization.

    Read the Press Release | Watch the 3 Minute ECM 11g Video | Get Up to Speed with the What's New in ECM Suite Datasheet | Learn More on OTN with new tutorials, downloads and whitepapers

    Read the article

  • ASP.NET Frameworks and Raw Throughput Performance

    - by Rick Strahl
    A few days ago I had a curious thought: with all these different technologies that the ASP.NET stack has to offer, what's the most efficient technology overall to return data for a server request? When I started this it was mere curiosity rather than a real practical need or result. Different tools are used for different problems, so performance differences are to be expected. But still, I was curious to see how the various technologies performed relative to each other, just for raw throughput of the request getting to the endpoint and back out to the client, with as little processing in the actual endpoint logic as possible (aka Hello World!).

    I want to clarify that this is merely an informal test for my own curiosity, and I'm sharing the results and process here because I thought it was interesting. It's been a long while since I've done any sort of perf testing on ASP.NET, mainly because I've not had extremely heavy load requirements, and because overall ASP.NET performs very well even for fairly high loads, so that often it's not that critical to test load performance. This post is not meant to make a point or even come to a conclusion about which tech is better, but just to act as a reference to help understand some of the differences in perf and give a starting point to play around with this yourself. I've included the code for this simple project, so you can play with it and maybe add a few additional tests for different things if you like. Source Code on GitHub.

    I looked at this data for these technologies:
    - ASP.NET Web API
    - ASP.NET MVC
    - WebForms
    - ASP.NET WebPages
    - ASMX AJAX services (couldn't get AJAX/JSON to run on IIS 8)
    - WCF REST
    - Raw ASP.NET HttpHandlers

    It's quite a mixed bag, of course, and the technologies target different types of development. What started out as mere curiosity turned into a bit of a head-scratcher, as the results were sometimes surprising. What I describe here is more to satisfy my curiosity than anything, and I thought it interesting enough to discuss on the blog :-)

    First test: Raw Throughput

    The first thing I did was test raw throughput for the various technologies. This is the least practical test, of course, since you're unlikely to ever create the equivalent of a 'Hello World' request in a real-life application. The idea here is to measure how much time a 'NOP' request takes to return data to the client. So for this I created the simplest Hello World request I could come up with for each tech.

    HTTP Handler

    The first is the lowest-level approach, which is an HTTP handler.

        public class Handler : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                context.Response.ContentType = "text/plain";
                context.Response.Write("Hello World. Time is: " + DateTime.Now.ToString());
            }

            public bool IsReusable
            {
                get { return true; }
            }
        }

    WebForms

    Next I added a couple of ASPX pages - one using CodeBehind and one using only a markup page. The CodeBehind page simply does this in CodeBehind, without any markup in the ASPX page:

        public partial class HelloWorld_CodeBehind : System.Web.UI.Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                Response.Write("Hello World. Time is: " + DateTime.Now.ToString());
                Response.End();
            }
        }

    while the markup page only contains some static output via an expression:

        <%@ Page Language="C#" AutoEventWireup="false" CodeBehind="HelloWorld_Markup.aspx.cs" Inherits="AspNetFrameworksPerformance.HelloWorld_Markup" %>
        Hello World. Time is <%= DateTime.Now %>

    ASP.NET WebPages

    WebPages is the freestanding Razor implementation of ASP.NET. Here's the simple HelloWorld.cshtml page:

        Hello World @DateTime.Now

    WCF REST

    WCF REST was the token REST implementation for ASP.NET before Web API, and the in-between step from ASP.NET AJAX. I'd like to forget that this technology was ever considered for production use, but I'll include it here. Here's an OperationContract class:

        [ServiceContract(Namespace = "")]
        [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
        public class WcfService
        {
            [OperationContract]
            [WebGet]
            public Stream HelloWorld()
            {
                var data = Encoding.Unicode.GetBytes("Hello World " + DateTime.Now.ToString());
                var ms = new MemoryStream(data);

                // Add your operation implementation here
                return ms;
            }
        }

    WCF REST can return arbitrary results by returning a Stream object and a content type. The code above turns the string result into a stream and returns that back to the client.

    ASP.NET AJAX (ASMX Services)

    I also wanted to test ASP.NET AJAX services, because prior to Web API this was probably the most widely used AJAX technology for the ASP.NET stack. Unfortunately I was completely unable to get this running on my Windows 8 machine. Visual Studio 2012 removed adding of ASP.NET AJAX services, and when I tried to manually add the service and configure the script handler references it simply did not work - I always got a SOAP response for GET and POST operations. No matter what I tried, I always ended up getting XML results, even when explicitly adding the ScriptHandler. So I didn't test this (but the code is there - you might be able to test it on a Windows 7 box).

    ASP.NET MVC

    Next up is probably the most popular ASP.NET technology at the moment: MVC. Here's the small controller:

        public class MvcPerformanceController : Controller
        {
            public ActionResult Index()
            {
                return View();
            }

            public ActionResult HelloWorldCode()
            {
                return new ContentResult() { Content = "Hello World. Time is: " + DateTime.Now.ToString() };
            }
        }

    ASP.NET Web API

    Next up is Web API, which looks kind of similar to MVC. Except here I have to use a StringContent result to return the response:

        public class WebApiPerformanceController : ApiController
        {
            [HttpGet]
            public HttpResponseMessage HelloWorldCode()
            {
                return new HttpResponseMessage()
                {
                    Content = new StringContent("Hello World. Time is: " + DateTime.Now.ToString(), Encoding.UTF8, "text/plain")
                };
            }
        }

    Testing

    Take a minute to think about each of the technologies… and take a guess which you think is most efficient in raw throughput. The fastest should be pretty obvious, but the others - maybe not so much. The testing I did is pretty informal, since it was mainly to satisfy my curiosity. Here's how I did it: I used Apache Bench (ab.exe) from a full Apache HTTP installation to run and log the test results of hitting the server. ab.exe is a small executable that lets you hit a URL repeatedly and provides counter information about the number of requests, requests per second, etc. ab.exe and the batch file are located in the \LoadTests folder of the project. An ab.exe command line looks like this:

        ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorld

    which hits the specified URL 100,000 times with a load factor of 20 concurrent requests and produces a summary of total requests, requests per second and timing. It's a great way to get a quick and dirty performance summary. Run it a few times to make sure there's not a large amount of variance. You might also want to do an IISRESET to clear the Web server. Just make sure you do a short test run to warm up the server first - otherwise your first run is likely to be skewed downwards. ab.exe also allows you to specify headers and provide POST data and many other things if you want to get a little more fancy. Here all tests are GET requests to keep it simple. I ran each test with:
    - 100,000 iterations
    - A load factor of 20 concurrent connections
    - An IISRESET before starting
    - A short warm-up run for API and MVC to make sure startup cost is mitigated

    Here is the batch file I used for the test:

        IISRESET

        REM make sure you add
        REM C:\Program Files (x86)\Apache Software Foundation\Apache2.2\bin
        REM to your path so ab.exe can be found

        REM Warm up
        ab.exe -n100 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldJson
        ab.exe -n100 -c20 http://localhost/aspnetperf/api/HelloWorldJson
        ab.exe -n100 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorld

        ab.exe -n100000 -c20 http://localhost/aspnetperf/handler.ashx > handler.txt
        ab.exe -n100000 -c20 http://localhost/aspnetperf/HelloWorld_CodeBehind.aspx > AspxCodeBehind.txt
        ab.exe -n100000 -c20 http://localhost/aspnetperf/HelloWorld_Markup.aspx > AspxMarkup.txt
        ab.exe -n100000 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorld > Wcf.txt
        ab.exe -n100000 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldCode > Mvc.txt
        ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorld > WebApi.txt

    I ran each of these tests 3 times and took the average score for requests/second, with the machine otherwise idle. I did see a bit of variance when running many tests, but the values used here are the medians. Part of this has to do with the fact that I ran the tests on my local machine - results would probably be more consistent running the load test on a separate machine hitting across the network. I ran these tests locally on my laptop, which is a Dell XPS with a quad-core Sandy Bridge i7-2720QM @ 2.20GHz and a fast SSD drive, on Windows 8. CPU load during tests ran to about 70% max across all 4 cores (IOW, it wasn't overloading the machine). Ideally you would run these tests from a separate machine hitting the local machine. If I remember correctly, IIS 7 and 8 on client OSs don't throttle, so the performance here shouldn't be artificially limited.

    Results

    OK, let's cut straight to the chase. It's not surprising that the handler was fastest. But it was a bit surprising to me that the next fastest was WebForms, and especially Web Forms with markup over a CodeBehind page. WebPages also fared fairly well. MVC and Web API are a little slower, and the slowest by far is WCF REST (which, again, I find surprising). As mentioned at the start, the raw throughput tests are not overly practical, as they don't test scripting performance for the HTML generation engines or serialization performance of the data engines. All it really does is give you an idea of the raw throughput for the technology, from the time of the request to reaching the endpoint and returning minimal text data back to the client, which indicates full round-trip performance. But it's still interesting to see that Web Forms performs better in throughput than MVC, Web API or WebPages. It'd be interesting to try this with a few pages that actually have some parsing logic on them, but that's beyond the scope of this throughput test. What's also amazing about this test is the sheer amount of traffic that a laptop computer is handling. Even the slowest tech managed 5,700 requests a second, which is one hell of a lot of requests if you extrapolate that out over a 24-hour period. Remember these are not static pages, but dynamic requests that are being served.

    Another test - JSON Data Service Results

    For the second test I used a JSON result from several of the technologies. I didn't bother running WebForms and WebPages through this test, since it doesn't make a ton of sense to return data from them (OTOH, returning text from the APIs didn't make a ton of sense either :-). In these tests I have a small Person class that gets serialized and then returned to the client. The Person class looks like this:

        public class Person
        {
            public Person()
            {
                Id = 10;
                Name = "Rick";
                Entered = DateTime.Now;
            }

            public int Id { get; set; }
            public string Name { get; set; }
            public DateTime Entered { get; set; }
        }

    Here are the updated handler classes that use Person:

    Handler

        public class Handler : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                var action = context.Request.QueryString["action"];
                if (action == "json")
                    JsonRequest(context);
                else
                    TextRequest(context);
            }

            public void TextRequest(HttpContext context)
            {
                context.Response.ContentType = "text/plain";
                context.Response.Write("Hello World. Time is: " + DateTime.Now.ToString());
            }

            public void JsonRequest(HttpContext context)
            {
                var json = JsonConvert.SerializeObject(new Person(), Formatting.None);
                context.Response.ContentType = "application/json";
                context.Response.Write(json);
            }

            public bool IsReusable
            {
                get { return true; }
            }
        }

    This code adds a little logic to check for an action query string and route the request to an optional JSON result method. To generate JSON, I'm using the same JSON.NET serializer (JsonConvert.SerializeObject) used in Web API to create the JSON response.

    WCF REST

        [ServiceContract(Namespace = "")]
        [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
        public class WcfService
        {
            [OperationContract]
            [WebGet]
            public Stream HelloWorld()
            {
                var data = Encoding.Unicode.GetBytes("Hello World " + DateTime.Now.ToString());
                var ms = new MemoryStream(data);

                // Add your operation implementation here
                return ms;
            }

            [OperationContract]
            [WebGet(ResponseFormat = WebMessageFormat.Json, BodyStyle = WebMessageBodyStyle.WrappedRequest)]
            public Person HelloWorldJson()
            {
                // Add your operation implementation here
                return new Person();
            }
        }

    For WCF REST, all I have to do is add a method with the Person result type.

    ASP.NET MVC

        public class MvcPerformanceController : Controller
        {
            //
            // GET: /MvcPerformance/
            public ActionResult Index()
            {
                return View();
            }

            public ActionResult HelloWorldCode()
            {
                return new ContentResult() { Content = "Hello World. Time is: " + DateTime.Now.ToString() };
            }

            public JsonResult HelloWorldJson()
            {
                return Json(new Person(), JsonRequestBehavior.AllowGet);
            }
        }

    For MVC, all I have to do for a JSON response is return a JSON result. ASP.NET internally uses JavaScriptSerializer.

    ASP.NET Web API

        public class WebApiPerformanceController : ApiController
        {
            [HttpGet]
            public HttpResponseMessage HelloWorldCode()
            {
                return new HttpResponseMessage()
                {
                    Content = new StringContent("Hello World. Time is: " + DateTime.Now.ToString(), Encoding.UTF8, "text/plain")
                };
            }

            [HttpGet]
            public Person HelloWorldJson()
            {
                return new Person();
            }

            [HttpGet]
            public HttpResponseMessage HelloWorldJson2()
            {
                var response = new HttpResponseMessage(HttpStatusCode.OK);
                response.Content = new ObjectContent<Person>(new Person(), GlobalConfiguration.Configuration.Formatters.JsonFormatter);
                return response;
            }
        }

    Testing and Results

    To run these data requests I used the following ab.exe commands:

        REM JSON RESPONSES
        ab.exe -n100000 -c20 http://localhost/aspnetperf/Handler.ashx?action=json > HandlerJson.txt
        ab.exe -n100000 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldJson > MvcJson.txt
        ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorldJson > WebApiJson.txt
        ab.exe -n100000 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorldJson > WcfJson.txt

    The results from this test run are a bit interesting in that the Web API test improved performance significantly over returning plain string content. The performance for each technology drops a little bit, except for Web API, which is up quite a bit! From this test it appears that Web API actually performs significantly better returning a JSON response rather than a plain string response.

    Snag with Apache Benchmark and 'Length Failures'

    I ran into a little snag with Apache Benchmark, which was reporting failures for my Web API requests when serializing. As the numbers show, Web API performance improved significantly with JSON results, from around 5,580 to 6,530 requests a second, which is a 15% improvement (while all the others slowed down by 3-8%). However, I was skeptical at first, because the Web API test reports showed a bunch of errors on about 10% of the requests - a failed request count. What the hey? Is Web API failing on roughly 10% of requests when sending JSON? Turns out: no, it's not! But it took some sleuthing to figure out why it reports these failures. At first I thought that Web API was failing, so to make sure, I re-ran the test with Fiddler attached, running the ab.exe test through it by using the -X switch:

        ab.exe -n100 -c10 -X localhost:8888 http://localhost/aspnetperf/api/HelloWorldJson

    which showed that indeed all requests were returning proper HTTP 200 results with full content. However, ab.exe was still reporting the errors. On closer inspection it turned out that the dates, varying in size, altered the response length of the dynamic output. For example, these two results:

        {"Id":10,"Name":"Rick","Entered":"2012-09-04T10:57:24.841926-10:00"}
        {"Id":10,"Name":"Rick","Entered":"2012-09-04T10:57:24.8519262-10:00"}

    differ in the length of the fractional-seconds value, which results in 68 and 69 bytes respectively. The same URL produces different result lengths, which is what ab.exe reports. I didn't notice it at first, but the same thing happens when running the ASHX handler with the JSON.NET result, since it uses the same serializer that varies the milliseconds. Moral: you can typically ignore length failures in Apache Benchmark, and when in doubt, check the actual output with Fiddler. Note that the other failure values are accurate, though.

    Another interesting side note: Perf drops over time

    As I was running these tests repeatedly, I was finding that performance steadily dropped from a startup peak to a 10-15% lower stable level. IOW, with Web API I'd start out with around 6,500 req/sec, and in subsequent runs it kept dropping until it would stabilize somewhere around 5,900 req/sec, occasionally jumping lower. For these tests, this is why I did the IISRESET and warm-up for individual tests. This is a little puzzling. Looking at Process Monitor while the tests are running, memory very quickly levels out, as do handles and threads, on the first test run. On subsequent runs everything stays stable, but the performance starts going downwards. This applies to all the technologies - handlers, Web Forms, MVC, Web API. I'm curious to see whether others test this and see similar results. Doing an IISRESET then resets everything, and performance starts off at peak again…

    Summary

    As I stated at the outset, these tests were informal, done to satisfy my curiosity, not to prove that any technology is better or even faster than another. While there clearly are differences in performance, the differences (other than WCF REST, which was by far the slowest, and the raw handler, which was by far the fastest) are relatively minor, so there is no need to feel that any one technology is a runaway standout in raw performance. Choosing a technology is about more than pure performance; it is also about adequateness for the job and ease of implementation. The strengths of each technology will make up for any minor performance difference we see in these tests. However, to me it's important to get an occasional reality check and compare where new technologies are heading. Oftentimes old stuff that's been optimized and designed for a time of less horsepower can utterly blow the doors off newer tech, and simple checks like this let you compare. Luckily we're seeing that much of the new stuff performs well even in V1.0, which is great. To me it was very interesting to see Web API perform relatively badly with plain string content, which originally led me to think that Web API might not be properly optimized just yet. For those that caught my tweets late last week regarding Web API's slow responses: that was with string content, which is in fact considerably slower. Luckily, where it counts, with serialized JSON and XML, Web API actually performs better. But I do wonder what would make generic string content slower than serialized code? This stresses another point: don't take a single test as the final gospel, and don't extrapolate out from a single set of tests. Certainly Twitter can make you feel like a fool when you post something immediate that hasn't been fleshed out a little more <blush>. Egg on my face. As a result I ended up screwing around with this for a few hours today to compare different scenarios. Well worth the time… I hope you found this useful, if not for the results, then maybe for the process of quickly testing a few requests for performance and charting out a comparison. Now onwards with more serious stuff…

    Resources
    - Source Code on GitHub
    - Apache HTTP Server Project (ab.exe is part of the binary distribution)

    © Rick Strahl, West Wind Technologies, 2005-2012. Posted in ASP.NET, Web API.
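    One footnote to the length-failure snag above: if you want ab.exe's length check to stay meaningful for JSON endpoints, you can pin dates to a fixed-width format. A minimal sketch with JSON.NET's IsoDateTimeConverter (the format string is my choice; treat the wiring as an assumption, not part of the original test project):

        using System;
        using Newtonsoft.Json;
        using Newtonsoft.Json.Converters;

        class FixedLengthDates
        {
            static void Main()
            {
                var settings = new JsonSerializerSettings();

                // Always emit exactly three fractional-second digits and a
                // +HH:mm offset, so the same object serializes to the same
                // byte length on every request.
                settings.Converters.Add(new IsoDateTimeConverter
                {
                    DateTimeFormat = "yyyy-MM-ddTHH:mm:ss.fffzzz"
                });

                var json = JsonConvert.SerializeObject(
                    new { Id = 10, Name = "Rick", Entered = DateTime.Now }, settings);
                Console.WriteLine(json);
            }
        }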

    Read the article

  • Scrum for Team Foundation Server 2010

    - by Martin Hinshelwood
    I will be presenting a session on "Scrum for TFS 2010" not once, but twice! If you are going to be at the Aberdeen Partner Group meeting on 27th April, or at DDD Scotland on 8th May, then you may be able to catch my session. Credit: I want to give special thanks to Aaron Bjork from Microsoft, who provided me with most of my material. He is a Scrum and PowerPoint genius.

    Updated 9th May 2010 - I have now presented at both of these sessions and posted about it.

    Scrum for Team Foundation Server 2010 - Synopsis

    Visual Studio ALM (formerly Visual Studio Team System (VSTS)) and Team Foundation Server (TFS) are the cornerstones of development on the Microsoft .NET platform. These are the best tools for a team to have successful projects and for the developers to have a focused and smooth software development process. For TFS 2010, Microsoft is heavily investing in Scrum and has already started moving some teams across to using it. Martin will not be going in depth with Scrum, but you can find out more about Scrum by reading the Scrum Guide, and you can even assess your Scrum knowledge by having a go at the Scrum Open Assessment. You can also read SSW's Rules to Better Scrum using TFS, which have been developed during our own Scrum implementations.

    Come and see Martin Hinshelwood, Visual Studio ALM MVP and Solution Architect from SSW, show you:
    - How to successfully gather requirements with user stories
    - How to plan a project using TFS 2010 and Scrum
    - How to work with a product backlog in TFS 2010
    - The right way to plan a sprint with TFS 2010
    - Tracking your progress
    - The right way to use work items
    - What you can use from the built-in reporting, as well as the project portals available from the SharePoint dashboard
    - The important reports to give your Product Owner / Project Manager

    Walk away knowing how to see the project health and progress. Visual Studio ALM is designed to help address many of the traditional problems faced by teams. It does so by providing a set of integrated tools to help teams improve their software development activities and to help managers better support the software development processes. During this session we will cover the lifecycle of creating work items and how this fits into Scrum using Visual Studio ALM and Team Foundation Server.

    If you want to know more about how to do Scrum with TFS, there is a new course, created in collaboration with Microsoft and Scrum.org, that is going to be the official course for working with TFS 2010. SSW has Professional Scrum Developer trainers who specialise in training your developers in implementing Scrum with Microsoft's Visual Studio ALM tools. Ken Schwaber and Sam Guckenheimer: Professional Scrum Development.

    Technorati Tags: Scrum, VS ALM, VS 2010, TFS 2010

    Read the article

  • From Trailer to Cloud: Skire acquisition expands Oracle’s on-demand project management options.

    - by Melissa Centurio Lopes
    By Alison Weiss

    Whether building petrochemical facilities in the Middle East or managing mining operations in Australia, project managers face significant challenges. Local regulations and currencies, contingent labor, hybrid public/private funding sources, and more threaten project budgets and schedules.

    According to Mike Sicilia, senior vice president and general manager for the Oracle Primavera Global Business Unit, trillions of dollars will be invested in industrial projects around the globe between 2012 and 2016. But even with so much at stake, project leads don't always have time to look for new and better enterprise project portfolio management (EPPM) software solutions to manage large-scale capital initiatives across the enterprise.

    Oracle's recent acquisition of Skire, a leading provider of capital program management and facilities management applications available both in the cloud and on premises, gives customers outstanding new EPPM options. By combining Skire's cloud-based solutions for managing capital projects, real estate, and facilities with Oracle's Primavera EPPM solutions, project managers can quickly get a solution running that is interoperable across an extended enterprise. "Staff can access the EPPM solution within days, rather than waiting for corporate IT to put technology in place," says Sicilia.

    This applies to a problem that has, according to Sicilia, bedeviled project managers for decades: extending EPPM functionality into the field. Frequently, large-scale projects are remotely located, and the lack of communications and IT infrastructure threatens the accuracy of project reporting and scheduling.

    Read the full version of this article in the November 2012 edition of Oracle's Profit Magazine: Special Report on Project Management

    Read the article

  • Guidance and Pricing for MSDN 2010

    - by John Alexander
    Sorry for the rather lengthy post here. I get asked this all the time, so I decided to post it… Visual Studio 2010 editions will be available on April 12, 2010. The feature matrix below covers five products: Professional with MSDN Essentials, Professional with MSDN, Premium with MSDN, Ultimate with MSDN, and Test Professional with MSDN.

    Debugging and Diagnostics: IntelliTrace (Historical Debugger); Static Code Analysis; Code Metrics; Profiling; Debugger

    Testing Tools: Unit Testing; Code Coverage; Test Impact Analysis; Coded UI Test; Web Performance Testing; Load Testing (1); Microsoft Test Manager 2010; Test Case Management (2); Manual Test Execution; Fast-Forward for Manual Testing; Lab Management Configuration (3)

    Integrated Development Environment: Multiple Monitor Support; Multi-Targeting; One-Click Web Deployment; JavaScript and jQuery Support; Extensible WPF-Based Environment

    Database Development: Database Deployment; Database Change Management (2); Database Unit Testing; Database Test Data Generation; Data Access

    Development Platform Support: Windows Development; Web Development; Office and SharePoint Development; Cloud Development; Customizable Development Experience

    Architecture and Modeling: Architecture Explorer; UML 2.0 Compliant Diagrams (Activity, Use Case, Sequence, Class, Component); Layer Diagram and Dependency Validation; Read-Only Diagrams (UML, Layer, DGML Graphs)

    Lab Management: Virtual Environment Setup & Tear Down (3); Provision Environment from Template (3); Checkpoint Environment (3)

    Team Foundation Server: Version Control (2); Work Item Tracking (2); Build Automation (2); Team Portal (2); Reporting & Business Intelligence (2); Agile Planning Workbook (2); Microsoft Visual Studio Team Explorer 2010; Test Case Management (2)

    MSDN Subscription - Software and Services for Production Use: Windows Azure Platform compute hours per month - Essentials: 20, Professional: 50, Premium: 100, Ultimate: 250, Test Professional: n/a (†); Microsoft Visual Studio Team Foundation Server 2010; Team Foundation Server 2010 CAL (1); Microsoft Expression Studio 3; Microsoft Office Professional Plus 2010, Project Professional 2010, Visio Premium 2010 (following the Office 2010 launch)

    MSDN Subscription - Software for Development and Testing (4): Windows 7, Windows Server 2008 R2 and SQL Server 2008; Toolkits, Software Development Kits and Driver Development Kits; previous versions of Windows (client and server operating systems); previous versions of Microsoft SQL Server; Microsoft Office; Microsoft Dynamics; all other servers; Windows Embedded operating systems; Teamprise

    MSDN Subscription - Other Benefits: technical support incidents - Essentials: 0, Professional: 2, Premium: 4, Ultimate: 4, Test Professional: 2; priority support in MSDN Forums; Microsoft e-learning collections (typically 10 courses or 20 hours) - Essentials: 0, Professional: 1, Premium: 2, Ultimate: 2, Test Professional: 1; MSDN Flash newsletter; MSDN Online Concierge; MSDN Magazine

    Pricing (MSRP):
    - Professional with MSDN Essentials: buy from $799; renew from $549 (upgrade)
    - Professional with MSDN: buy from $1,199; renew from $799
    - Premium with MSDN: buy from $5,469; renew from $2,299
    - Ultimate with MSDN: buy from $11,899; renew from $3,799
    - Test Professional with MSDN: buy from $2,169; renew from $899

    † Availability varies by country and subscription level. Details are available on the MSDN site.
    1. May require one or more Microsoft Visual Studio Load Test Virtual User Pack 2010.
    2. Requires Team Foundation Server and a Team Foundation Server CAL.
    3. Requires Microsoft Visual Studio Lab Management 2010.
    4. Per-user license allows unlimited installations and use for designing, developing, testing, and demonstrating applications.

    UML is a registered trademark of Object Management Group, Inc. Windows is either a registered trademark or a trademark of Microsoft Corporation in the United States and/or other countries.

    Read the article

  • Customization: It’s Wanted in Enterprise Tech Platforms Too

    - by Mike Stiles
    Did you know that every customer service person does their job the exact same way in every business organization? And did you know that every business organization cares about the exact same metrics? I hope not, because both of those things couldn't be farther from the truth. And if there are different needs and approaches in different enterprises, it stands to reason technology platforms must become increasingly customizable.

    Oracle Social Cloud sees that coming and is doing something about it, at least in terms of social media management. Today we introduce Social Station, a customizable user experience workspace within the Oracle Social Relationship Management (SRM) platform. We think a lot about customer-centricity and customer experience around here, and we know our own customers are ready to start moving forward in being able to set up their work environments in the ways that work best for them. That kind of thing increases productivity, helps deliver on social objectives faster, and generally just makes life more pleasant. A recent IDG Enterprise report says that enterprises currently investing in more consumerized, easy-to-use technologies experience a 56% increase in employee productivity and a 46% increase in customer satisfaction. Imagine that. When you make it easier and more pleasant for employees to help customers, more customers get helped and everyone ends up happier.

    So what does this Social Station do and what does it mean, exactly? It's an innovative move to take some pretty high-end tech (take a bow, developers) and simplify it, making things more intuitive:
    - Drag and drop lets you easily build out and personalize your social workspace with different modules.
    - The new Custom Analytics module can mix and match over 120 metrics with thousands of customizable reporting options. You can check constantly refreshed updates and keep a real-time eye on the numbers you're trying to move.
    - One-click sharing and annotation in the Custom Analytics module improves sharing and collaboration across teams, departments and executives.
    - Multi-view layout helps you leverage social insights by letting you monitor conversations by network, stream, metric, graph type, date range, and relative time period.
    - The Enhanced Calendar is a better visual representation of content, posts, networks and views, letting you easily toggle between functions and views.

    The Oracle Social Station sets us up to keep developing and launching additional social modules for you, covering areas like content curation, influencer engagement, and command center creation. Oracle Social Cloud Group VP Meg Bear says, "Consumers today have high expectations of their technology application capabilities and usability, and those expectations don't stop when they enter their workplaces." In other words, internal enterprise technology platforms must reflect the personalization and customization being called for in consumer products and marketing. "One size fits all" is becoming an endangered concept.

    @mikestiles @oraclesocial

    Read the article

  • Configure 27" 2560x1440 for a monitor with corrupt EDID

    - by Aras
    I am trying to get a monitor to work with my Ubuntu laptop. The monitor is one of those cheap 27" Korean monitors with a 2560x1440 resolution -- and nothing else. Here are its specifications:

    - 2560x1440 @ 60Hz
    - Only one dual-link DVI-D input -- no other input port (no HDMI or DisplayPort)
    - No OSD
    - No scaler
    - Reports a corrupt EDID
    - Does 2560x1440 @ 60Hz, did I say that already?

    Anyway, the monitor works beautifully with my Ubuntu desktop, which has an nVidia card with DVI output. However, I am having problems using this monitor with my laptop. After some searching around, I found a few posts suggesting an active adapter for the Mini DisplayPort, so I went and bought a Mini DisplayPort to dual-link DVI-D adapter. When using this adapter, the monitor is recognized by the nvidia-settings tool, but with incorrect resolution information. As you can see, the monitor is incorrectly recognized and there are no other resolutions available to set. This post on the Ubuntu forums and this other post on Overclock both suggest that the monitor is reporting a corrupt EDID. I have tried following their instructions, but so far I have not been able to display any image on the monitor from my laptop.

    The laptop I am using is an ASUS G75VW with a 1920x1080 screen. It has a VGA port, an HDMI 1.4a port, and a Mini DisplayPort. The graphics card is an nVidia GeForce GTX 660M with 2GB of dedicated memory. I am running Ubuntu 12.10, which I upgraded from 12.04 a few weeks ago.

    As I said, I have tried several suggestions, including specifying a Modeline in xorg.conf and also linking to EDID files I found in those forum posts. However, I am not sure the EDID files I found are suitable for my monitor. I think the solution to my problem consists of obtaining the EDID of my monitor, fixing it, and then modifying xorg.conf to force the nvidia driver to load the correct resolution. However, I am not sure what steps I need to take to do this.

    Here is the part of the sudo xrandr --prop output related to this monitor:

        DP-1 connected 800x600+1920+0 (normal left inverted right x axis y axis) 0mm x 0mm
            SignalFormat: DisplayPort
                supported: DisplayPort
            ConnectorType: DisplayPort
            ConnectorNumber: 3 (0x00000003)
            _ConnectorLocation: 3 (0x00000003)
           800x600        60.3*+

    I was expecting to see the EDID in this output, as mentioned in this post, but it is not there. After several hours of tweaking X configurations, I decided it was time to ask for help here. I would really appreciate it if someone with experience regarding EDID and X configuration could give me a hand to solve this issue.
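    From what I can piece together, the xorg.conf override those forum posts describe would look roughly like the sketch below. Everything in it is an assumption on my part, to be adapted: the Modeline is what `cvt -r 2560 1440` generates here, the connector name (DFP-1 vs DP-1) depends on the driver, and /etc/X11/edid.bin stands in for a known-good EDID dump (for example, one captured from the working desktop via nvidia-settings).

        Section "Monitor"
            Identifier "Korean27"
            # Reduced-blanking mode from `cvt -r 2560 1440` (assumed timings)
            Modeline "2560x1440R" 241.50 2560 2608 2640 2720 1440 1443 1448 1481 +hsync -vsync
        EndSection

        Section "Device"
            Identifier "nvidia"
            Driver     "nvidia"
            # Feed the driver a good EDID dump instead of the monitor's corrupt one
            Option     "CustomEDID" "DFP-1:/etc/X11/edid.bin"
            # Relax validation so the non-EDID mode above is accepted
            Option     "ModeValidation" "DFP-1: NoEdidModes, AllowNonEdidModes"
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Device     "nvidia"
            Monitor    "Korean27"
            SubSection "Display"
                Modes "2560x1440R"
            EndSubSection
        EndSection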

    Read the article

  • Have you really fixed that problem?

    - by DavidWimbush
    The day before yesterday I saw our main live server's CPU go up to a constant 100% with just the occasional short drop to a lower level. The exact opposite of what you'd want to see. We're log shipping every 15 minutes and part of that involves calling WinRAR to compress the log backups before copying them over. (We're on SQL 2005 so there's no native compression, and we have bandwidth issues with the connection to our remote site.) I realised the log shipping jobs were taking about 10 minutes and that most of that was spent shipping a 'live' reporting database that is completely rebuilt every 20 minutes. (I'm just trying to keep this stuff alive until I can improve it.) We can rebuild this database in minutes if we have to fail over, so I disabled log shipping of that database. The log shipping went down to less than 2 minutes and I went off to the SQL Social evening in London feeling quite pleased with myself. It was a great evening - fun, educational and thought-provoking. Thanks to Simon Sabin & co for laying that on, and thanks too to the guests for making the effort when they must have been pretty worn out after doing DevWeek all day first. The next morning I came down to earth with a bump: CPU still at 100%. WTF? I looked in the activity monitor, but it was confusing because some sessions have been running for a long time, so it's not a good guide to what's using the CPU right now. I tried the standard reports showing queries by CPU (average and total), but they only show the top 10, so they just show my big overnight archiving and data cleaning stuff. But Profiler showed it was four queries used by our new website usage tracking system. Four simple indexes later the CPU was back where it should be: about 20% with occasional short spikes. So the moral is: even when you're convinced you've found the cause and fixed the problem, you HAVE to go back and confirm that the problem has gone. And, yes, I have checked the CPU again today and it's still looking sweet.
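    Profiler found the culprits here; for reference, here is a hedged sketch (SQL Server 2005-compatible) of how the same answer can be pulled straight from the plan cache, without the top-10 cap of the standard reports:

        -- A sketch only: rank cached statements by total CPU consumed.
        -- Note: stats reset when a plan is recompiled or evicted.
        SELECT TOP (20)
            qs.total_worker_time / 1000 AS total_cpu_ms,
            qs.execution_count,
            SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                ((CASE qs.statement_end_offset
                      WHEN -1 THEN DATALENGTH(st.text)
                      ELSE qs.statement_end_offset
                  END - qs.statement_start_offset) / 2) + 1) AS statement_text
        FROM sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
        ORDER BY qs.total_worker_time DESC;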

    Read the article

  • OBIEE 11.1.1 - How to configure HTTP compression / caching on Oracle BI Mobile app

    - by Ahmed Awan
    Applies to: OBIEE 11.1.1.5. Supported physical devices and OS: the Oracle BI Mobile application with the HTTP compression / caching configuration was tested on the following devices: iPhone 4S, 4 and 3GS, and iPad 2 and 1. Note that these devices must be running a current version of iOS; both iOS 4.2.1 and iOS 5 are supported. Configuration pre-requisites: prior to configuration, the Oracle Web tier software must be installed on the server, as described in the product documentation, i.e. the Enterprise Deployment Guide for Oracle Business Intelligence, Section 3.2, "Installing Oracle HTTP Server." The steps for configuring compression and caching on Oracle HTTP Server are described in the PA blog at http://blogs.oracle.com/pa/entry/obiee_11g_user_interface_ui and in support Doc ID 1312299.1. Configuration steps in the Oracle BI Mobile application:
    1. Download the BI Mobile app from the Apple iTunes App Store: http://itunes.apple.com/us/app/oracle-business-intelligence/id434559909?mt=8
    2. Add a server, for example http://pew801.us.oracle.com:7777/analytics/ (screenshot of the "Server Setting" screen omitted).
    Performance gain test (using Oracle® HTTP Server with OBIEE): the test with and without HTTP compression / caching was conducted on an iPhone 4S / iPad 2 to measure throughput (i.e. total bytes received) for Oracle® Business Intelligence Enterprise Edition. The comparison used the SampleApp "QuickStart" dashboard, accessing the Overview, Details, Published Reporting and Scorecard reports. Testing shows that total bytes received were reduced from 2.3 MB to 723 KB.
    a. Test results without HTTP compression / caching (screenshot of total bytes statistics omitted).
    b. Test results with HTTP compression / caching (screenshot of total bytes statistics omitted).
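    For flavor, here is a hedged sketch of the kind of directives involved; the authoritative steps are in the PA blog and Doc ID 1312299.1 referenced above, and the module paths and the /analytics/res location shown here are assumptions to verify against your own OHS install:

        # Compression: gzip text-based OBIEE responses (mod_deflate)
        LoadModule deflate_module modules/mod_deflate.so
        AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript

        # Caching: let the device cache static analytics resources (mod_expires)
        LoadModule expires_module modules/mod_expires.so
        <LocationMatch "/analytics/res">
            ExpiresActive On
            ExpiresDefault "access plus 7 days"
        </LocationMatch>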

    Read the article

  • Willy Rotstein on Analytics and Social Media in Retail

    - by sarah.taylor(at)oracle.com
    Recently I came across a presentation from Dan Zarrella on "The Science of Retweets" (http://www.slideshare.net/HubSpot/the-science-of-retweets-with-dan-zarrella). It is an insightful, fact-based analysis of how tweets propagate and what makes them successful. The analysis is of course very interesting for those of us interested in Tweeting. However, what really caught my attention is how well it illustrates, from a very different angle, some of the issues I am discussing with retailers these days. In particular, the opportunities that e-commerce and social media open to those retailers with the appetite and vision to tackle the associated analytical challenges. And these challenges are of course not straightforward. In his presentation Dan introduces the concept of Observability. I haven't had the opportunity to discuss his specific definition of the term with Dan. However, in practical retail terms, I would say that it means that through social media (and other web channels such as search) we can analyze and track processes by measuring indicators that were not measurable before. The focus is on identifying patterns across a large number of consumers rather than what a particular individual "Likes". The potential impact for retailers is huge. It opens the opportunity to monitor changes in consumer preference and plan the business accordingly. And you can do this almost "real time" rather than through infrequent surveys that provide a "rear view" picture of your consumer behaviour. For instance, you could envision identifying when a particular set of fashion styles is breaking out from the pack, and committing a re-buy. Or you could monitor when the preference for a specific mobile device has declined and hence markdowns should be considered; or how demand for a specific ready-made food typically flows across regions, and manage the inventory accordingly. Search, blogging, website and store data may need to be considered in identifying these trends. The data volumes involved are huge (check Andrea Morgan's recent post on "Big Data" in retail), but so are the benefits. As Andrea says, for the first time we can start getting insight into "why" the business is performing in a certain way rather than just reporting on what is happening. And it is not just about the data volumes. Tackling the challenge also calls for integrated planning systems that can bring data and insight into the context of the decision-making process that Buyers, Merchandisers and Supply Chain managers follow. I strongly believe that only when data and process come together can you move from the anecdotal to systematically improving business performance. I would love to hear your opinions on these trends and where you think Retail is heading to exploit these topics - please email me: [email protected]

    Read the article

  • Oracle Customer Hub - Directions, Roadmap and Customer Success

    - by Mala Narasimharajan
    By Gurinder Bahl. With less than a week to go before OOW 2012, I would like to introduce you all to the core Oracle Customer MDM strategy sessions. Fragmentation of customer data across disparate systems prohibits companies from achieving a complete and accurate view of their customers. Oracle Customer Hub provides a comprehensive set of services, utilities and applications to create and maintain a trusted master customer system of record across the enterprise. Customer Hub centralizes customer data from disparate systems across your enterprise into a master repository. Existing systems are integrated in real time or via batch with the Hub, allowing you to leverage legacy platform investments while capitalizing on the benefits of a single customer identity. Don't miss out on two sessions geared towards Oracle Customer Hub:
    1) Attend session CON9747 - Turn Customer Data into an Enterprise Asset with Oracle Fusion Customer Hub Applications at Oracle Open World 2012 on Monday, Oct 1st, 10:45 AM - 11:45 AM @ Moscone West - 2008. Manouj Tahiliani, Sr. Director MDM Product Management, will provide insight into the vision of Oracle Fusion Customer Hub solutions and review the roadmap. You will discover how Fusion Customer MDM can help your enterprise improve data quality, create accurate and complete customer information, manage governance, and help create great customer experiences. You will also understand how to leverage data quality capabilities and create a sophisticated customer foundation within Oracle Fusion Applications. You will also hear Danette Patterson, Group Lead, Church Pension Group, talk about how Oracle Fusion Customer Hub applications provide a modern, next-generation, multi-domain foundation for managing customer information in a private cloud.
    2) Don't miss session CON9692 - Customer MDM is Key to Strategic Business Success and Customer Experience Management at Oracle Open World 2012 on Wednesday, October 3rd, 2012 from 3:30-4:30pm @ Westin San Francisco Metropolitan 1. JP Hurtado, Director, Customer Systems, will provide insight on how RCCL overcame challenges of data quality, guest recognition and a centralized customer view to provide a consolidated customer view to multiple reservation, CRM, marketing, service, sales, data warehouse and loyalty systems. You will learn how Royal Caribbean Cruise Lines (RCCL), which has over 30 million customers and maintains multiple brands, has leveraged Oracle Customer Hub (Siebel UCM) as the backbone of its customer data management strategy for the past 5 years. Gurinder Bahl from MDM Product Management will provide an update on the Oracle Customer Hub strategy, what we have achieved since the last Open World, and our future plans for the Oracle Customer Hub. You will learn about Customer Hub data quality capabilities around data analysis, cleansing, matching and address validation, as well as reporting and monitoring capabilities.
    The MDM track at Oracle Open World covers a variety of topics related to MDM. In addition to the product management team presenting product updates and the roadmap, we have several customer panels and conference sessions. You can see an overview of MDM sessions here. Looking forward to seeing you at Open World, the perfect opportunity to learn about cutting-edge Oracle technologies.

    Read the article

  • BI Publisher - Hottest Show in Vegas

    - by mike.donohue
    Two days down, two to go. Monday was a very busy and rewarding day. I attended "XML Publisher and FSG for Beginners", given by Susan Behn and Alyssa Johnson from Solution Beacon. It was packed, standing room only... even though it was at 8:00 am. Later in the afternoon, despite being at the same time and in conflict with other Publisher-related sessions, Noelle's session, "The Reporting Platform for Applications: Oracle Business Intelligence Publisher", and my session, "Introduction to Oracle Business Intelligence Publisher", were both very well attended. Immediately following our presentations we ran the BI Publisher Hands On Lab, which was great fun. The turnout was so large that unfortunately we could not accommodate everyone who came to the lab. There were as many as 5 people huddled around each of the 20 machines. All the groups completed the 2 main exercises. Some groups even took the product for an off-road test drive. For those who could not attend, or who want the Hands On Lab document: Hands On Lab Oracle BI Publisher Collaborate 2010.pdf. Note that these lab instructions assume a specific set-up and files that you may not have in your environment. You can download and install a trial license version of BI Publisher from the download page. I highly recommend taking a look at the additional tutorials available on OTN. Big thanks to Dan Vlamis and Jonathan Clark from Vlamis Software Solutions and to the Oracle BIWA SIG for setting up these machines and getting the time and space to run this lab. It was inspiring to see all of the attendees successfully creating reports. On Tuesday morning we were up early again for a rousing session on BI Publisher best practices that was also very well attended, especially considering the 8 am start. Later that morning saw Ben Bruno from STR Software and two of his customers speak on the additional functionality and ROI they have achieved by using Publisher within EBS and AventX to fax and email Publisher-generated documents. I spent the afternoon staffing the BI Technology demo pod and had a steady flow of people dropping by with questions. I'm having a great conference so far and looking forward to the rest of it.

    Read the article

  • NEC Corporation uPD720200 USB 3.0 controller doesn't run at full speed

    - by Radek Zyskowski
    I have a fresh install of Ubuntu 10.10 and an external HD on USB 3.0, which I am trying to connect via a PCI Express NEC controller. dmesg:

        [ 8966.820078] usb 6-3: new high speed USB device using xhci_hcd and address 0
        [ 8966.839831] xhci_hcd 0000:02:00.0: WARN: short transfer on control ep
        [ 8966.840580] xhci_hcd 0000:02:00.0: WARN: short transfer on control ep
        [ 8966.841329] xhci_hcd 0000:02:00.0: WARN: short transfer on control ep
        [ 8966.842079] xhci_hcd 0000:02:00.0: WARN: short transfer on control ep
        [ 8966.843343] scsi8 : usb-storage 6-3:1.0
        [ 8967.847144] scsi 8:0:0:0: Direct-Access SAMSUNG HD204UI 1AQ1 PQ: 0 ANSI: 5
        [ 8967.847589] sd 8:0:0:0: Attached scsi generic sg2 type 0
        [ 8967.847923] sd 8:0:0:0: [sdb] 3907029168 512-byte logical blocks: (2.00 TB/1.81 TiB)
        [ 8967.848341] xhci_hcd 0000:02:00.0: WARN: Stalled endpoint
        [ 8967.850959] sd 8:0:0:0: [sdb] Write Protect is off
        [ 8967.850963] sd 8:0:0:0: [sdb] Mode Sense: 23 00 00 00
        [ 8967.850966] sd 8:0:0:0: [sdb] Assuming drive cache: write through
        [ 8967.851818] xhci_hcd 0000:02:00.0: WARN: Stalled endpoint
        [ 8967.852365] sd 8:0:0:0: [sdb] Assuming drive cache: write through
        [ 8967.852370] sdb: sdb1
        [ 8967.871315] xhci_hcd 0000:02:00.0: WARN: Stalled endpoint
        [ 8967.871853] sd 8:0:0:0: [sdb] Assuming drive cache: write through
        [ 8967.871856] sd 8:0:0:0: [sdb] Attached SCSI disk
        [ 8967.950728] xhci_hcd 0000:02:00.0: WARN: Stalled endpoint
        [ 8967.951355] sd 8:0:0:0: [sdb] Sense Key : Recovered Error [current] [descriptor]
        [ 8967.951361] Descriptor sense data with sense descriptors (in hex):
        [ 8967.951363] 72 01 04 1d 00 00 00 0e 09 0c 00 00 00 00 00 00
        [ 8967.951375] 00 00 00 00 00 50
        [ 8967.951380] sd 8:0:0:0: [sdb] ASC=0x4 ASCQ=0x1d
        [ 8968.790076] xhci_hcd 0000:02:00.0: HC died; cleaning up
        [ 8968.790076] usb 6-3: USB disconnect, address 2
        [ 8999.008554] scsi 8:0:0:0: [sdb] Unhandled error code
        [ 8999.008558] scsi 8:0:0:0: [sdb] Result: hostbyte=DID_TIME_OUT driverbyte=DRIVER_OK
        [ 8999.008562] scsi 8:0:0:0: [sdb] CDB: Read(10): 28 00 74 70 97 39 00 00 3e 00
        [ 8999.008573] end_request: I/O error, dev sdb, sector 1953535801
        [ 8999.008578] Buffer I/O error on device sdb1, logical block 1953535738
        [ 8999.008582] Buffer I/O error on device sdb1, logical block 1953535739
        [ 8999.008585] Buffer I/O error on device sdb1, logical block 1953535740
        [ 8999.008589] Buffer I/O error on device sdb1, logical block 1953535741
        [ 8999.008592] Buffer I/O error on device sdb1, logical block 1953535742
        [ 8999.008595] Buffer I/O error on device sdb1, logical block 1953535743
        [ 8999.008600] Buffer I/O error on device sdb1, logical block 1953535744
        [ 8999.008603] Buffer I/O error on device sdb1, logical block 1953535745
        [ 8999.008606] Buffer I/O error on device sdb1, logical block 1953535746
        [ 8999.008609] Buffer I/O error on device sdb1, logical block 1953535747
        [ 8999.008642] scsi 8:0:0:0: rejecting I/O to offline device
        [ 8999.008747] scsi 8:0:0:0: [sdb] Unhandled error code
        [ 8999.008749] scsi 8:0:0:0: [sdb] Result: hostbyte=DID_NO_CONNECT driverbyte=DRIVER_OK
        [ 8999.008752] scsi 8:0:0:0: [sdb] CDB: Read(10): 28 00 74 70 97 77 00 00 3e 00
        [ 8999.008760] end_request: I/O error, dev sdb, sector 1953535863

    sudo lspci -v:

        02:00.0 USB Controller: NEC Corporation uPD720200 USB 3.0 Host Controller (rev 03) (prog-if 30)
            Physical Slot: 32
            Flags: bus master, fast devsel, latency 0, IRQ 16
            Memory at fe9fe000 (64-bit, non-prefetchable) [size=8K]
            Capabilities: [50] Power Management version 3
            Capabilities: [70] MSI: Enable- Count=1/8 Maskable- 64bit+
            Capabilities: [90] MSI-X: Enable- Count=8 Masked-
            Capabilities: [a0] Express Endpoint, MSI 00
            Capabilities: [100] Advanced Error Reporting
            Capabilities: [140] Device Serial Number ff-ff-ff-ff-ff-ff-ff-ff
            Capabilities: [150] #18
            Kernel driver in use: xhci_hcd
            Kernel modules: xhci-hcd

    If I put any USB 2.0 device into this controller, it works fine. But USB 3.0, nope. Any idea?
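    A few checks worth running before blaming the drive (a hedged diagnostic sketch; the PCI address 02:00.0 is taken from the lspci output above). Note the very first dmesg line: the disk enumerated as a "high speed" (USB 2.0) device rather than SuperSpeed, before the host controller died. Early firmware revisions of the uPD720200 were also widely reported to misbehave until updated by the board vendor:

        # Show the USB topology with negotiated speeds
        # (5000M = SuperSpeed; 480M means the device fell back to USB 2.0)
        lsusb -t

        # Check what PCIe link the controller trained at; the uPD720200
        # needs a 5 GT/s link for full USB 3.0 throughput
        sudo lspci -vv -s 02:00.0 | grep -i 'LnkCap\|LnkSta'

        # Watch the kernel log for further xhci_hcd resets while copying
        tail -f /var/log/kern.log | grep -i xhci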

    Read the article

  • Some of my favourite Visual Studio 2012 things&ndash;Teams

    - by Aaron Kowall
    Getting the decision right on when, and how many, team projects to create has always been a bit of a balancing act. On large initiatives, there are often teams who work toward a common system. These teams often have quite a bit of autonomy, but need to roll up to some higher-level initiative. In TFS 2010, people were often tempted to create separate Team Projects for each of the sub-teams and then do some magic with reporting and cross-team queries to get the consolidated view. My recommendation was always to use Areas as a means of separating work across the teams, but that resulted in a large number of queries that needed to be maintained and just seemed confusing. When doing anything, you had to remember to filter the query or view by Area in order to get correct results. Along with the awesome web access portal that comes in TFS 2012 (which I will cover the details of in another post), the product group has introduced the concept of Teams. A Team is a sub-group within a TFS 2012 Team Project which allows us to more easily divide work along team boundaries. Technically, a Team is defined by an Area Path and a TFS Group, both of which you could already set up manually in TFS 2010. However, by allowing for the creation of a 'Team' in TFS 2012, the web portal is able to do a bunch of 'magic' for us. We can view the project site (backlog, taskboard, etc.) for the team, we can assign items to the team, and we can view the burndown for the team. Basically, all the stuff that we had to prepare manually we now get created and managed for us, with a nice UI. When you create a Team Project in TFS 2012, a 'Default' team is created with the same name as the Team Project. So, if you only have 1 team working on the project, you are set. If you want to divide the work into additional teams, you can create teams by using the Team Web Client. Teams are created using the 'Administer Server' icon in the top right of the web site, and you can switch to a team's site by using the team chooser. Once you have selected a team, the Product Backlog, Taskboard, Burndown Charts, etc. are all filtered to that team. NOTE: You always have the ability to choose the 'Default' team to see items for the entire project. PS: It's been a long while since I shared on this blog. To help with that I'm in a blogging challenge with some other developer and agilist friends. Please check out their blogs as well: Steve Rogalsky: http://winnipegagilist.blogspot.ca Dylan Smith: http://www.geekswithblogs.net/optikal Tyler Doerkson: http://blog.tylerdoerksen.com David Alpert: http://www.spinthemoose.com Dave White: http://www.agileramblings.com Technorati Tags: TFS 2012,Agile,Team

    Read the article

  • SQLAuthority News – SQL Server 2012 Upgrade Technical Guide – A Comprehensive Whitepaper – (454 pages – 9 MB)

    - by pinaldave
    Microsoft has just released the SQL Server 2012 Upgrade Technical Guide. This guide is very comprehensive and covers the subject of upgrade in depth. It is indeed a helpful, detailed white paper; even writing a summary of it would take over 100 pages. This further proves that SQL Server 2012 is quite an important release from Microsoft. The white paper discusses how to upgrade from SQL Server 2008/R2 to SQL Server 2012. I love how it starts with the most interesting and basic discussion of upgrade strategies: 1) in-place upgrade, 2) side-by-side upgrade, 3) one-server, and 4) two-server. The whitepaper is not just pure theory but is also an excellent source of tips and tricks. Here is an example of a good tip from the paper: "If you want to upgrade just one database from a legacy instance of SQL Server and not upgrade the other databases on the server, use the side-by-side upgrade method instead of the in-place method." There are so many trivia, tips, and tricks that compiling a complete list seems humanly impossible in a short period of time. My friend Vinod Kumar, a SQL Server expert, has written a very interesting article on the SQL Server 2012 upgrade. In that article, Vinod addressed the most interesting and practical questions related to upgrades. He started with the fundamentals of how to take a backup before the upgrade and ended with fail-safe strategies after the upgrade is over. He covered end-to-end concepts in his blog posts in simple words and extremely precise statements. A successful upgrade uses a cycle of: planning, documenting the process, testing, refining the process, testing again, planning the upgrade window, execution, verification of the upgrade, and opening for business. If you are at Vinod's blog post, I suggest you go all the way down and collect the gold mine of most important links. I have bookmarked the blog by blogging about it, and I suggest you bookmark it as well in whatever way you prefer. Vinod Kumar's blog post on the SQL Server 2012 Upgrade Technical Guide. The SQL Server 2012 Upgrade Technical Guide is a detailed resource that's also available online for free. Each chapter was carefully crafted and explained in detail. Before downloading the guide, be aware of its size: 9 MB and 454 pages. Here's the list of chapters:
    Chapter 1: Upgrade Planning and Deployment
    Chapter 2: Management Tools
    Chapter 3: Relational Databases
    Chapter 4: High Availability
    Chapter 5: Database Security
    Chapter 6: Full-Text Search
    Chapter 7: Service Broker
    Chapter 8: SQL Server Express
    Chapter 9: SQL Server Data Tools
    Chapter 10: Transact-SQL Queries
    Chapter 11: Spatial Data
    Chapter 12: XML and XQuery
    Chapter 13: CLR
    Chapter 14: SQL Server Management Objects
    Chapter 15: Business Intelligence Tools
    Chapter 16: Analysis Services
    Chapter 17: Integration Services
    Chapter 18: Reporting Services
    Chapter 19: Data Mining
    Chapter 20: Other Microsoft Applications and Platforms
    Appendix 1: Version and Edition Upgrade Paths
    Appendix 2: SQL Server 2012: Upgrade Planning Checklist
    Download SQL Server 2012 Upgrade Technical Guide [454 pages and 9 MB] Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, DBA, PostADay, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, SQLAuthority News, SQLServer, T SQL, Technology
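    As a taste of the kind of post-upgrade verification involved, here is a hedged sketch (not drawn from the guide itself; the database name is illustrative): upgraded databases keep their old compatibility level until you raise it explicitly.

        -- List every database and the compatibility level it runs under
        SELECT name, compatibility_level
        FROM sys.databases;

        -- After testing, raise a migrated database to the
        -- SQL Server 2012 level (110); database name is illustrative
        ALTER DATABASE AdventureWorks
        SET COMPATIBILITY_LEVEL = 110;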

    Read the article

  • Oracle Enterprise Innovation Days

    - by Lara Ermacora
    This past November 10 and 11 was Oracle's appointment with innovation. Oracle Enterprise Innovation Days, now in its second edition, brought to Bologna all the companies that see innovation as the main lever for defending and strengthening their competitiveness. Within today's complex and heterogeneous landscape, there was extensive discussion of strategic approaches and possible solutions, and several significant experiences were presented as examples. Among the event's guests, Rajan Krishnan, Vice President, Applications Product Development and Product Management for EMEA, presented Oracle's applications strategy, opening the discussion on the main theme of the plenary session: Oracle Fusion Applications. His talk was immediately followed by Enrico Pagliarini, a journalist from Il Sole 24 Ore, who interviewed three different partner/customer pairs to explore the highly innovative projects their companies had collaborated on. They discussed Enel Servizi Srl, which with Accenture's help brought its Siebel Energy CRM solution to the current version 8.0 for better customer management within the highly competitive open market; Prysmian, which, following its acquisition of the Dutch company Draka, worked with Reply to remodel the group's statutory and management reporting process, creating a new application that meets the requirements of the newly emerging organization; and Kinexia and Waste Italia, formerly part of the Unendo group and now split between the renewables market and the waste disposal market respectively, which with Deloitte's help adopted a full-outsourcing JDE solution following a software selection among JDE, SAP and other Italian solutions. During the dinner, two other moments caught the attendees' attention: the presentation by Michele Stroligo, a very young Oracle Racing Design Team Member, and the Reference Customer Awards, honoring the customers who stood out as the best references in different markets with different products. The awards went to: FIAT, Enel, Boiron Laboratoires, Champion Europe, Mediaset and Coeclerici. The afternoon featured several deep-dive tracks tailored to different professional roles, closing with a presentation by Lieutenant Colonel Marco Lant of the Frecce Tricolori, an example of Italian excellence known throughout the world. The day ended with a gala dinner in the famous Palazzo Re Enzo, which towers over the city's main square. The morning of the second day was devoted entirely to exploring the topics of greatest interest through interactive roundtables and workshops run by Oracle partners. The event then concluded with a series of cultural activities for the attendees. The dedicated event site will be available shortly, with all the photos from the day and videos of the most notable talks; you will also be able to download all the presentations given during the sessions. Stay up to date on Oracle Enterprise Innovation Days 2011 by visiting the blog! (Embedded slides: "Strategie Applicative di Oracle" - Rajan Krishnan, Bologna, Nov 2011, from Oracle Apps - Italia.)

    Read the article

  • SQL SERVER – Inviting Ideas for SQL in Sixty Seconds – 12/12/12

    - by pinaldave
    Today is 12/12/12 -- I am not sure I will ever write this kind of date again; maybe never. This opportunity comes once in a lifetime, when the date, month and year all show the same digits. December 12th is one of the most fantastic days in my personal life. Four years ago on this day I got married to my wife -- Nupur Dave. Here are photos of our wedding (Dec 12, 2008). Here is a very interesting photo of myself from earlier this year. It is not a photoshopped or modified photo; the only modification I have made is to add an arrow and a speech bubble. Every Wednesday I try to put out one SQL in Sixty Seconds video. The journey has been fantastic, and so far I have published a total of 35 SQL in Sixty Seconds videos. The goal of each video is to teach something in one minute. In our daily lives we are all very busy and hardly have time for anything, but no matter how busy we are, we all have one minute. Sometimes we wait a minute in elevators, at the escalator, at a coffee shop, or just waiting for our phone to reboot. Today is a fantastic day -- 12/12/12 -- so let me invite all of you to submit SQL in Sixty Seconds ideas. If I like your idea and create a sixty-second video from it, you will win surprise learning material from me. There are two very simple rules for the contest:
    - I should not have already recorded the tip.
    - The tip should be descriptive. Do not just suggest covering "Performance Tuning" or "How to Create Index" or "More of Reporting Services". The tip should have around 100 words of description explaining the SQL tip.
    The contest is open forever. The winner will be announced whenever I use the tip to create a video. If I use your tip, I will certainly mention in the blog post that it was inspired by your suggestion. Meanwhile, do not forget to subscribe to the YouTube channel. Here are my latest three videos from SQL in Sixty Seconds. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: About Me, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology, Video

    Read the article

  • Antenna Aligner Part 10: Updates and emails…

    - by Chris George
    Since my last post back in July I've not done huge amounts of work on my app, for two reasons. Firstly, no time! Secondly, I wanted to leave it out in the wild for a while and see what happened. Well, what happened? Over 1,300 users, that's what happened! This uptake is beyond my wildest expectations, and apart from a couple of issues that I'll mention in a minute, most of the feedback has been very positive indeed! I've had several emails giving me feedback and reporting issues, all of which I have made a point of replying to immediately. This act alone has met with favourable replies! One of the main issues was with the iPad. It turns out that my app is only accurate in portrait mode; turning it to landscape will offset the direction by ±90 degrees! Whoops! I think I've fixed this by disabling the orientation switching, but I have not yet had an iPad to test it on. I had several emails from iPod Touch users claiming the app did not work for them. Specifically, the compass view did not work. On investigation, it turns out that the iPod Touch does not have the compass hardware required for this. Unfortunately there is no way to exclude iPod Touches from the list of supported devices, so I've just had to make it very clear in the iTunes description that the device is not fully supported. You can still get the list of transmitters, but you then have to use a real compass to get the bearing. But that's not the end of the world. Several customers have requested that the aerial polarisation be displayed in the app. I was already working on this, and the data was already there; it was just a case of displaying it in the UI. I have a solution now, and this will be in the next release. Of course, with the digital switchover in full swing across the UK, there has been one set of data updates (in 1.0.3), and another is due shortly. This reflects the transmitters as they switch over to digital fully and their power output is increased. So all in all I'm very pleased with the feedback I've had, and I'm looking to get the next release out there by early December (allowing for the 2-3 week Apple approval lag!)

    Read the article

  • Analysis Services Tabular books #ssas #tabular

    - by Marco Russo (SQLBI)
    Many people are looking for books about Analysis Services Tabular. Today there are two books available, and they complement each other: Microsoft SQL Server 2012 Analysis Services: The BISM Tabular Model by Marco Russo, Alberto Ferrari and Chris Webb, and Applied Microsoft SQL Server 2012 Analysis Services: Tabular Modeling by Teo Lachev. The book I wrote with Alberto and Chris is a complete guide to creating tabular models and has good coverage of DAX, including how to use it to enrich a semantic model with calculated columns and measures, and how to use it to query a Tabular model. In my experience, DAX as a query language is a very interesting option for custom analytical applications that require a fast calculation engine, or simply for standard reports running in Reporting Services and accessing a Tabular model. You can freely preview the table of contents and read some excerpts from the book on Safari Books Online. The book is in printing and should ship by mid-July, so it will very soon be on the shelves of all the people who have already preordered it! Teo Lachev's book covers the full spectrum of Tabular models provided by Microsoft: starting with self-service BI, you have users creating a model with PowerPivot for Excel, publishing it to PowerPivot for SharePoint, and exploring data by using Power View; then, the PowerPivot for Excel model can be imported into a Tabular model and published in Analysis Services, adding more control over the model through row-level security and partitioning, for example. Teo's book follows a step-by-step approach describing each feature, which is very good for a beginner who is new to PowerPivot and/or to BISM Tabular. If you need to get the big picture and start using the products that are part of the new Microsoft wave of BI products, Teo's book is for you. After you read Teo's book, or if you already have a certain confidence with PowerPivot or BISM Tabular and want to go deeper into internals, best practices and design patterns in BISM Tabular specifically, then our book is a suggested read: it contains several chapters about DAX, includes discussions of new opportunities in data model design offered by Tabular models, and also provides examples of optimizations you can obtain in DAX and best practices in data modeling and queries. It might seem strange that an author writes a review of a book that might seem to compete with his own, but in reality these two books complement each other and are not alternatives. If you have any doubt, buy both: you will not be disappointed! Moreover, Amazon usually offers you a deal to buy three books, including Visualizing Data with Microsoft Power View, another good choice for getting all the details about Power View.
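    For readers new to the idea of DAX as a query language: a minimal sketch of the style of query both books discuss, where the table and column names (Sales, 'Date') are purely illustrative. Because EVALUATE returns a table, a query like this can feed a Reporting Services dataset directly:

        // A minimal sketch; table and column names are illustrative.
        EVALUATE
        SUMMARIZE (
            Sales,
            'Date'[Calendar Year],
            "Sales Amount", SUM ( Sales[Amount] )
        )
        ORDER BY 'Date'[Calendar Year]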

    Read the article

< Previous Page | 100 101 102 103 104 105 106 107 108 109 110 111  | Next Page >