Search Results

Search found 41071 results on 1643 pages for 'post update'.


  • IDC Analyst Mike Fauscette Writes About Oracle And The Cloud

    - by Roxana Babiciu
    "It's becoming clear that cloud is now a core part of Oracle's strategy," says analyst Michael Fauscette in his post-OpenWorld article in Seeking Alpha. He believes we have a well-rounded portfolio "with a cloud platform/infrastructure, a broad selection of apps, and a partner marketplace." From his numerous conversations with customers, he highlights their continual interest in hybrid deployments and also in shifting apps to the cloud. Read more.

    Read the article

  • How to downgrade Thunderbird 12.04 to Thunderbird 11.01.1 on Ubuntu 10.04.4 LTS

    - by John
    On several Ubuntu 10.04.4 LTS systems I've recently upgraded from Thunderbird 11.01.1 to 12.04 via "apt-get update/upgrade". Now my T-Bird menus drop off unexpectedly, clicking on "Get Mail" sometimes fails, and printing is also intermittent. I did not have these intermittent problems before upgrading. How do I revert Thunderbird to the previous release until the developers fix these issues?

    Read the article

  • Creating A SharePoint Parent/Child List Relationship – SharePoint 2010 Edition

    - by Mark Rackley
    Hey blog readers… It has been almost 2 years since I posted my most read blog on creating a Parent/Child list relationship in SharePoint 2007: Creating a SharePoint List Parent / Child Relationship - Out of the Box And then a year ago I improved on my method and redid the blog post… still for SharePoint 2007: Creating a SharePoint List Parent/Child Relationship – VIDEO REMIX Since then many of you have been asking me how to get this to work in SharePoint 2010, and frankly I have just not had time to look into it. I wish I could have jumped into this sooner, but have just recently began to look at it. Well.. after all this time I have actually come up with two solutions that work, neither of them are as clean as I’d like them to be, but I wanted to get something in your hands that you can start using today. Hopefully in the coming weeks and months I’ll be able to improve upon this further and give you guys some better options. For the most part, the process is identical to the 2007 process, but you have probably found out that the list view web parts in 2010 behave differently, and getting the Parent ID to your new child form can be a pain in the rear (at least that’s what I’ve discovered). Anyway, like I said, I have found a couple of solutions that work. If you know of a better one, please let us know as it bugs me that this not as eloquent as my 2007 implementation. Getting on the same page First thing I’d recommend is recreating this blog: Creating a SharePoint List Parent/Child Relationship – VIDEO REMIX in SharePoint 2010… There are some vague differences, but it’s basically the same…  Here’s a quick video of me doing this in SP 2010: Creating Lists necessary for this blog post Now that you have the lists created, lets set up the New Time form to use a QueryString variable to populate the Parent ID field: Creating parameters in Child’s new item form to set parent ID Did I talk fast enough through both of those videos? Hopefully by now that stuff is old hat to you, but I wanted to make sure everyone could get on the same page.  Okay… let’s get started. Solution 1 – XSLTListView with Javascript This solution is the more elegant of the two, however it does require the use of a little javascript.  The other solution does not use javascript, but it also doesn’t use the pretty new SP 2010 pop-ups.  I’ll let you decide which you like better. The basic steps of this solution are: Inserted a Related Item View Insert a ContentEditorWebPart Insert script in ContentEditorWebPart that pulls the ID from the Query string and calls the method to insert a new item on the child entry form Hide the toolbar from data view to remove “add new item” link. Again, you don’t HAVE to use a CEWP, you could just put the javascript directly in the page using SPD.  
Anyway, here is how I did it: Using Related Item View / JavaScript Here’s the JavaScript I used in my Content Editor Web Part: <script type="text/javascript"> function NewTime() { // Get the Query String values and split them out into the vals array var vals = new Object(); var qs = location.search.substring(1, location.search.length); var args = qs.split("&"); for (var i=0; i < args.length; i++) { var nameVal = args[i].split("="); var temp = unescape(nameVal[1]).split('+'); nameVal[1] = temp.join(' '); vals[nameVal[0]] = nameVal[1]; } var issueID = vals["ID"]; //use this to bring up the pretty pop up NewItem2(event,"http://sp2010dev:1234/Lists/Time/NewForm.aspx?IssueID=" + issueID); //use this to open a new window //window.location="http://sp2010dev:1234/Lists/Time/NewForm.aspx?IssueID=" + issueID; } </script> Solution 2 – DataFormWebPart and exact same 2007 Process This solution is a little more of a hack, but it also MUCH more close to the process we did in SP 2007. So, if you don’t mind not having the pretty pop-up and prefer the comforts of what you are used to, you can give this one a try.  The basics steps are: Insert a DataFormWebPart instead of the List Data View Create a Parameter on DataFormWebPart to store “ID” Query String Variable Filter DataFormWebPart using Parameter Insert a link at bottom of DataForm Web part that points to the Child’s new item form and passes in the Parent Id using the Parameter. See.. like I told you, exact same process as in 2007 (except using the DataFormWeb Part). The DataFormWebPart also requires a lot more work to make it look “pretty” but it’s just table rows and cells, and can be configured pretty painlessly.  Here is that video: Using DataForm Web Part One quick update… if you change the link in this solution from: <tr> <td><a href="http://sp2010dev:1234/Lists/Time/NewForm.aspx?IssueID={$IssueIDParam}">Click here to create new item...</a> </td> </tr> to: <tr> <td> <a href="javascript:NewItem2(event,'http://sp2010dev:1234/Lists/Time/NewForm.aspx?IssueID={$IssueIDParam}');">Click here to create new item...</a> </td> </tr> It will open up in the pretty pop up and act the same as solution one… So… both Solutions will now behave the same to the end user. Just depends on which you want to implement. That’s all for now… Remember in both solutions when you have them working, you can make the “IssueID” invisible to users by using the “ms-hidden” class (it’s my previous blog post on the subject up there). That’s basically all there is to it! No pithy or witty closing this time… I am sorry it took me so long to dive into this and I hope your questions are answered. As I become more polished myself I will try to come up with a cleaner solution that will make everyone happy… As always, thanks for taking the time to stop by.

    Read the article

  • ASP.NET Frameworks and Raw Throughput Performance

    - by Rick Strahl
    A few days ago I had a curious thought: With all these different technologies that the ASP.NET stack has to offer, what's the most efficient technology overall to return data for a server request? When I started this it was mere curiosity rather than a real practical need or result. Different tools are used for different problems and so performance differences are to be expected. But still I was curious to see how the various technologies performed relative to each just for raw throughput of the request getting to the endpoint and back out to the client with as little processing in the actual endpoint logic as possible (aka Hello World!). I want to clarify that this is merely an informal test for my own curiosity and I'm sharing the results and process here because I thought it was interesting. It's been a long while since I've done any sort of perf testing on ASP.NET, mainly because I've not had extremely heavy load requirements and because overall ASP.NET performs very well even for fairly high loads so that often it's not that critical to test load performance. This post is not meant to make a point  or even come to a conclusion which tech is better, but just to act as a reference to help understand some of the differences in perf and give a starting point to play around with this yourself. I've included the code for this simple project, so you can play with it and maybe add a few additional tests for different things if you like. Source Code on GitHub I looked at this data for these technologies: ASP.NET Web API ASP.NET MVC WebForms ASP.NET WebPages ASMX AJAX Services  (couldn't get AJAX/JSON to run on IIS8 ) WCF Rest Raw ASP.NET HttpHandlers It's quite a mixed bag, of course and the technologies target different types of development. What started out as mere curiosity turned into a bit of a head scratcher as the results were sometimes surprising. What I describe here is more to satisfy my curiosity more than anything and I thought it interesting enough to discuss on the blog :-) First test: Raw Throughput The first thing I did is test raw throughput for the various technologies. This is the least practical test of course since you're unlikely to ever create the equivalent of a 'Hello World' request in a real life application. The idea here is to measure how much time a 'NOP' request takes to return data to the client. So for this request I create the simplest Hello World request that I could come up for each tech. Http Handler The first is the lowest level approach which is an HTTP handler. public class Handler : IHttpHandler { public void ProcessRequest(HttpContext context) { context.Response.ContentType = "text/plain"; context.Response.Write("Hello World. Time is: " + DateTime.Now.ToString()); } public bool IsReusable { get { return true; } } } WebForms Next I added a couple of ASPX pages - one using CodeBehind and one using only a markup page. The CodeBehind page simple does this in CodeBehind without any markup in the ASPX page: public partial class HelloWorld_CodeBehind : System.Web.UI.Page { protected void Page_Load(object sender, EventArgs e) { Response.Write("Hello World. Time is: " + DateTime.Now.ToString() ); Response.End(); } } while the Markup page only contains some static output via an expression:<%@ Page Language="C#" AutoEventWireup="false" CodeBehind="HelloWorld_Markup.aspx.cs" Inherits="AspNetFrameworksPerformance.HelloWorld_Markup" %> Hello World. Time is <%= DateTime.Now %> ASP.NET WebPages WebPages is the freestanding Razor implementation of ASP.NET. 
Here's the simple HelloWorld.cshtml page:Hello World @DateTime.Now WCF REST WCF REST was the token REST implementation for ASP.NET before WebAPI and the inbetween step from ASP.NET AJAX. I'd like to forget that this technology was ever considered for production use, but I'll include it here. Here's an OperationContract class: [ServiceContract(Namespace = "")] [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)] public class WcfService { [OperationContract] [WebGet] public Stream HelloWorld() { var data = Encoding.Unicode.GetBytes("Hello World" + DateTime.Now.ToString()); var ms = new MemoryStream(data); // Add your operation implementation here return ms; } } WCF REST can return arbitrary results by returning a Stream object and a content type. The code above turns the string result into a stream and returns that back to the client. ASP.NET AJAX (ASMX Services) I also wanted to test ASP.NET AJAX services because prior to WebAPI this is probably still the most widely used AJAX technology for the ASP.NET stack today. Unfortunately I was completely unable to get this running on my Windows 8 machine. Visual Studio 2012  removed adding of ASP.NET AJAX services, and when I tried to manually add the service and configure the script handler references it simply did not work - I always got a SOAP response for GET and POST operations. No matter what I tried I always ended up getting XML results even when explicitly adding the ScriptHandler. So, I didn't test this (but the code is there - you might be able to test this on a Windows 7 box). ASP.NET MVC Next up is probably the most popular ASP.NET technology at the moment: MVC. Here's the small controller: public class MvcPerformanceController : Controller { public ActionResult Index() { return View(); } public ActionResult HelloWorldCode() { return new ContentResult() { Content = "Hello World. Time is: " + DateTime.Now.ToString() }; } } ASP.NET WebAPI Next up is WebAPI which looks kind of similar to MVC. Except here I have to use a StringContent result to return the response: public class WebApiPerformanceController : ApiController { [HttpGet] public HttpResponseMessage HelloWorldCode() { return new HttpResponseMessage() { Content = new StringContent("Hello World. Time is: " + DateTime.Now.ToString(), Encoding.UTF8, "text/plain") }; } } Testing Take a minute to think about each of the technologies… and take a guess which you think is most efficient in raw throughput. The fastest should be pretty obvious, but the others - maybe not so much. The testing I did is pretty informal since it was mainly to satisfy my curiosity - here's how I did this: I used Apache Bench (ab.exe) from a full Apache HTTP installation to run and log the test results of hitting the server. ab.exe is a small executable that lets you hit a URL repeatedly and provides counter information about the number of requests, requests per second etc. ab.exe and the batch file are located in the \LoadTests folder of the project. An ab.exe command line  looks like this: ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorld which hits the specified URL 100,000 times with a load factor of 20 concurrent requests. This results in output like this:   It's a great way to get a quick and dirty performance summary. Run it a few times to make sure there's not a large amount of varience. You might also want to do an IISRESET to clear the Web Server. 
Just make sure you do a short test run to warm up the server first - otherwise your first run is likely to be skewed downwards. ab.exe also allows you to specify headers and provide POST data and many other things if you want to get a little more fancy. Here all tests are GET requests to keep it simple. I ran each test: 100,000 iterations Load factor of 20 concurrent connections IISReset before starting A short warm up run for API and MVC to make sure startup cost is mitigated Here is the batch file I used for the test: IISRESET REM make sure you add REM C:\Program Files (x86)\Apache Software Foundation\Apache2.2\bin REM to your path so ab.exe can be found REM Warm up ab.exe -n100 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldJsonab.exe -n100 -c20 http://localhost/aspnetperf/api/HelloWorldJson ab.exe -n100 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorld ab.exe -n100000 -c20 http://localhost/aspnetperf/handler.ashx > handler.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/HelloWorld_CodeBehind.aspx > AspxCodeBehind.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/HelloWorld_Markup.aspx > AspxMarkup.txt ab.exe -n100000 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorld > Wcf.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldCode > Mvc.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorld > WebApi.txt I ran each of these tests 3 times and took the average score for Requests/second, with the machine otherwise idle. I did see a bit of variance when running many tests but the values used here are the medians. Part of this has to do with the fact I ran the tests on my local machine - result would probably more consistent running the load test on a separate machine hitting across the network. I ran these tests locally on my laptop which is a Dell XPS with quad core Sandibridge I7-2720QM @ 2.20ghz and a fast SSD drive on Windows 8. CPU load during tests ran to about 70% max across all 4 cores (IOW, it wasn't overloading the machine). Ideally you can try running these tests on a separate machine hitting the local machine. If I remember correctly IIS 7 and 8 on client OSs don't throttle so the performance here should be Results Ok, let's cut straight to the chase. Below are the results from the tests… It's not surprising that the handler was fastest. But it was a bit surprising to me that the next fastest was WebForms and especially Web Forms with markup over a CodeBehind page. WebPages also fared fairly well. MVC and WebAPI are a little slower and the slowest by far is WCF REST (which again I find surprising). As mentioned at the start the raw throughput tests are not overly practical as they don't test scripting performance for the HTML generation engines or serialization performances of the data engines. All it really does is give you an idea of the raw throughput for the technology from time of request to reaching the endpoint and returning minimal text data back to the client which indicates full round trip performance. But it's still interesting to see that Web Forms performs better in throughput than either MVC, WebAPI or WebPages. It'd be interesting to try this with a few pages that actually have some parsing logic on it, but that's beyond the scope of this throughput test. But what's also amazing about this test is the sheer amount of traffic that a laptop computer is handling. 
Even the slowest tech managed 5700 requests a second, which is one hell of a lot of requests if you extrapolate that out over a 24 hour period. Remember these are not static pages, but dynamic requests that are being served. Another test - JSON Data Service Results The second test I used a JSON result from several of the technologies. I didn't bother running WebForms and WebPages through this test since that doesn't make a ton of sense to return data from the them (OTOH, returning text from the APIs didn't make a ton of sense either :-) In these tests I have a small Person class that gets serialized and then returned to the client. The Person class looks like this: public class Person { public Person() { Id = 10; Name = "Rick"; Entered = DateTime.Now; } public int Id { get; set; } public string Name { get; set; } public DateTime Entered { get; set; } } Here are the updated handler classes that use Person: Handler public class Handler : IHttpHandler { public void ProcessRequest(HttpContext context) { var action = context.Request.QueryString["action"]; if (action == "json") JsonRequest(context); else TextRequest(context); } public void TextRequest(HttpContext context) { context.Response.ContentType = "text/plain"; context.Response.Write("Hello World. Time is: " + DateTime.Now.ToString()); } public void JsonRequest(HttpContext context) { var json = JsonConvert.SerializeObject(new Person(), Formatting.None); context.Response.ContentType = "application/json"; context.Response.Write(json); } public bool IsReusable { get { return true; } } } This code adds a little logic to check for a action query string and route the request to an optional JSON result method. To generate JSON, I'm using the same JSON.NET serializer (JsonConvert.SerializeObject) used in Web API to create the JSON response. WCF REST   [ServiceContract(Namespace = "")] [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)] public class WcfService { [OperationContract] [WebGet] public Stream HelloWorld() { var data = Encoding.Unicode.GetBytes("Hello World " + DateTime.Now.ToString()); var ms = new MemoryStream(data); // Add your operation implementation here return ms; } [OperationContract] [WebGet(ResponseFormat=WebMessageFormat.Json,BodyStyle=WebMessageBodyStyle.WrappedRequest)] public Person HelloWorldJson() { // Add your operation implementation here return new Person(); } } For WCF REST all I have to do is add a method with the Person result type.   ASP.NET MVC public class MvcPerformanceController : Controller { // // GET: /MvcPerformance/ public ActionResult Index() { return View(); } public ActionResult HelloWorldCode() { return new ContentResult() { Content = "Hello World. Time is: " + DateTime.Now.ToString() }; } public JsonResult HelloWorldJson() { return Json(new Person(), JsonRequestBehavior.AllowGet); } } For MVC all I have to do for a JSON response is return a JSON result. ASP.NET internally uses JavaScriptSerializer. ASP.NET WebAPI public class WebApiPerformanceController : ApiController { [HttpGet] public HttpResponseMessage HelloWorldCode() { return new HttpResponseMessage() { Content = new StringContent("Hello World. 
Time is: " + DateTime.Now.ToString(), Encoding.UTF8, "text/plain") }; } [HttpGet] public Person HelloWorldJson() { return new Person(); } [HttpGet] public HttpResponseMessage HelloWorldJson2() { var response = new HttpResponseMessage(HttpStatusCode.OK); response.Content = new ObjectContent<Person>(new Person(), GlobalConfiguration.Configuration.Formatters.JsonFormatter); return response; } } Testing and Results To run these data requests I used the following ab.exe commands:REM JSON RESPONSES ab.exe -n100000 -c20 http://localhost/aspnetperf/Handler.ashx?action=json > HandlerJson.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldJson > MvcJson.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorldJson > WebApiJson.txt ab.exe -n100000 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorldJson > WcfJson.txt The results from this test run are a bit interesting in that the WebAPI test improved performance significantly over returning plain string content. Here are the results:   The performance for each technology drops a little bit except for WebAPI which is up quite a bit! From this test it appears that WebAPI is actually significantly better performing returning a JSON response, rather than a plain string response. Snag with Apache Benchmark and 'Length Failures' I ran into a little snag with Apache Benchmark, which was reporting failures for my Web API requests when serializing. As the graph shows performance improved significantly from with JSON results from 5580 to 6530 or so which is a 15% improvement (while all others slowed down by 3-8%). However, I was skeptical at first because the WebAPI test reports showed a bunch of errors on about 10% of the requests. Check out this report: Notice the Failed Request count. What the hey? Is WebAPI failing on roughly 10% of requests when sending JSON? Turns out: No it's not! But it took some sleuthing to figure out why it reports these failures. At first I thought that Web API was failing, and so to make sure I re-ran the test with Fiddler attached and runiisning the ab.exe test by using the -X switch: ab.exe -n100 -c10 -X localhost:8888 http://localhost/aspnetperf/api/HelloWorldJson which showed that indeed all requests where returning proper HTTP 200 results with full content. However ab.exe was reporting the errors. After some closer inspection it turned out that the dates varying in size altered the response length in dynamic output. For example: these two results: {"Id":10,"Name":"Rick","Entered":"2012-09-04T10:57:24.841926-10:00"} {"Id":10,"Name":"Rick","Entered":"2012-09-04T10:57:24.8519262-10:00"} are different in length for the number which results in 68 and 69 bytes respectively. The same URL produces different result lengths which is what ab.exe reports. I didn't notice at first bit the same is happening when running the ASHX handler with JSON.NET result since it uses the same serializer that varies the milliseconds. Moral: You can typically ignore Length failures in Apache Benchmark and when in doubt check the actual output with Fiddler. Note that the other failure values are accurate though. Another interesting Side Note: Perf drops over Time As I was running these tests repeatedly I was finding that performance steadily dropped from a startup peak to a 10-15% lower stable level. IOW, with Web API I'd start out with around 6500 req/sec and in subsequent runs it keeps dropping until it would stabalize somewhere around 5900 req/sec occasionally jumping lower. 
    For these tests this is why I did the IISRESET and warm up for individual tests. This is a little puzzling. Looking at Process Monitor while the tests are running, memory very quickly levels out, as do handles and threads, on the first test run. On subsequent runs everything stays stable, but the performance starts going downwards. This applies to all the technologies - Handlers, Web Forms, MVC, Web API - and I'm curious to see if others test this and see similar results. Doing an IISRESET then resets everything and performance starts off at peak again… Summary: As I stated at the outset, these were informal tests to satiate my curiosity, not to prove that any technology is better or even faster than another. While there clearly are differences in performance, the differences (other than WCF REST, which was by far the slowest, and the raw handler, which was by far the highest) are relatively minor, so there is no need to feel that any one technology is a runaway standout in raw performance. Choosing a technology is about more than pure performance; it's also about suitability for the job and the ease of implementation. The strengths of each technology will make up for any minor performance difference we see in these tests. However, to me it's important to get an occasional reality check and compare where new technologies are heading. Oftentimes old stuff that's been optimized and designed for a time of less horsepower can utterly blow the doors off newer tech, and simple checks like this let you compare. Luckily we're seeing that much of the new stuff performs well even in V1.0, which is great. To me it was very interesting to see Web API perform relatively badly with plain string content, which originally led me to think that Web API might not be properly optimized just yet. For those who caught my Tweets late last week regarding WebAPI's slow responses: those tests were with string content, which is in fact considerably slower. Luckily, where it counts, with serialized JSON and XML, WebAPI actually performs better. But I do wonder what would make generic string content slower than serialized code? This stresses another point: don't take a single test as the final gospel and don't extrapolate out from a single set of tests. Certainly Twitter can make you feel like a fool when you post something immediate that hasn't been fleshed out a little more <blush>. Egg on my face. As a result I ended up screwing around with this for a few hours today to compare different scenarios. Well worth the time… I hope you found this useful, if not for the results, maybe for the process of quickly testing a few requests for performance and charting out a comparison. Now onwards with more serious stuff… Resources: Source Code on GitHub; Apache HTTP Server Project (ab.exe is part of the binary distribution). © Rick Strahl, West Wind Technologies, 2005-2012. Posted in ASP.NET, Web API.
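
    For readers who want to try the ASMX AJAX service that the author couldn't get running on IIS 8, a plain [ScriptService]-style endpoint generally looks like the sketch below. This is a generic, hypothetical example rather than the code from the article's GitHub project; the class name and namespace are placeholders, and the matching .asmx file would simply point its Class attribute at this type.

    using System;
    using System.Web.Script.Services;
    using System.Web.Services;

    // Minimal ASMX "script service" sketch: [ScriptService] lets the ASP.NET AJAX
    // script handler return JSON instead of SOAP for script-initiated calls.
    [WebService(Namespace = "http://tempuri.org/")]
    [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
    [ScriptService]
    public class HelloWorldService : WebService
    {
        // UseHttpGet = true allows the GET-based ab.exe runs used elsewhere in the post;
        // by default ASMX script methods only accept POST.
        [WebMethod]
        [ScriptMethod(ResponseFormat = ResponseFormat.Json, UseHttpGet = true)]
        public string HelloWorld()
        {
            return "Hello World. Time is: " + DateTime.Now.ToString();
        }
    }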

    Read the article

  • Large File Upload in SharePoint 2010

    This feed URL has been discontinued. Please update your reader's URL to: http://feeds.feedburner.com/winsmarts Read the full article.

    Read the article

  • Oracle Database Insider Now on LinkedIn

    - by Troy Kitch
    Our close friends over at the Oracle Database Insider blog have recently started a LinkedIn discussion group. Go behind the scenes of the latest Oracle Database announcements and discussions, including Oracle Database 11g and its options, such as Database Security, and the newest product, Oracle Exadata. Come on over to post a discussion topic or an event, ask questions, and stay up to date on the latest Oracle Database information. We'll be there to join the discussions and answer questions. Join us in Oracle Database Insider's new LinkedIn group!

    Read the article

  • SQL Monitor’s data repository

    - by Chris Lambrou
    As one of the developers of SQL Monitor, I often get requests passed on by our support people from customers who are looking to dip into SQL Monitor’s own data repository, in order to pull out bits of information that they’re interested in. Since there’s clearly interest out there in playing around directly with the data repository, I thought I’d write some blog posts to start to describe how it all works. The hardest part for me is knowing where to begin, since the schema of the data repository is pretty big. Hmmm… I guess it’s tricky for anyone to write anything but the most trivial of queries against the data repository without understanding the hierarchy of monitored objects, so perhaps my first post should start there. I always imagine that whenever a customer fires up SSMS and starts to explore their SQL Monitor data repository database, they become immediately bewildered by the schema – that was certainly my experience when I did so for the first time. The following query shows the number of different object types in the data repository schema: SELECT type_desc, COUNT(*) AS [count] FROM sys.objects GROUP BY type_desc ORDER BY type_desc;  type_desccount 1DEFAULT_CONSTRAINT63 2FOREIGN_KEY_CONSTRAINT181 3INTERNAL_TABLE3 4PRIMARY_KEY_CONSTRAINT190 5SERVICE_QUEUE3 6SQL_INLINE_TABLE_VALUED_FUNCTION381 7SQL_SCALAR_FUNCTION2 8SQL_STORED_PROCEDURE100 9SYSTEM_TABLE41 10UNIQUE_CONSTRAINT54 11USER_TABLE193 12VIEW124 With 193 tables, 124 views, 100 stored procedures and 381 table valued functions, that’s quite a hefty schema, and when you browse through it using SSMS, it can be a bit daunting at first. So, where to begin? Well, let’s narrow things down a bit and only look at the tables belonging to the data schema. That’s where all of the collected monitoring data is stored by SQL Monitor. The following query gives us the names of those tables: SELECT sch.name + '.' + obj.name AS [name] FROM sys.objects obj JOIN sys.schemas sch ON sch.schema_id = obj.schema_id WHERE obj.type_desc = 'USER_TABLE' AND sch.name = 'data' ORDER BY sch.name, obj.name; This query still returns 110 tables. I won’t show them all here, but let’s have a look at the first few of them:  name 1data.Cluster_Keys 2data.Cluster_Machine_ClockSkew_UnstableSamples 3data.Cluster_Machine_Cluster_StableSamples 4data.Cluster_Machine_Keys 5data.Cluster_Machine_LogicalDisk_Capacity_StableSamples 6data.Cluster_Machine_LogicalDisk_Keys 7data.Cluster_Machine_LogicalDisk_Sightings 8data.Cluster_Machine_LogicalDisk_UnstableSamples 9data.Cluster_Machine_LogicalDisk_Volume_StableSamples 10data.Cluster_Machine_Memory_Capacity_StableSamples 11data.Cluster_Machine_Memory_UnstableSamples 12data.Cluster_Machine_Network_Capacity_StableSamples 13data.Cluster_Machine_Network_Keys 14data.Cluster_Machine_Network_Sightings 15data.Cluster_Machine_Network_UnstableSamples 16data.Cluster_Machine_OperatingSystem_StableSamples 17data.Cluster_Machine_Ping_UnstableSamples 18data.Cluster_Machine_Process_Instances 19data.Cluster_Machine_Process_Keys 20data.Cluster_Machine_Process_Owner_Instances 21data.Cluster_Machine_Process_Sightings 22data.Cluster_Machine_Process_UnstableSamples 23… There are two things I want to draw your attention to: The table names describe a hierarchy of the different types of object that are monitored by SQL Monitor (e.g. clusters, machines and disks). For each object type in the hierarchy, there are multiple tables, ending in the suffixes _Keys, _Sightings, _StableSamples and _UnstableSamples. 
Not every object type has a table for every suffix, but the _Keys suffix is especially important and a _Keys table does indeed exist for every object type. In fact, if we limit the query to return only those tables ending in _Keys, we reveal the full object hierarchy: SELECT sch.name + '.' + obj.name AS [name] FROM sys.objects obj JOIN sys.schemas sch ON sch.schema_id = obj.schema_id WHERE obj.type_desc = 'USER_TABLE' AND sch.name = 'data' AND obj.name LIKE '%_Keys' ORDER BY sch.name, obj.name;  name 1data.Cluster_Keys 2data.Cluster_Machine_Keys 3data.Cluster_Machine_LogicalDisk_Keys 4data.Cluster_Machine_Network_Keys 5data.Cluster_Machine_Process_Keys 6data.Cluster_Machine_Services_Keys 7data.Cluster_ResourceGroup_Keys 8data.Cluster_ResourceGroup_Resource_Keys 9data.Cluster_SqlServer_Agent_Job_History_Keys 10data.Cluster_SqlServer_Agent_Job_Keys 11data.Cluster_SqlServer_Database_BackupType_Backup_Keys 12data.Cluster_SqlServer_Database_BackupType_Keys 13data.Cluster_SqlServer_Database_CustomMetric_Keys 14data.Cluster_SqlServer_Database_File_Keys 15data.Cluster_SqlServer_Database_Keys 16data.Cluster_SqlServer_Database_Table_Index_Keys 17data.Cluster_SqlServer_Database_Table_Keys 18data.Cluster_SqlServer_Error_Keys 19data.Cluster_SqlServer_Keys 20data.Cluster_SqlServer_Services_Keys 21data.Cluster_SqlServer_SqlProcess_Keys 22data.Cluster_SqlServer_TopQueries_Keys 23data.Cluster_SqlServer_Trace_Keys 24data.Group_Keys The full object type hierarchy looks like this: Cluster Machine LogicalDisk Network Process Services ResourceGroup Resource SqlServer Agent Job History Database BackupType Backup CustomMetric File Table Index Error Services SqlProcess TopQueries Trace Group Okay, but what about the individual objects themselves represented at each level in this hierarchy? Well that’s what the _Keys tables are for. This is probably best illustrated by way of a simple example – how can I query my own data repository to find the databases on my own PC for which monitoring data has been collected? Like this: SELECT clstr._Name AS cluster_name, srvr._Name AS instance_name, db._Name AS database_name FROM data.Cluster_SqlServer_Database_Keys db JOIN data.Cluster_SqlServer_Keys srvr ON db.ParentId = srvr.Id -- Note here how the parent of a Database is a Server JOIN data.Cluster_Keys clstr ON srvr.ParentId = clstr.Id -- Note here how the parent of a Server is a Cluster WHERE clstr._Name = 'dev-chrisl2' -- This is the hostname of my own PC ORDER BY clstr._Name, srvr._Name, db._Name;  cluster_nameinstance_namedatabase_name 1dev-chrisl2SqlMonitorData 2dev-chrisl2master 3dev-chrisl2model 4dev-chrisl2msdb 5dev-chrisl2mssqlsystemresource 6dev-chrisl2tempdb 7dev-chrisl2sql2005SqlMonitorData 8dev-chrisl2sql2005TestDatabase 9dev-chrisl2sql2005master 10dev-chrisl2sql2005model 11dev-chrisl2sql2005msdb 12dev-chrisl2sql2005mssqlsystemresource 13dev-chrisl2sql2005tempdb 14dev-chrisl2sql2008SqlMonitorData 15dev-chrisl2sql2008master 16dev-chrisl2sql2008model 17dev-chrisl2sql2008msdb 18dev-chrisl2sql2008mssqlsystemresource 19dev-chrisl2sql2008tempdb These results show that I have three SQL Server instances on my machine (a default instance, one named sql2005 and one named sql2008), and each instance has the usual set of system databases, along with a database named SqlMonitorData. Basically, this is where I test SQL Monitor on different versions of SQL Server, when I’m developing. There are a few important things we can learn from this query: Each _Keys table has a column named Id. This is the primary key. 
Each _Keys table has a column named ParentId. A foreign key relationship is defined between each _Keys table and its parent _Keys table in the hierarchy. There are two exceptions to this, Cluster_Keys and Group_Keys, because clusters and groups live at the root level of the object hierarchy. Each _Keys table has a column named _Name. This is used to uniquely identify objects in the table within the scope of the same shared parent object. Actually, that last item isn’t always true. In some cases, the _Name column is actually called something else. For example, the data.Cluster_Machine_Services_Keys table has a column named _ServiceName instead of _Name (sorry for the inconsistency). In other cases, a name isn’t sufficient to uniquely identify an object. For example, right now my PC has multiple processes running, all sharing the same name, Chrome (one for each tab open in my web-browser). In such cases, multiple columns are used to uniquely identify an object within the scope of the same shared parent object. Well, that’s it for now. I’ve given you enough information for you to explore the _Keys tables to see how objects are stored in your own data repositories. In a future post, I’ll try to explain how monitoring data is stored for each object, using the _StableSamples and _UnstableSamples tables. If you have any questions about this post, or suggestions for future posts, just submit them in the comments section below.
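
    As a small usage illustration (not part of the original post), the hierarchy query above can just as easily be run from client code. The C# sketch below uses plain ADO.NET; the connection string is an assumption, and the database name SqlMonitorData is taken from the example results above - point it at your own data repository.

    using System;
    using System.Data.SqlClient;

    class ListMonitoredDatabases
    {
        static void Main()
        {
            // Assumed connection details - adjust the server and database name to your repository.
            const string connectionString =
                "Server=localhost;Database=SqlMonitorData;Integrated Security=true";

            // Same hierarchy walk as in the post: Database -> SqlServer -> Cluster via ParentId.
            const string sql = @"
                SELECT clstr._Name AS cluster_name, srvr._Name AS instance_name, db._Name AS database_name
                FROM data.Cluster_SqlServer_Database_Keys db
                JOIN data.Cluster_SqlServer_Keys srvr ON db.ParentId = srvr.Id
                JOIN data.Cluster_Keys clstr ON srvr.ParentId = clstr.Id
                ORDER BY clstr._Name, srvr._Name, db._Name;";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine("{0}\\{1}  {2}",
                            reader["cluster_name"], reader["instance_name"], reader["database_name"]);
                    }
                }
            }
        }
    }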

    Read the article

  • Adding Descriptive Flex Field (DFF) through OAF Personalization

    - by Manoj Madhusoodanan
    In this blog I will explain how to add a DFF to an existing OAF page through personalization. I am using the Supplier Quick Update page ( /oracle/apps/pos/supplier/webui/SuppSummPG ). If you want to see how to create a DFF please click here. In this scenario I am using a custom DFF. Following are the details:
    Application -> Payables ( Code: SQLAP )
    Name -> XXCUST_SUPPLIER_DFF
    Title -> XXCUST - Supplier DFF
    Table Name -> AP_SUPPLIERS
    DFV View Name -> XXCUST_SUPPLIER_DFV
    Reference Fields -> ATTRIBUTE_CATEGORY
    Following are the Context Field details:
    Prompt -> Supplier Type
    Value Set -> XXCUST_SUP_TYPE ( Values: External and Internal )
    Reference Field -> ATTRIBUTE_CATEGORY
    The table below shows the segment details of XXCUST_SUPPLIER_DFF:
    Code: Global Data Elements | Segment: Identification Number | Column: ATTRIBUTE1 | Value Set: 15 Characters
    Code: External | Segment: Type | Column: ATTRIBUTE2 | Value Set: XXCUST_EXT_SUP_TYPE ( Values: Domestic, International )
    Code: Internal | Segment: Department | Column: ATTRIBUTE2 | Value Set: 15 Characters
    Perform the following steps to create the flex item in the Quick Update page: 1) Click on Personalize Page. In the Personalize Page, click on Complete View. 2) Click on Create Item ( based on where you want to place the DFF, choose the appropriate layout ). 3) Create the flex item with the following details. 4) If you want to arrange the item in the page, click on Reorder. Following is the output.

    Read the article

  • SQLAuthority News – Reliving TechEd with Vinod Kumar at Bangalore User Groups

    - by pinaldave
    TechEd India 2012 was held in Bangalore March 21 to 23, 2012. Just like every year, this event was bigger, grander and inspiring. Here is my blog post reviewing the event: SQLAuthority News – #TechEdIn – TechEd India 2012 Memories and Photos. For me this is a family event – I get to meet my friends who are as dear as my family. I like to call User Groups family too. Family shares life's personal happiness and experiences – in the same way a User Group shares professional experiences, and quite often UG members become just like family members. When I learned that fellow user groups were building up a unique event together, I was pretty excited to learn who was going to be the speaker for the event: BDotNet.in – Bangalore .NET Usergroup; BITPro.in – Bangalore ITPro Usergroup. It was indeed a joy when I learned that the presenter will be Vinod Kumar, who is an integral part of user groups and a hardcore SQL Server enthusiast. Vinod Kumar is going to present the following two sessions, both focused on the internals of Windows and SQL Server. Understanding Windows with SysInternals Tools – This session will cover various tools covering memory usage, x86 architecture, x64, WOW mode, page faults, virtual memory mapping, OOM scenarios, the Perf tool, the PAL tool, Logman and more. Peeling the Onion: SQL Server Internals Demystified – This session will cover advanced disk formats, SQL Server 2012 security changes, memory changes, indirect checkpoint and more. I am very excited, as this time I will get the opportunity to sit in the front rows (as I will be reaching there to get the best possible position) and learn. I am looking forward to the event and I hope you will join us as well. Event Details: Date: Saturday, April 7, 2012 (10:30am until 1:30pm). Venue: Microsoft, Domlur, Bangalore. Event page: https://www.facebook.com/events/139444029517882/ This session is FREE for all, and everybody and anybody can walk in. Community Blog Posts – Here are a few of the blog posts written by the community on this subject: Vinod Kumar on Reliving #TechEdIn at Blr UG; Manas Dash on Reliving TechEd India 2012 with Vinod Kumar; Sudeepta Ganguly on SysInternals n SQLInternals with Vinod Kumar; Lohith on Re Live TechEd India 2012 with Vinod Kumar. Reference: Pinal Dave (http://blog.sqlauthority.com) http://www.youtube.com/watch?v=oRw-p4mahLU

    Read the article

  • Viewing the Future Through the ‘Eyes of the Past’ [Humorous Image]

    - by Asian Angel
    Really makes you feel nostalgic, eh? You can access the full-size version to get a better view of the upper right corner here. O_O This is a close approximation of the original title of the post. [via Reddit - Tech Support Gore]

    Read the article

  • CodePlex Daily Summary for Saturday, May 15, 2010

    CodePlex Daily Summary for Saturday, May 15, 2010New ProjectsBizTalk EDI Guidance: BizTalk EDI Guidance is intended to simplify the delivery of EDI solutions by leveraging the ESB Toolkit. This project is currently Alpha and sh...Continues Integration Sample: I'm providing a series of blog post to show a complete CI process using CruiseControl.Net and msbuild. The source code for this series is hosted here.DioM2D: My Dragons in our Midst RPG. Runs on my custom Starlight Engine.Ethical Hacking ASP.NET: Security tools and guidelines for white-hat hacking and protecting ASP.NET web applications.Farseer Engine with XNATouch: Farseer is great engine for game physics. This implementation uses XNATouch framework.Feature Builder Guidance Extensions: Feature Builder Guidance Extensions are Feature Extensions which extend the guidance for the Feature Building experience. Each FBGX will be suppli...Microsoft Office Document Security: MODS is a plugin for office 2007 thats includes Hash Encryption, Hex Convertion and more. Plugins: MODS For Word still working on (MODS for Excel ...Minimize Engine (XNA): The Minimize Engine is a basic 3D Games Engine created using XNA, with its primary focus around Grid Based games.MSForge TownCrier: This project is meant to build a notification and calling system for MSForge.net User Groups.NatureProtector: Silverlight 4 project.OutSync: OutSync is a free Windows desktop application that syncs photos of your Facebook friends with matching contacts in Microsoft Outlook. It allows you...Quick Save Images, Clipboard save to file, Quick save, bmp, png, jpeg, Image: ClipSa is a very small tool for very quick picture saving. You put some picture into the clipboard (PrintScrn/Alt-PrintScrn/Ctrl-C), ClipSa saves ...ResHelper Manager: Resource strings management tool that creates localization files for any type of localization target (asp.net, wpf and so on...)SecureCookieHttpModule: Secure your session cookie (and other session-based) cookies for replay attacks using this easy to use ASP.NET HttpModule.simpleChMS: A Church Management System (ChMS) designed for churches or ministries like youth groups that want to facilitate better care or theie membership. Fo...sMAPtool: -SPDomainObject: mapping strong type objects to sp listsSQL Trim: This project aims at developing a universal trim function for Microsoft SQL Server. It trims: 1) pre spaces 2) post spaces 3) double spaces 3) subs...TurretGunner: mt-experienceNew ReleasesBeanProxy: BeanProxy 3.0: BeanProxy is a C# (.NET 3.5) library housing classes that facilitates unit testing. Any non-static, public interface/class or abstract class can be...Blueset Studio Opensource Projects: 蓝色之风记事本 0.2 Alpha: 一个超级Bug版本……CSharp Intellisense: V2.1: - Bug fix (Pascal Casing)DioM2D: DioM2D0.01: http://www.dragonsinourmidst.com/forums/showthread.php?p=690058#post690058Ethical Hacking ASP.NET: Version 1.0.0.1: This is the initial release of the project. Read more about the available tests and features on the Documentation tab. You need the full .NET Frame...Event Scavenger: Collector service update - version 3.2.4: Added check if the database connection string is set up in the config file.Feature Builder Guidance Extensions: FBGX-Binaries: This release consists of a zip file containing all the VSIXs resulting from building each of the FBGX packages found here as source. 
This will mak...Floe IRC Client: Floe IRC Client 2010-05 R2: - Detaching windows (right click on the tabs to detach them) - Highlight lines with your nick or other patterns - Fixed several bugs - Tabs can now...Free language translator and file converter: Free Language Translator 1.96: Fixed some minor bugs and improved the UI a bit. If you can not install the msi file you might be missing some prerequisites. You can try running t...Geocache Downloader: release 1.0: This is the first release.kp.net: Alpha release is avalable: The goal of this alpha release is to try the code in some production scenarios and find out what features should be tuned.Live-Exchange Calendar Sync: Live-Exchange Calendar Sync: Live-Exchange Calendar Sync Beta May 14, 2010 release of Live-Exchange Calendar Sync 1.0 BETA. (Version 45334) Getting StartedInfo about installat...MAPILab Explorer for SharePoint: MAPILab Explorer for SharePoint ver 2.1.1: 1) Small bug fixed that appears on first start (when earliers versions wasn't installed). How to install:Download ZIP file and extract it on Sha...Microsoft Office Document Security: MODS 4 WORD (SOURCE INCLUDED): Includes Source CodeMoonyDesk (windows desktop widgets): MoonyDesk Alpha: MoonyDesk Alpha (some memory improvements)OnTopReplica: Release 2.9.3: Some bugfixes and improvements. Czech translation added (thanks René Mihula).OutSync: OutSync v1.0.100.0: OutSync v1.0.100.0 is the final release by Mel before the move to CodePlex. I have tested it on Windows 7 32bit and 64bit with Office 2007 and it ...Quick Save Images, Clipboard save to file, Quick save, bmp, png, jpeg, Image: Clipsa v 0.1: Download and extract to any place 2 files - clipSa.exe and clipSa.exe.config Run clipSa.exe. That's all.ResHelper Manager: ResHelperManager: List of changes applied to this version of ResHelper is included in main download zip package. Example sourcesIn Source Code tab are sources of De...Rx Contrib: V1.3: - Bug Fix - BufferWithTimeOrCount with flexible time period setting when ever the time period elapsed...SharePoint DVK Integration: SharePoint 2007 DVK integration v1.0.3: Fixes Fixed default field bindings. I rebound too many fields on every page load. Fixed extension replacing on creating target url (threw it out)...ShoutcastStast for DotNetNuke: DNN_ShoutcastStats alpha 05.00.495: First Alpha release of ShoutcastStats Module for DotNetNuke This first alpha version of the ShoutcastStats Module for DotNetNuke is still in devel...SilverPart 2.1: SilverPart 2.1: SilverPart 2.1 This interim release fixes some major bugs related to Firefox and anonymous access. - Fix for Issue ID 4005 - SilverPart does not w...sMAPtool: sMAPedit v0.7c (Base Release with Maps): Fixed: force a gargabe collection update to prevent pictureBox's memory leak Added: essential map pack with all basic maps in jpg format Added:...SQL Trim: Trim: Initial releaseSSIS Multiple Hash: Multiple Hash V1.2.1: This is version 1.2.1 of the Multiple Hash SSIS Component. It supports SQL 2005 and SQL 2008, although you have to download the correct install pa...StreamInsight Yahoo Finance input adapter example: StockTicker_v1_0_RTM: Updated for StreamInsight RTM.Update Controls .NET: 2.1.0.0: Automatic dependency management for WPF and Silverlight data binding. 
This release combines both the WPF and Silverlight assemblies into one insta...VCC: Latest build, v2.1.30514.0: Automatic drop of latest buildMost Popular ProjectsRawrWBFS ManagerAJAX Control ToolkitMicrosoft SQL Server Product Samples: DatabaseSilverlight ToolkitWindows Presentation Foundation (WPF)patterns & practices – Enterprise LibraryMicrosoft SQL Server Community & SamplesPHPExcelASP.NETMost Active Projectspatterns & practices – Enterprise LibraryMirror Testing SystemRawrPHPExcelBlogEngine.NETMicrosoft Biology FoundationCustomer Portal Accelerator for Microsoft Dynamics CRMWindows Azure Command-line Tools for PHP DevelopersShake - C# MakeStyleCop

    Read the article

  • Oracle's Cloud Computing Events

    - by Peeyush Tugnawat
    Here is a useful link to Oracle's full-day events on Cloud Computing worldwide: http://www.oracle.com/events/cloudcomputing/index.html Other Oracle Cloud Computing resources: Oracle's Cloud Computing Products and Services; Oracle's Cloud Computing Resource Center. Also see my previous post about Cloud Computing.

    Read the article

  • Bare Metal Restore Part 2

    - by GrumpyOldDBA
    I blogged previously about how Windows 2008 R2 has native "bare metal restore": http://sqlblogcasts.com/blogs/grumpyolddba/archive/2011/05/13/windows-2008-r2-bare-metal-restore.aspx - see the Core Team's blog post here: http://blogs.technet.com/b/askcore/archive/2011/05/12/bare-metal-restore.aspx Well, since then I've actually had the chance not only to put the process to the test but to see if I could go one step further. I have six identical IBM servers, part of...(read more)

    Read the article

  • 10 PowerShell One Liners

    - by BizTalk Visionary
    Here are a few one-liners that use NetCmdlets. Some of these I've blogged about before, some are new. Let me know if you have questions, which ones you find useful, or how you altered these to suit your own needs. Send email to a list of recipient addresses: import-csv users.csv | % { send-email -to $_.email -from [email protected] -subject "Important Email" –message "Hello World!" -server 10.0.1.1 } Show the access control list for a specific Exchange folder: get-imap -server $mymailserver -cred $mycred -folder INBOX.RESUMES –acl Add look and read permissions on an Exchange folder, for a list of accounts pulled from a CSV file: import-csv users.csv | % { set-imap -server -acluser $_.username $mymailserver -cred $mycred -folder INBOX.RESUMES –acl “lr”  } Sync system time with an Internet time server: get-time -server clock.psu.edu –set To remotely sync the time on a set of computers: import-csv computers.csv | % { Invoke-Command -computerName $_.computer -cred $mycred -scriptblock { get-time -server clock.psu.edu –set } } Delete all emails from an Exchange folder that match a certain criteria.  For example, delete all emails from [email protected]: get-imap -server $mailserver –cred $mycred | ? {$_.FromEmail -eq [email protected]} | %{ set-imap -server $mailserver –cred $mycred-message $_.Id -delete } Update Twitter status from PowerShell: get-http –url "http://twitter.com/statuses/update.xml" –cred $mycred -variablename status -variablevalue "Tweeting with NetCmdlets!" A test-path that works over FTP, FTPS (SSL), and SFTP (SSH) connections: get-ftp -server $remoteserver –cred $mycred -path /remote/path/to/checkfor* Don't forget the *.  Also, to use SSL or SSH just add an –ssl or –ssh parameter. List disabled user accounts in Active Directory (or any other LDAP server): get-ldap -server $ad -cred $mycred -dn dc=yourdc -searchscope wholesubtree     -search "(&(objectclass=user)(objectclass=person)(company=*)(userAccountControl:1.2.840.113556.1.4.803:=2))" List Active Directory groups and their members: get-ldap -server testman -cred $mycred -dn dc=NS2 -searchscope wholesubtree -search "(&(objectclass=group)(cn=*admin*))" | select ResultDN, member Display the last initialization time (e.g. last reboot time) of all discoverable SNMP agents on a network: import-csv computers.csv | % { get-snmp -agent $_.computer -oid sysUpTime.0 | %{([datetime]::Now).AddSeconds(-($_.OIDValue/100))} } Not mentioned here:  data conversion (Yenc, QP, UUencoding, MD5, SHA1, base64, etc), DNS, News Groups (NNTP/UseNet), POP mail, RSS feeds, Amazon S3, Syslog, TFTP, TraceRoute, SNMP Traps, UDP, WebDAV, whois, Rexec/Rshell/Telnet, Zip files, sending IMs (Jabber/GoogleTalk/XMPP), sending text messages and pages, ping, and more. Original Source: Lance's Textbox

    Read the article

  • Why Your Android Phone Isn’t Getting Operating System Updates and What You Can Do About It

    - by Chris Hoffman
    Several times a year, Google releases a new version of Android with new features and performance improvements. Unfortunately, most Android devices in the wild will never get the update. New Android users are often disappointed to discover that their shiny new smartphone won’t get any updates – or worse, that it was running old software from the moment they bought it. Image Credit: Johan Larsson on Flickr

    Read the article

  • How can I keep a folder synchronized to an external USB hard drive in Ubuntu?

    - by Cesar
    I have a growing music collection which I manually keep in sync with an external USB drive. Sometimes I edit the ID3 tags or add or delete a file on either the hard drive or the USB drive, and I would like to keep those changes synchronized between both. Does Ubuntu have something available that would help me with this scenario? Preferably something easy to use, with a UI. Update: To clarify my question, changes may happen on both the local hard drive and the USB drive, so the sync process must work in both directions.

    Read the article

  • Unresolved external symbol __imp____glewGenerateMipmap

    - by Tsvetan
    This error is given by Visual Studio 2010 when I try to compile my C++ code. I have added 'glew32.lib' and 'freeglut.lib' to Additional Dependencies, for both Release and Debug. I have also included the header files. I have searched the Internet and found only one forum post, but it isn't the solution I am looking for... My library is dynamic, so GLEW_STATIC is not an option. So, can you give me a solution for this problem?

    Read the article

  • Implementing Master-Detail functionality with the ASPxGridView control

    - by nikolaosk
    I have been involved with an ASP.NET project recently, which I have implemented using the awesome DevExpress ASP.NET controls. In this post I will show you how to implement Master-Detail functionality using the ASPxGridView control. If you want to implement this example you need to download the trial version of these controls, unless you are a licensed holder of DevExpress products. We will need a database to work with. I will use the AdventureWorksLT database. If you need the installation scripts for the...(read more)

    Read the article

  • Silverlight Cream for April 28, 2010 -- #850

    - by Dave Campbell
    In this Issue: Giorgetti Alessandro, Alexander Strauss, Mahesh Sabnis, Andrea Boschin, Maxim Goldin, Peter Torr, Wolf Schmidt, and Marlon Grech. Shoutout: Koen Zwikstra announced a SL4 update: Silverlight Spy 3.0.0.11 Adam Kinney posted a WTF Step by Step guide to installing Silverlight Tools David Makogon posted his materials from a presentation: RockNUG April 2010 Materials: Silverlight 4 From SilverlightCream.com: Silverlight, M-V-VM ... and IoC - part 4 Giorgetti Alessandro isn't wasting any time... he's already gotten Part 4 of his MVVM, IoC, and Silverlight series up. He's discussing commanding. He gives some good external links and develops in his own direction as well. Application Partitioning with MEF, Silverlight and Windows Azure – Part II Alexander Strauss has the second and final part of his MEF/Silverlight/Azuer posts up, describing getting XAP information from Azure Blob storage. Simple Databinding and 3-D Features using Silverlight in Windows Phone 7 (WP7) Mahesh Sabnis has a post up combining DataBinding and 3D displays on WP7 ... good long tutorial and source. Keeping an ObservableCollection sorted with a method override Andrea Boschin details the reasons behind his need for having a sorted ObservableCollection, then hands over the code he used to do so. VS2010: Silverlight 4 profiling Maxim Goldin posted about profiling Silverlight 4 in VS2010. It's not overly straightforward but once you do it a couple times, not a big deal ... check out the comments as well. Peter Torr: Mock Location APIs from my Mix10 Talk A discussion came up on the insider's list this morning asking about Location Service in the emulator. Laurent Bugnion pointed us at Peter Torr's Mock Location from his MIX10 talk. Finding the "real" templates and generic.xaml in Silverlight core or library assemblies, by using .NET Reflector Wolf Schmidt at the Silverlight SDK has a post up about using .NET Reflector to rat around in Silverlight core or library assemblies. How does MEFedMVVM compose the catalogs and how can I override the behavior? – MEFedMVVM Part 4 Marlon Grech has Part 4 of his MEFedMVVM series up and this one is for advanced use of MEFedMVVM... where you're writing a composer and how that would be different for Silverlight and WPF... oh yeah, and what is a composer as well :) Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • Syncing Data with a Server using Silverlight and HTTP Polling Duplex

    - by dwahlin
    Many applications have the need to stay in-sync with data provided by a service. Although web applications typically rely on standard polling techniques to check if data has changed, Silverlight provides several interesting options for keeping an application in-sync that rely on server “push” technologies. A few years back I wrote several blog posts covering different “push” technologies available in Silverlight that rely on sockets or HTTP Polling Duplex. We recently had a project that looked like it could benefit from pushing data from a server to one or more clients so I thought I’d revisit the subject and provide some updates to the original code posted. If you’ve worked with AJAX before in Web applications then you know that until browsers fully support web sockets or other duplex (bi-directional communication) technologies that it’s difficult to keep applications in-sync with a server without relying on polling. The problem with polling is that you have to check for changes on the server on a timed-basis which can often be wasteful and take up unnecessary resources. With server “push” technologies, data can be pushed from the server to the client as it changes. Once the data is received, the client can update the user interface as appropriate. Using “push” technologies allows the client to listen for changes from the data but stay 100% focused on client activities as opposed to worrying about polling and asking the server if anything has changed. Silverlight provides several options for pushing data from a server to a client including sockets, TCP bindings and HTTP Polling Duplex.  Each has its own strengths and weaknesses as far as performance and setup work with HTTP Polling Duplex arguably being the easiest to setup and get going.  In this article I’ll demonstrate how HTTP Polling Duplex can be used in Silverlight 4 applications to push data and show how you can create a WCF server that provides an HTTP Polling Duplex binding that a Silverlight client can consume.   What is HTTP Polling Duplex? Technologies that allow data to be pushed from a server to a client rely on duplex functionality. Duplex (or bi-directional) communication allows data to be passed in both directions.  A client can call a service and the server can call the client. HTTP Polling Duplex (as its name implies) allows a server to communicate with a client without forcing the client to constantly poll the server. It has the benefit of being able to run on port 80 making setup a breeze compared to the other options which require specific ports to be used and cross-domain policy files to be exposed on port 943 (as with sockets and TCP bindings). Having said that, if you’re looking for the best speed possible then sockets and TCP bindings are the way to go. But, they’re not the only game in town when it comes to duplex communication. The first time I heard about HTTP Polling Duplex (initially available in Silverlight 2) I wasn’t exactly sure how it was any better than standard polling used in AJAX applications. I read the Silverlight SDK, looked at various resources and generally found the following definition unhelpful as far as understanding the actual benefits that HTTP Polling Duplex provided: "The Silverlight client periodically polls the service on the network layer, and checks for any new messages that the service wants to send on the callback channel. The service queues all messages sent on the client callback channel and delivers them to the client when the client polls the service." 
Although the previous definition explained the overall process, it sounded as if standard polling was used. Fortunately, Microsoft’s Scott Guthrie provided me with a more clear definition several years back that explains the benefits provided by HTTP Polling Duplex quite well (used with his permission): "The [HTTP Polling Duplex] duplex support does use polling in the background to implement notifications – although the way it does it is different than manual polling. It initiates a network request, and then the request is effectively “put to sleep” waiting for the server to respond (it doesn’t come back immediately). The server then keeps the connection open but not active until it has something to send back (or the connection times out after 90 seconds – at which point the duplex client will connect again and wait). This way you are avoiding hitting the server repeatedly – but still get an immediate response when there is data to send." After hearing Scott’s definition the light bulb went on and it all made sense. A client makes a request to a server to check for changes, but instead of the request returning immediately, it parks itself on the server and waits for data. It’s kind of like waiting to pick up a pizza at the store. Instead of calling the store over and over to check the status, you sit in the store and wait until the pizza (the request data) is ready. Once it’s ready you take it back home (to the client). This technique provides a lot of efficiency gains over standard polling techniques even though it does use some polling of its own as a request is initially made from a client to a server. So how do you implement HTTP Polling Duplex in your Silverlight applications? Let’s take a look at the process by starting with the server. Creating an HTTP Polling Duplex WCF Service Creating a WCF service that exposes an HTTP Polling Duplex binding is straightforward as far as coding goes. Add some one way operations into an interface, create a client callback interface and you’re ready to go. The most challenging part comes into play when configuring the service to properly support the necessary binding and that’s more of a cut and paste operation once you know the configuration code to use. To create an HTTP Polling Duplex service you’ll need to expose server-side and client-side interfaces and reference the System.ServiceModel.PollingDuplex assembly (located at C:\Program Files (x86)\Microsoft SDKs\Silverlight\v4.0\Libraries\Server on my machine) in the server project. For the demo application I upgraded a basketball simulation service to support the latest polling duplex assemblies. The service simulates a simple basketball game using a Game class and pushes information about the game such as score, fouls, shots and more to the client as the game changes over time. Before jumping too far into the game push service, it’s important to discuss two interfaces used by the service to communicate in a bi-directional manner. The first is called IGameStreamService and defines the methods/operations that the client can call on the server (see Listing 1). The second is IGameStreamClient which defines the callback methods that a server can use to communicate with a client (see Listing 2).   [ServiceContract(Namespace = "Silverlight", CallbackContract = typeof(IGameStreamClient))] public interface IGameStreamService { [OperationContract(IsOneWay = true)] void GetTeamData(); } Listing 1. The IGameStreamService interface defines server operations that can be called on the server.   
[ServiceContract] public interface IGameStreamClient { [OperationContract(IsOneWay = true)] void ReceiveTeamData(List<Team> teamData); [OperationContract(IsOneWay = true, AsyncPattern=true)] IAsyncResult BeginReceiveGameData(GameData gameData, AsyncCallback callback, object state); void EndReceiveGameData(IAsyncResult result); } Listing 2. The IGameStreamClient interfaces defines client operations that a server can call.   The IGameStreamService interface is decorated with the standard ServiceContract attribute but also contains a value for the CallbackContract property.  This property is used to define the interface that the client will expose (IGameStreamClient in this example) and use to receive data pushed from the service. Notice that each OperationContract attribute in both interfaces sets the IsOneWay property to true. This means that the operation can be called and passed data as appropriate, however, no data will be passed back. Instead, data will be pushed back to the client as it’s available.  Looking through the IGameStreamService interface you can see that the client can request team data whereas the IGameStreamClient interface allows team and game data to be received by the client. One interesting point about the IGameStreamClient interface is the inclusion of the AsyncPattern property on the BeginReceiveGameData operation. I initially created this operation as a standard one way operation and it worked most of the time. However, as I disconnected clients and reconnected new ones game data wasn’t being passed properly. After researching the problem more I realized that because the service could take up to 7 seconds to return game data, things were getting hung up. By setting the AsyncPattern property to true on the BeginReceivedGameData operation and providing a corresponding EndReceiveGameData operation I was able to get around this problem and get everything running properly. I’ll provide more details on the implementation of these two methods later in this post. Once the interfaces were created I moved on to the game service class. The first order of business was to create a class that implemented the IGameStreamService interface. Since the service can be used by multiple clients wanting game data I added the ServiceBehavior attribute to the class definition so that I could set its InstanceContextMode to InstanceContextMode.Single (in effect creating a Singleton service object). Listing 3 shows the game service class as well as its fields and constructor.   [ServiceBehavior(ConcurrencyMode = ConcurrencyMode.Multiple, InstanceContextMode = InstanceContextMode.Single)] public class GameStreamService : IGameStreamService { object _Key = new object(); Game _Game = null; Timer _Timer = null; Random _Random = null; Dictionary<string, IGameStreamClient> _ClientCallbacks = new Dictionary<string, IGameStreamClient>(); static AsyncCallback _ReceiveGameDataCompleted = new AsyncCallback(ReceiveGameDataCompleted); public GameStreamService() { _Game = new Game(); _Timer = new Timer { Enabled = false, Interval = 2000, AutoReset = true }; _Timer.Elapsed += new ElapsedEventHandler(_Timer_Elapsed); _Timer.Start(); _Random = new Random(); }} Listing 3. The GameStreamService implements the IGameStreamService interface which defines a callback contract that allows the service class to push data back to the client. 
By implementing the IGameStreamService interface, GameStreamService must supply a GetTeamData() method which is responsible for supplying information about the teams that are playing as well as individual players.  GetTeamData() also acts as a client subscription method that tracks clients wanting to receive game data.  Listing 4 shows the GetTeamData() method. public void GetTeamData() { //Get client callback channel var context = OperationContext.Current; var sessionID = context.SessionId; var currClient = context.GetCallbackChannel<IGameStreamClient>(); context.Channel.Faulted += Disconnect; context.Channel.Closed += Disconnect; IGameStreamClient client; if (!_ClientCallbacks.TryGetValue(sessionID, out client)) { lock (_Key) { _ClientCallbacks[sessionID] = currClient; } } currClient.ReceiveTeamData(_Game.GetTeamData()); //Start timer which when fired sends updated score information to client if (!_Timer.Enabled) { _Timer.Enabled = true; } } Listing 4. The GetTeamData() method subscribes a given client to the game service and returns. The key the line of code in the GetTeamData() method is the call to GetCallbackChannel<IGameStreamClient>().  This method is responsible for accessing the calling client’s callback channel. The callback channel is defined by the IGameStreamClient interface shown earlier in Listing 2 and used by the server to communicate with the client. Before passing team data back to the client, GetTeamData() grabs the client’s session ID and checks if it already exists in the _ClientCallbacks dictionary object used to track clients wanting callbacks from the server. If the client doesn’t exist it adds it into the collection. It then pushes team data from the Game class back to the client by calling ReceiveTeamData().  Since the service simulates a basketball game, a timer is then started if it’s not already enabled which is then used to randomly send data to the client. When the timer fires, game data is pushed down to the client. Listing 5 shows the _Timer_Elapsed() method that is called when the timer fires as well as the SendGameData() method used to send data to the client. void _Timer_Elapsed(object sender, ElapsedEventArgs e) { int interval = _Random.Next(3000, 7000); lock (_Key) { _Timer.Interval = interval; _Timer.Enabled = false; } SendGameData(_Game.GetGameData()); } private void SendGameData(GameData gameData) { var cbs = _ClientCallbacks.Where(cb => ((IContextChannel)cb.Value).State == CommunicationState.Opened); for (int i = 0; i < cbs.Count(); i++) { var cb = cbs.ElementAt(i).Value; try { cb.BeginReceiveGameData(gameData, _ReceiveGameDataCompleted, cb); } catch (TimeoutException texp) { //Log timeout error } catch (CommunicationException cexp) { //Log communication error } } lock (_Key) _Timer.Enabled = true; } private static void ReceiveGameDataCompleted(IAsyncResult result) { try { ((IGameStreamClient)(result.AsyncState)).EndReceiveGameData(result); } catch (CommunicationException) { // empty } catch (TimeoutException) { // empty } } LIsting 5. _Timer_Elapsed is used to simulate time in a basketball game. When _Timer_Elapsed() fires the SendGameData() method is called which iterates through the clients wanting to be notified of changes. As each client is identified, their respective BeginReceiveGameData() method is called which ultimately pushes game data down to the client. Recall that this method was defined in the client callback interface named IGameStreamClient shown earlier in Listing 2. 
Notice that BeginReceiveGameData() accepts _ReceiveGameDataCompleted as its second parameter (an AsyncCallback delegate defined in the service class) and passes the client callback as the third parameter. The initial version of the sample application had a standard ReceiveGameData() method in the client callback interface. However, sometimes the client callbacks would work properly and sometimes they wouldn’t which was a little baffling at first glance. After some investigation I realized that I needed to implement an asynchronous pattern for client callbacks to work properly since 3 – 7 second delays are occurring as a result of the timer. Once I added the BeginReceiveGameData() and ReceiveGameDataCompleted() methods everything worked properly since each call was handled in an asynchronous manner. The final task that had to be completed to get the server working properly with HTTP Polling Duplex was adding configuration code into web.config. In the interest of brevity I won’t post all of the code here since the sample application includes everything you need. However, Listing 6 shows the key configuration code to handle creating a custom binding named pollingDuplexBinding and associate it with the service’s endpoint.   <bindings> <customBinding> <binding name="pollingDuplexBinding"> <binaryMessageEncoding /> <pollingDuplex maxPendingSessions="2147483647" maxPendingMessagesPerSession="2147483647" inactivityTimeout="02:00:00" serverPollTimeout="00:05:00"/> <httpTransport /> </binding> </customBinding> </bindings> <services> <service name="GameService.GameStreamService" behaviorConfiguration="GameStreamServiceBehavior"> <endpoint address="" binding="customBinding" bindingConfiguration="pollingDuplexBinding" contract="GameService.IGameStreamService"/> <endpoint address="mex" binding="mexHttpBinding" contract="IMetadataExchange" /> </service> </services>   Listing 6. Configuring an HTTP Polling Duplex binding in web.config and associating an endpoint with it. Calling the Service and Receiving “Pushed” Data Calling the service and handling data that is pushed from the server is a simple and straightforward process in Silverlight. Since the service is configured with a MEX endpoint and exposes a WSDL file, you can right-click on the Silverlight project and select the standard Add Service Reference item. After the web service proxy is created you may notice that the ServiceReferences.ClientConfig file only contains an empty configuration element instead of the normal configuration elements created when creating a standard WCF proxy. You can certainly update the file if you want to read from it at runtime but for the sample application I fed the service URI directly to the service proxy as shown next: var address = new EndpointAddress("http://localhost.:5661/GameStreamService.svc"); var binding = new PollingDuplexHttpBinding(); _Proxy = new GameStreamServiceClient(binding, address); _Proxy.ReceiveTeamDataReceived += _Proxy_ReceiveTeamDataReceived; _Proxy.ReceiveGameDataReceived += _Proxy_ReceiveGameDataReceived; _Proxy.GetTeamDataAsync(); This code creates the proxy and passes the endpoint address and binding to use to its constructor. It then wires the different receive events to callback methods and calls GetTeamDataAsync().  Calling GetTeamDataAsync() causes the server to store the client in the server-side dictionary collection mentioned earlier so that it can receive data that is pushed.  
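One practical note on the address handling above: because ServiceReferences.ClientConfig ends up essentially empty for a polling duplex proxy, the endpoint is hard-coded in the snippet. An alternative worth considering is deriving the service URI from wherever the XAP was downloaded, so the same build runs unchanged on development, test and production servers. The sketch below assumes the usual project layout where the .xap sits in ClientBin one level below the web application root; adjust the relative path if your layout differs.

// Build the service address relative to where the XAP came from instead of
// hard-coding localhost. Application.Current.Host.Source points at the .xap
// (for example http://server/app/ClientBin/GameClient.xap), so "../" steps
// out of ClientBin back to the web application root.
var xapUri = Application.Current.Host.Source;
var serviceUri = new Uri(xapUri, "../GameStreamService.svc"); // assumed layout

var address = new EndpointAddress(serviceUri);
var binding = new PollingDuplexHttpBinding();
_Proxy = new GameStreamServiceClient(binding, address);
_Proxy.ReceiveTeamDataReceived += _Proxy_ReceiveTeamDataReceived;
_Proxy.ReceiveGameDataReceived += _Proxy_ReceiveGameDataReceived;
_Proxy.GetTeamDataAsync();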
As the server-side timer fires and game data is pushed to the client, the user interface is updated as shown in Listing 7. Listing 8 shows the _Proxy_ReceiveGameDataReceived() method responsible for handling the data and calling UpdateGameData() to process it.   Listing 7. The Silverlight interface. Game data is pushed from the server to the client using HTTP Polling Duplex. void _Proxy_ReceiveGameDataReceived(object sender, ReceiveGameDataReceivedEventArgs e) { UpdateGameData(e.gameData); } private void UpdateGameData(GameData gameData) { //Update Score this.tbTeam1Score.Text = gameData.Team1Score.ToString(); this.tbTeam2Score.Text = gameData.Team2Score.ToString(); //Update ball visibility if (gameData.Action != ActionsEnum.Foul) { if (tbTeam1.Text == gameData.TeamOnOffense) { AnimateBall(this.BB1, this.BB2); } else //Team 2 { AnimateBall(this.BB2, this.BB1); } } if (this.lbActions.Items.Count > 9) this.lbActions.Items.Clear(); this.lbActions.Items.Add(gameData.LastAction); if (this.lbActions.Visibility == Visibility.Collapsed) this.lbActions.Visibility = Visibility.Visible; } private void AnimateBall(Image onBall, Image offBall) { this.FadeIn.Stop(); Storyboard.SetTarget(this.FadeInAnimation, onBall); Storyboard.SetTarget(this.FadeOutAnimation, offBall); this.FadeIn.Begin(); } Listing 8. As the server pushes game data, the client’s _Proxy_ReceiveGameDataReceived() method is called to process the data. In a real-life application I’d go with a ViewModel class to handle retrieving team data, setup data bindings and handle data that is pushed from the server. However, for the sample application I wanted to focus on HTTP Polling Duplex and keep things as simple as possible.   Summary Silverlight supports three options when duplex communication is required in an application including TCP bindins, sockets and HTTP Polling Duplex. In this post you’ve seen how HTTP Polling Duplex interfaces can be created and implemented on the server as well as how they can be consumed by a Silverlight client. HTTP Polling Duplex provides a nice way to “push” data from a server while still allowing the data to flow over port 80 or another port of your choice.   Sample Application Download
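As a closing aside, the "parked request" behavior described at the start of the article is not specific to Silverlight or WCF. The following is a minimal conceptual sketch of a long-poll client loop in plain .NET, included only to contrast with naive timed polling; it is not how PollingDuplexHttpBinding is implemented, and the endpoint plus the 204 convention are assumptions made for the example.

using System;
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class LongPollClient
{
    // One shared HttpClient; the generous timeout lets the server hold a
    // request open while it waits for data to push.
    private static readonly HttpClient Http = new HttpClient
    {
        Timeout = TimeSpan.FromSeconds(100)
    };

    // Conceptual long-poll loop: each request "parks" on the server until data
    // is available (or the server times out), then is immediately re-issued, so
    // the client reacts as soon as data exists without polling on a short timer.
    public static async Task PollAsync(Uri endpoint, Action<string> onData, CancellationToken ct)
    {
        while (!ct.IsCancellationRequested)
        {
            try
            {
                var response = await Http.GetAsync(endpoint, ct);

                // Assumed convention: 204 means the server gave up waiting with
                // nothing new, so we simply park another request.
                if (response.StatusCode == HttpStatusCode.NoContent)
                    continue;

                response.EnsureSuccessStatusCode();
                onData(await response.Content.ReadAsStringAsync());
            }
            catch (HttpRequestException)
            {
                // Transient network failure: back off briefly before reconnecting.
                await Task.Delay(TimeSpan.FromSeconds(2), ct);
            }
        }
    }
}

The server side of such a sketch simply holds each request until it has data, which is essentially what the polling duplex channel automates for you, along with message framing and the callback contract.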

    Read the article

  • Hugh Bin-Haad's SQL Rumours

    - by Paul Nielsen
    Insider rumours and gossip from the murky world of the Database Industry, and from the colourful characters that inhabit it: http://www.simple-talk.com/sql/editors-corner/insider-insights/ ...(read more)

    Read the article

  • Guidance and Pricing for MSDN 2010

    - by John Alexander
    Sorry for the rather lengthy post here. I get asked this all the time so I decided to post it…Visual Studio 2010 editions will be available on April 12, 2010. Product Features Professional with MSDN Essentials Professional with MSDN Premium with MSDN Ultimate with MSDN Test Professional with MSDN Debugging and Diagnostics IntelliTrace (Historical Debugger)         Static Code Analysis       Code Metrics       Profiling       Debugger   Testing Tools Unit Testing   Code Coverage       Test Impact Analysis       Coded UI Test       Web Performance Testing         Load Testing1         Microsoft Test Manager 2010       Test Case Management2       Manual Test Execution       Fast-Forward for Manual Testing       Lab Management Configuration3       Integrated Development Environment Multiple Monitor Support   Multi-Targeting   One Click Web Deployment   JavaScript and jQuery Support   Extensible WPF-Based Environment Database Development Database Deployment       Database Change Management2       Database Unit Testing       Database Test Data Generation       Data Access   Development Platform Support Windows Development   Web Development   Office and SharePoint Development   Cloud Development   Customizable Development Experience   Architecture and Modeling Architecture Explorer         UML® 2.0 Compliant Diagrams (Activity, Use Case, Sequence, Class, Component)         Layer Diagram and Dependency Validation         Read-only diagrams (UML, Layer, DGML Graphs)         Lab Management Virtual environment setup & tear down3       Provision environment from template3       Checkpoint environment3       Team Foundation Server Version Control2   Work Item Tracking2   Build Automation2   Team Portal2   Reporting & Business Intelligence2   Agile Planning Workbook2   Microsoft Visual Studio Team Explorer 2010   Test Case Management2       MSDN Subscription – Software and Services for Production Use Windows Azure Platform 20 hrs/mo † 50 hrs/mo † 100 hrs/mo † 250 hrs/mo † n/a Microsoft Visual Studio Team Foundation Server 2010   Microsoft Visual Studio Team Foundation Server 2010 CAL   1 1 1 1 Microsoft Expression Studio 3       Microsoft Office Professional Plus 2010, Project Professional 2010, Visio Premium 2010 (following Office 2010 launch)       MSDN Subscription – Software for Development and Testing 4 Windows 7, Windows Server 2008 R2 and SQL Server 2008 Toolkits, Software Development Kits, Driver Development Kits Previous versions of Windows (client and server operation systems)   Previous versions of Microsoft SQL Server   Microsoft Office       Microsoft Dynamics       All other Servers       Windows Embedded operating systems       Teamprise         MSDN Subscription – Other Benefits Technical support incidents 0 2 4 4 2 Priority support in MSDN Forums Microsoft e-learning collections (typically 10 courses or 20 hours) 0 1 2 2 1 MSDN Flash newsletter MSDN Online Concierge MSDN Magazine   System Requirements View View View View View Buy from (MSRP) $799 $1,199 $5,469 $11,899 $2,169 Renew from (MSRP) $549 (upgrade) $799 $2,299 $3,799 $899 † Availability varies by country and subscription level.  Details available on the MSDN site 1. May require one or more Microsoft Visual Studio Load Test Virtual User Pack 2010 2. Requires Team Foundation Server and a Team Foundation Server CAL 3. Requires Microsoft Visual Studio Lab Management 2010 4. Per-user license allows unlimited installations and use for designing, developing, testing, and demonstrating applications. 
UML is a registered trademark of Object Management Group, Inc. Windows is either a registered trademark or trademark of Microsoft Corporation in the United States and/or other countries.

    Read the article

  • New DMV… not yet

    - by Michael Zilberstein
    Downloaded and installed the new toy, and while reading BOL stumbled upon a new, extremely useful DMV: sys.dm_exec_query_profiles. This DMV enables a DBA to monitor query progress while a statement is being executed. Counters in the DMV are per plan operator per thread, so we’ll be able to monitor in real time which thread (even for parallel processing) is working on which node in the plan, or find heavy operations “post mortem”. We all know the uncomfortable feeling when some heavy query runs and the boss starts asking...(read more)
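    For anyone who wants to poll this DMV from monitoring code rather than SSMS, a rough sketch of reading a few of its columns with ADO.NET follows. The column names used are the documented ones (session_id, node_id, physical_operator_name, row_count, estimate_row_count); the connection string is a placeholder, and keep in mind that in the initial release the DMV typically only reports on sessions that request an actual plan (for example with SET STATISTICS XML or PROFILE enabled).

    using System;
    using System.Data.SqlClient;

    class QueryProgressMonitor
    {
        static void Main()
        {
            // Placeholder connection string - point it at the instance you want to watch.
            const string connectionString = "Server=.;Database=master;Integrated Security=true";

            const string sql = @"
                SELECT session_id, node_id, physical_operator_name,
                       row_count, estimate_row_count
                FROM   sys.dm_exec_query_profiles
                ORDER  BY session_id, node_id;";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // Rough progress indicator: actual rows vs. optimizer estimate per operator.
                        Console.WriteLine("spid {0} node {1} {2}: {3}/{4} rows",
                            reader["session_id"], reader["node_id"],
                            reader["physical_operator_name"],
                            reader["row_count"], reader["estimate_row_count"]);
                    }
                }
            }
        }
    }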

    Read the article

  • Global Cache CR Requested But Current Block Received

    - by Liu Maclean
    ????????«MINSCN?Cache Fusion Read Consistent» ????,???????????? ??????????????????: SQL> select * from V$version; BANNER -------------------------------------------------------------------------------- Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production PL/SQL Release 11.2.0.3.0 - Production CORE 11.2.0.3.0 Production TNS for Linux: Version 11.2.0.3.0 - Production NLSRTL Version 11.2.0.3.0 - Production SQL> select count(*) from gv$instance; COUNT(*) ---------- 2 SQL> select * from global_name; GLOBAL_NAME -------------------------------------------------------------------------------- www.oracledatabase12g.com ?11gR2 2??RAC??????????status???XG,????Xcurrent block???INSTANCE 2?hold?,?????INSTANCE 1?????????,?????: SQL> select * from test; ID ---------- 1 2 SQL> select dbms_rowid.rowid_block_number(rowid),dbms_rowid.rowid_relative_fno(rowid) from test; DBMS_ROWID.ROWID_BLOCK_NUMBER(ROWID) DBMS_ROWID.ROWID_RELATIVE_FNO(ROWID) ------------------------------------ ------------------------------------ 89233 1 89233 1 SQL> alter system flush buffer_cache; System altered. INSTANCE 1 Session A: SQL> update test set id=id+1 where id=1; 1 row updated. INSTANCE 1 Session B: SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0; STATE CR_SCN_BAS ---------- ---------- 1 0 3 1755287 SQL> oradebug setmypid; Statement processed. SQL> oradebug dump gc_elements 255; Statement processed. SQL> oradebug tracefile_name; /s01/orabase/diag/rdbms/vprod/VPROD1/trace/VPROD1_ora_19111.trc GLOBAL CACHE ELEMENT DUMP (address: 0xa4ff3080): id1: 0x15c91 id2: 0x1 pkey: OBJ#76896 block: (1/89233) lock: X rls: 0x0 acq: 0x0 latch: 3 flags: 0x20 fair: 0 recovery: 0 fpin: 'kdswh11: kdst_fetch' bscn: 0x0.146e20 bctx: (nil) write: 0 scan: 0x0 lcp: (nil) lnk: [NULL] lch: [0xa9f6a6f8,0xa9f6a6f8] seq: 32 hist: 58 145:0 118 66 144:0 192 352 197 48 121 113 424 180 58 LIST OF BUFFERS LINKED TO THIS GLOBAL CACHE ELEMENT: flg: 0x02000001 lflg: 0x1 state: XCURRENT tsn: 0 tsh: 2 addr: 0xa9f6a5c8 obj: 76896 cls: DATA bscn: 0x0.1ac898 BH (0xa9f6a5c8) file#: 1 rdba: 0x00415c91 (1/89233) class: 1 ba: 0xa9e56000 set: 5 pool: 3 bsz: 8192 bsi: 0 sflg: 3 pwc: 0,15 dbwrid: 0 obj: 76896 objn: 76896 tsn: 0 afn: 1 hint: f hash: [0x91f4e970,0xbae9d5b8] lru: [0x91f58848,0xa9f6a828] lru-flags: debug_dump obj-flags: object_ckpt_list ckptq: [0x9df6d1d8,0xa9f6a740] fileq: [0xa2ece670,0xbdf4ed68] objq: [0xb4964e00,0xb4964e00] objaq: [0xb4964de0,0xb4964de0] st: XCURRENT md: NULL fpin: 'kdswh11: kdst_fetch' tch: 2 le: 0xa4ff3080 flags: buffer_dirty redo_since_read LRBA: [0x19.5671.0] LSCN: [0x0.1ac898] HSCN: [0x0.1ac898] HSUB: [1] buffer tsn: 0 rdba: 0x00415c91 (1/89233) scn: 0x0000.001ac898 seq: 0x01 flg: 0x00 tail: 0xc8980601 frmt: 0x02 chkval: 0x0000 type: 0x06=trans data ??????block: (1/89233)?GLOBAL CACHE ELEMENT DUMP?LOCK????X ??XG , ??????Current Block????Instance??modify???,????????????? ????Instance 2 ????: Instance 2 Session C: SQL> update test set id=id+1 where id=2; 1 row updated. Instance 2 Session D: SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0; STATE CR_SCN_BAS ---------- ---------- 1 0 3 1756658 SQL> oradebug setmypid; Statement processed. SQL> oradebug dump gc_elements 255; Statement processed. 
SQL> oradebug tracefile_name; /s01/orabase/diag/rdbms/vprod/VPROD2/trace/VPROD2_ora_13038.trc GLOBAL CACHE ELEMENT DUMP (address: 0x89fb25a0): id1: 0x15c91 id2: 0x1 pkey: OBJ#76896 block: (1/89233) lock: XG rls: 0x0 acq: 0x0 latch: 3 flags: 0x20 fair: 0 recovery: 0 fpin: 'kduwh01: kdusru' bscn: 0x0.1acdf3 bctx: (nil) write: 0 scan: 0x0 lcp: (nil) lnk: [NULL] lch: [0x96f4cf80,0x96f4cf80] seq: 61 hist: 324 21 143:0 19 16 352 329 144:6 14 7 352 197 LIST OF BUFFERS LINKED TO THIS GLOBAL CACHE ELEMENT: flg: 0x0a000001 state: XCURRENT tsn: 0 tsh: 1 addr: 0x96f4ce50 obj: 76896 cls: DATA bscn: 0x0.1acdf6 BH (0x96f4ce50) file#: 1 rdba: 0x00415c91 (1/89233) class: 1 ba: 0x96bd4000 set: 5 pool: 3 bsz: 8192 bsi: 0 sflg: 2 pwc: 0,15 dbwrid: 0 obj: 76896 objn: 76896 tsn: 0 afn: 1 hint: f hash: [0x96ee1fe8,0xbae9d5b8] lru: [0x96f4d0b0,0x96f4cdc0] obj-flags: object_ckpt_list ckptq: [0xbdf519b8,0x96f4d5a8] fileq: [0xbdf519d8,0xbdf519d8] objq: [0xb4a47b90,0xb4a47b90] objaq: [0x96f4d0e8,0xb4a47b70] st: XCURRENT md: NULL fpin: 'kduwh01: kdusru' tch: 1 le: 0x89fb25a0 flags: buffer_dirty redo_since_read remote_transfered LRBA: [0x11.9e18.0] LSCN: [0x0.1acdf6] HSCN: [0x0.1acdf6] HSUB: [1] buffer tsn: 0 rdba: 0x00415c91 (1/89233) scn: 0x0000.001acdf6 seq: 0x01 flg: 0x00 tail: 0xcdf60601 frmt: 0x02 chkval: 0x0000 type: 0x06=trans data GCS CLIENT 0x89fb2618,6 resp[(nil),0x15c91.1] pkey 76896.0 grant 2 cvt 0 mdrole 0x42 st 0x100 lst 0x20 GRANTQ rl G0 master 1 owner 2 sid 0 remote[(nil),0] hist 0x94121c601163423c history 0x3c.0x4.0xd.0xb.0x1.0xc.0x7.0x9.0x14.0x1. cflag 0x0 sender 1 flags 0x0 replay# 0 abast (nil).x0.1 dbmap (nil) disk: 0x0000.00000000 write request: 0x0000.00000000 pi scn: 0x0000.00000000 sq[(nil),(nil)] msgseq 0x1 updseq 0x0 reqids[6,0,0] infop (nil) lockseq x2b8 pkey 76896.0 hv 93 [stat 0x0, 1->1, wm 32768, RMno 0, reminc 18, dom 0] kjga st 0x4, step 0.0.0, cinc 20, rmno 6, flags 0x0 lb 0, hb 0, myb 15250, drmb 15250, apifrz 0 ?Instance 2??????block: (1/89233)? GLOBAL CACHE ELEMENT Lock Convert?lock: XG ????GC_ELEMENTS DUMP???XCUR Cache Fusion?,???????X$ VIEW,??? X$LE X$KJBR X$KJBL, ???X$ VIEW???????????????????: INSTANCE 2 Session D: SELECT * FROM x$le WHERE le_addr IN (SELECT le_addr FROM x$bh WHERE obj IN (SELECT data_object_id FROM dba_objects WHERE owner = 'SYS' AND object_name = 'TEST') AND class = 1 AND state != 3); ADDR INDX INST_ID LE_ADDR LE_ID1 LE_ID2 ---------------- ---------- ---------- ---------------- ---------- ---------- LE_RLS LE_ACQ LE_FLAGS LE_MODE LE_WRITE LE_LOCAL LE_RECOVERY ---------- ---------- ---------- ---------- ---------- ---------- ----------- LE_BLKS LE_TIME LE_KJBL ---------- ---------- ---------------- 00007F94CA14CF60 7003 2 0000000089FB25A0 89233 1 0 0 32 2 0 1 0 1 0 0000000089FB2618 PCM Resource NAME?[ID1][ID2],[BL]???, ID1?ID2 ??blockno? fileno????, ??????????GC_elements dump?? id1: 0x15c91 id2: 0×1 pkey: OBJ#76896 block: (1/89233)?? ,?  kjblname ? 
kjbrname ??”[0x15c91][0x1],[BL]” ??: INSTANCE 2 Session D: SQL> set linesize 80 pagesize 1400 SQL> SELECT * 2 FROM x$kjbl l 3 WHERE l.kjblname LIKE '%[0x15c91][0x1],[BL]%'; ADDR INDX INST_ID KJBLLOCKP KJBLGRANT KJBLREQUE ---------------- ---------- ---------- ---------------- --------- --------- KJBLROLE KJBLRESP KJBLNAME ---------- ---------------- ------------------------------ KJBLNAME2 KJBLQUEUE ------------------------------ ---------- KJBLLOCKST KJBLWRITING ---------------------------------------------------------------- ----------- KJBLREQWRITE KJBLOWNER KJBLMASTER KJBLBLOCKED KJBLBLOCKER KJBLSID KJBLRDOMID ------------ ---------- ---------- ----------- ----------- ---------- ---------- KJBLPKEY ---------- 00007F94CA22A288 451 2 0000000089FB2618 KJUSEREX KJUSERNL 0 00 [0x15c91][0x1],[BL][ext 0x0,0x 89233,1,BL 0 GRANTED 0 0 1 0 0 0 0 0 76896 SQL> SELECT r.* FROM x$kjbr r WHERE r.kjbrname LIKE '%[0x15c91][0x1],[BL]%'; no rows selected Instance 1 session B: SQL> SELECT r.* FROM x$kjbr r WHERE r.kjbrname LIKE '%[0x15c91][0x1],[BL]%'; ADDR INDX INST_ID KJBRRESP KJBRGRANT KJBRNCVL ---------------- ---------- ---------- ---------------- --------- --------- KJBRROLE KJBRNAME KJBRMASTER KJBRGRANTQ ---------- ------------------------------ ---------- ---------------- KJBRCVTQ KJBRWRITER KJBRSID KJBRRDOMID KJBRPKEY ---------------- ---------------- ---------- ---------- ---------- 00007F801ACA68F8 1355 1 00000000B5A62AE0 KJUSEREX KJUSERNL 0 [0x15c91][0x1],[BL][ext 0x0,0x 0 00000000B48BB330 00 00 0 0 76896 ??????Instance 1???block: (1/89233),??????Instance 2 build cr block ????Instance 1, ?????????? ????? Instance 1? Foreground Process ? Instance 2?LMS??????RAC  TRACE: Instance 2: [oracle@vrh2 ~]$ ps -ef|grep ora_lms|grep -v grep oracle 23364 1 0 Apr29 ? 00:33:15 ora_lms0_VPROD2 SQL> oradebug setospid 23364 Oracle pid: 13, Unix process pid: 23364, image: [email protected] (LMS0) SQL> oradebug event 10046 trace name context forever,level 8:10708 trace name context forever,level 103: trace[rac.*] disk high; Statement processed. SQL> oradebug tracefile_name /s01/orabase/diag/rdbms/vprod/VPROD2/trace/VPROD2_lms0_23364.trc Instance 1 session B : SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0; STATE CR_SCN_BAS ---------- ---------- 3 1756658 3 1756661 3 1755287 Instance 1 session A : SQL> alter session set events '10046 trace name context forever,level 8:10708 trace name context forever,level 103: trace[rac.*] disk high'; Session altered. SQL> select * from test; ID ---------- 2 2 SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0; STATE CR_SCN_BAS ---------- ---------- 3 1761520 ?x$BH?????,???????Instance 1???build??CR block,????? 
TRACE ??: Instance 1 foreground Process: PARSING IN CURSOR #140336527348792 len=18 dep=0 uid=0 oct=3 lid=0 tim=1335939136125254 hv=1689401402 ad='b1a4c828' sqlid='c99yw1xkb4f1u' select * from test END OF STMT PARSE #140336527348792:c=2999,e=2860,p=0,cr=0,cu=0,mis=1,r=0,dep=0,og=1,plh=1357081020,tim=1335939136125253 EXEC #140336527348792:c=0,e=40,p=0,cr=0,cu=0,mis=0,r=0,dep=0,og=1,plh=1357081020,tim=1335939136125373 WAIT #140336527348792: nam='SQL*Net message to client' ela= 6 driver id=1650815232 #bytes=1 p3=0 obj#=0 tim=1335939136125420 *** 2012-05-02 02:12:16.125 kclscrs: req=0 block=1/89233 2012-05-02 02:12:16.125574 : kjbcro[0x15c91.1 76896.0][4] *** 2012-05-02 02:12:16.125 kclscrs: req=0 typ=nowait-abort *** 2012-05-02 02:12:16.125 kclscrs: bid=1:3:1:0:f:1e:0:0:10:0:0:0:1:2:4:1:20:0:0:0:c3:49:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:4:3:2:1:2:0:1c:0:4d:26:a3:52:0:0:0:0:c7:c:ca:62:c3:49:0:0:0:0:1:0:14:8e:47:76:1:2:dc:5:a9:fe:17:75:0:0:0:0:0:0:0:0:0:0:0:0:99:ed:0:0:0:0:0:0:10:0:0:0 2012-05-02 02:12:16.125718 : kjbcro[0x15c91.1 76896.0][4] 2012-05-02 02:12:16.125751 : GSIPC:GMBQ: buff 0xba0ee018, queue 0xbb79a7b8, pool 0x60013fa0, freeq 0, nxt 0xbb79a7b8, prv 0xbb79a7b8 2012-05-02 02:12:16.125780 : kjbsentscn[0x0.1ae0f0][to 2] 2012-05-02 02:12:16.125806 : GSIPC:SENDM: send msg 0xba0ee088 dest x20001 seq 177740 type 36 tkts xff0000 mlen x1680198 2012-05-02 02:12:16.125918 : kjbmscr(0x15c91.1)reqid=0x8(req 0xa4ff30f8)(rinst 1)hldr 2(infosz 200)(lseq x2b8) 2012-05-02 02:12:16.126959 : GSIPC:KSXPCB: msg 0xba0ee088 status 30, type 36, dest 2, rcvr 1 *** 2012-05-02 02:12:16.127 kclwcrs: wait=0 tm=1233 *** 2012-05-02 02:12:16.127 kclwcrs: got 1 blocks from ksxprcv WAIT #140336527348792: nam='gc cr block 2-way' ela= 1233 p1=1 p2=89233 p3=1 obj#=76896 tim=1335939136127199 2012-05-02 02:12:16.127272 : kjbcrcomplete[0x15c91.1 76896.0][0] 2012-05-02 02:12:16.127309 : kjbrcvdscn[0x0.1ae0f0][from 2][idx 2012-05-02 02:12:16.127329 : kjbrcvdscn[no bscn <= rscn 0x0.1ae0f0][from 2] ???? kjbcro[0x15c91.1 76896.0][4] kjbsentscn[0x0.1ae0f0][to 2] ?Instance 2??SCN=1ae0f0=1761520? block: (1/89233),???’gc cr block 2-way’ ??,?????????CR block? 
Instance 2 LMS TRACE 2012-05-02 02:12:15.634057 : GSIPC:RCVD: ksxp msg 0x7f16e1598588 sndr 1 seq 0.177740 type 36 tkts 0 2012-05-02 02:12:15.634094 : GSIPC:RCVD: watq msg 0x7f16e1598588 sndr 1, seq 177740, type 36, tkts 0 2012-05-02 02:12:15.634108 : GSIPC:TKT: collect msg 0x7f16e1598588 from 1 for rcvr -1, tickets 0 2012-05-02 02:12:15.634162 : kjbrcvdscn[0x0.1ae0f0][from 1][idx 2012-05-02 02:12:15.634186 : kjbrcvdscn[no bscn1, wm 32768, RMno 0, reminc 18, dom 0] kjga st 0x4, step 0.0.0, cinc 20, rmno 6, flags 0x0 lb 0, hb 0, myb 15250, drmb 15250, apifrz 0 GCS CLIENT END 2012-05-02 02:12:15.635211 : kjbdowncvt[0x15c91.1 76896.0][1][options x0] 2012-05-02 02:12:15.635230 : GSIPC:AMBUF: rcv buff 0x7f16e1c56420, pool rcvbuf, rqlen 1103 2012-05-02 02:12:15.635308 : GSIPC:GPBMSG: new bmsg 0x7f16e1c56490 mb 0x7f16e1c56420 msg 0x7f16e1c564b0 mlen 152 dest x101 flushsz -1 2012-05-02 02:12:15.635334 : kjbmslset(0x15c91.1)) seq 0x4 reqid=0x6 (shadow 0xb48bb330.xb)(rsn 2)(mas@1) 2012-05-02 02:12:15.635355 : GSIPC:SPBMSG: send bmsg 0x7f16e1c56490 blen 184 msg 0x7f16e1c564b0 mtype 33 attr|dest x30101 bsz|fsz x1ffff 2012-05-02 02:12:15.635377 : GSIPC:SNDQ: enq msg 0x7f16e1c56490, type 65521 seq 118669, inst 1, receiver 1, queued 1 *** 2012-05-02 02:12:15.635 kclccctx: cleanup copy 0x7f16e1d94798 2012-05-02 02:12:15.635479 : [kjmpmsgi:compl][type 36][msg 0x7f16e1598588][seq 177740.0][qtime 0][ptime 1257] 2012-05-02 02:12:15.635511 : GSIPC:BSEND: flushing sndq 0xb491dd28, id 1, dcx 0xbc516778, inst 1, rcvr 1 qlen 0 1 2012-05-02 02:12:15.635536 : GSIPC:BSEND: no batch1 msg 0x7f16e1c56490 type 65521 len 184 dest (1:1) 2012-05-02 02:12:15.635557 : kjbsentscn[0x0.1ae0f1][to 1] 2012-05-02 02:12:15.635578 : GSIPC:SENDM: send msg 0x7f16e1c56490 dest x10001 seq 118669 type 65521 tkts x10002 mlen xb800e8 WAIT #0: nam='gcs remote message' ela= 180 waittime=1 poll=0 event=0 obj#=0 tim=1335939135635819 2012-05-02 02:12:15.635853 : GSIPC:RCVD: ksxp msg 0x7f16e167e0b0 sndr 1 seq 0.177741 type 32 tkts 0 2012-05-02 02:12:15.635875 : GSIPC:RCVD: watq msg 0x7f16e167e0b0 sndr 1, seq 177741, type 32, tkts 0 2012-05-02 02:12:15.636012 : GSIPC:TKT: collect msg 0x7f16e167e0b0 from 1 for rcvr -1, tickets 0 2012-05-02 02:12:15.636040 : kjbrcvdscn[0x0.1ae0f1][from 1][idx 2012-05-02 02:12:15.636060 : kjbrcvdscn[no bscn <= rscn 0x0.1ae0f1][from 1] 2012-05-02 02:12:15.636082 : GSIPC:TKT: dest (1:1) rtkt not acked 1  unassigned bufs 0  tkts 0  newbufs 0 2012-05-02 02:12:15.636102 : GSIPC:TKT: remove ctx dest (1:1) 2012-05-02 02:12:15.636125 : [kjmxmpm][type 32][seq 0.177741][msg 0x7f16e167e0b0][from 1] 2012-05-02 02:12:15.636146 : kjbmpocr(0xb0.6)seq 0x1,reqid=0x23a,(client 0x9fff7b58,0x1)(from 1)(lseq xdf0) 2????LMS????????? ??gcs remote message GSIPC ????SCN=[0x0.1ae0f0] block=1/89233???,??BAST kjbmpbast(0x15c91.1),?? block=1/89233??????? ??fairness??(?11.2.0.3???_fairness_threshold=2),?current block?KCL: F156: fairness downconvert,?Xcurrent DownConvert? Scurrent: Instance 2: SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0; STATE CR_SCN_BAS ---------- ---------- 2 0 3 1756658 ??Instance 2 LMS ?cr block??? kjbmslset(0x15c91.1)) ????SEND QUEUE GSIPC:SNDQ: enq msg 0x7f16e1c56490? ???????Instance 1???? block: (1/89233)??? 
??????: Instance 2: SQL> select CURRENT_RESULTS,LIGHT_WORKS from v$cr_block_server; CURRENT_RESULTS LIGHT_WORKS --------------- ----------- 29273 437 Instance 1 session A: SQL> SQL> select * from test; ID ---------- 2 2 SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0; STATE CR_SCN_BAS ---------- ---------- 3 1761942 3 1761932 1 0 3 1761520 Instance 2: SQL> select CURRENT_RESULTS,LIGHT_WORKS from v$cr_block_server; CURRENT_RESULTS LIGHT_WORKS --------------- ----------- 29274 437 select * from test END OF STMT PARSE #140336529675592:c=0,e=337,p=0,cr=0,cu=0,mis=0,r=0,dep=0,og=1,plh=1357081020,tim=1335939668940051 EXEC #140336529675592:c=0,e=96,p=0,cr=0,cu=0,mis=0,r=0,dep=0,og=1,plh=1357081020,tim=1335939668940204 WAIT #140336529675592: nam='SQL*Net message to client' ela= 5 driver id=1650815232 #bytes=1 p3=0 obj#=0 tim=1335939668940348 *** 2012-05-02 02:21:08.940 kclscrs: req=0 block=1/89233 2012-05-02 02:21:08.940676 : kjbcro[0x15c91.1 76896.0][5] *** 2012-05-02 02:21:08.940 kclscrs: req=0 typ=nowait-abort *** 2012-05-02 02:21:08.940 kclscrs: bid=1:3:1:0:f:21:0:0:10:0:0:0:1:2:4:1:20:0:0:0:c3:49:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:4:3:2:1:2:0:1f:0:4d:26:a3:52:0:0:0:0:c7:c:ca:62:c3:49:0:0:0:0:1:0:17:8e:47:76:1:2:dc:5:a9:fe:17:75:0:0:0:0:0:0:0:0:0:0:0:0:99:ed:0:0:0:0:0:0:10:0:0:0 2012-05-02 02:21:08.940799 : kjbcro[0x15c91.1 76896.0][5] 2012-05-02 02:21:08.940833 : GSIPC:GMBQ: buff 0xba0ee018, queue 0xbb79a7b8, pool 0x60013fa0, freeq 0, nxt 0xbb79a7b8, prv 0xbb79a7b8 2012-05-02 02:21:08.940859 : kjbsentscn[0x0.1ae28c][to 2] 2012-05-02 02:21:08.940870 : GSIPC:SENDM: send msg 0xba0ee088 dest x20001 seq 177810 type 36 tkts xff0000 mlen x1680198 2012-05-02 02:21:08.940976 : kjbmscr(0x15c91.1)reqid=0xa(req 0xa4ff30f8)(rinst 1)hldr 2(infosz 200)(lseq x2b8) 2012-05-02 02:21:08.941314 : GSIPC:KSXPCB: msg 0xba0ee088 status 30, type 36, dest 2, rcvr 1 *** 2012-05-02 02:21:08.941 kclwcrs: wait=0 tm=707 *** 2012-05-02 02:21:08.941 kclwcrs: got 1 blocks from ksxprcv 2012-05-02 02:21:08.941818 : kjbassume[0x15c91.1][sender 2][mymode x1][myrole x0][srole x0][flgs x0][spiscn 0x0.0][swscn 0x0.0] 2012-05-02 02:21:08.941852 : kjbrcvdscn[0x0.1ae28d][from 2][idx 2012-05-02 02:21:08.941871 : kjbrcvdscn[no bscn ??????????????SCN=[0x0.1ae28c]=1761932 Version?CR block, ????receive????Xcurrent Block??SCN=1ae28d=1761933,Instance 1???Xcurrent Block???build????????SCN=1761932?CR BLOCK, ????????Current block,?????????'gc current block 2-way'? 
?????????????request current block,?????kjbcro;?????Instance 2?LMS???????Current Block: Instance 2 LMS trace: 2012-05-02 02:21:08.448743 : GSIPC:RCVD: ksxp msg 0x7f16e14a4398 sndr 1 seq 0.177810 type 36 tkts 0 2012-05-02 02:21:08.448778 : GSIPC:RCVD: watq msg 0x7f16e14a4398 sndr 1, seq 177810, type 36, tkts 0 2012-05-02 02:21:08.448798 : GSIPC:TKT: collect msg 0x7f16e14a4398 from 1 for rcvr -1, tickets 0 2012-05-02 02:21:08.448816 : kjbrcvdscn[0x0.1ae28c][from 1][idx 2012-05-02 02:21:08.448834 : kjbrcvdscn[no bscn <= rscn 0x0.1ae28c][from 1] 2012-05-02 02:21:08.448857 : GSIPC:TKT: dest (1:1) rtkt not acked 2  unassigned bufs 0  tkts 0  newbufs 0 2012-05-02 02:21:08.448875 : GSIPC:TKT: remove ctx dest (1:1) 2012-05-02 02:21:08.448970 : [kjmxmpm][type 36][seq 0.177810][msg 0x7f16e14a4398][from 1] 2012-05-02 02:21:08.448993 : kjbmpbast(0x15c91.1) reqid=0x6 (req 0xa4ff30f8)(reqinst 1)(reqid 10)(flags x0) *** 2012-05-02 02:21:08.449 kclcrrf: req=48054 block=1/89233 *** 2012-05-02 02:21:08.449 kcl_compress_block: compressed: 6 free space: 7680 2012-05-02 02:21:08.449085 : kjbsentscn[0x0.1ae28d][to 1] 2012-05-02 02:21:08.449142 : kjbdeliver[to 1][0xa4ff30f8][10][current 1] 2012-05-02 02:21:08.449164 : kjbmssch(reqlock 0xa4ff30f8,10)(to 1)(bsz 344) 2012-05-02 02:21:08.449183 : GSIPC:AMBUF: rcv buff 0x7f16e18bcec8, pool rcvbuf, rqlen 1102 *** 2012-05-02 02:21:08.449 kclccctx: cleanup copy 0x7f16e1d94838 *** 2012-05-02 02:21:08.449 kcltouched: touch seconds 3271 *** 2012-05-02 02:21:08.449 kclgrantlk: req=48054 2012-05-02 02:21:08.449347 : [kjmpmsgi:compl][type 36][msg 0x7f16e14a4398][seq 177810.0][qtime 0][ptime 1119] WAIT #0: nam='gcs remote message' ela= 568 waittime=1 poll=0 event=0 obj#=0 tim=1335939668449962 2012-05-02 02:21:08.450001 : GSIPC:RCVD: ksxp msg 0x7f16e1bb22a0 sndr 1 seq 0.177811 type 32 tkts 0 2012-05-02 02:21:08.450024 : GSIPC:RCVD: watq msg 0x7f16e1bb22a0 sndr 1, seq 177811, type 32, tkts 0 2012-05-02 02:21:08.450043 : GSIPC:TKT: collect msg 0x7f16e1bb22a0 from 1 for rcvr -1, tickets 0 2012-05-02 02:21:08.450060 : kjbrcvdscn[0x0.1ae28e][from 1][idx 2012-05-02 02:21:08.450078 : kjbrcvdscn[no bscn <= rscn 0x0.1ae28e][from 1] 2012-05-02 02:21:08.450097 : GSIPC:TKT: dest (1:1) rtkt not acked 3  unassigned bufs 0  tkts 0  newbufs 0 2012-05-02 02:21:08.450116 : GSIPC:TKT: remove ctx dest (1:1) 2012-05-02 02:21:08.450136 : [kjmxmpm][type 32][seq 0.177811][msg 0x7f16e1bb22a0][from 1] 2012-05-02 02:21:08.450155 : kjbmpocr(0xb0.6)seq 0x1,reqid=0x23e,(client 0x9fff7b58,0x1)(from 1)(lseq xdf4) ???Instance 2??LMS???,???build cr block,??????Instance 1?????Current Block??????Instance 2??v$cr_block_server??????LIGHT_WORKS?????current block transfer??????,??????? CR server? Light Work Rule(Light Work Rule?8i Cr Server?????????,?Remote LMS?? build CR????????,resource holder?LMS???????block,????CR build If creating the consistent read version block involves too much work (such as reading blocks from disk), then the holder sends the block to the requestor, and the requestor completes the CR fabrication. The holder maintains a fairness counter of CR requests. After the fairness threshold is reached, the holder downgrades it to lock mode.)? ??????? CR Request ????Current Block?? ???:??????class?block,CR server??????? ??undo block?? undo header block?CR quest, LMS????Current Block, ????? ???? ??????? block cleanout? CR  Version??????? ???????? data blocks, ??????? CR quest  & CR received?(???????Light Work Rule,LMS"??"), ??Current Block??DownConvert???S lock,??LMS???????ship??current version?block? 
??????? , ?????? ,???????DownConvert?????”_fairness_threshold“???200,????Xcurrent Block?????Scurrent, ????LMS?????Current Version?Data Block: SQL> show parameter fair NAME TYPE VALUE ------------------------------------ ----------- ------------------------------ _fairness_threshold integer 200 Instance 1: SQL> update test set id=id+1 where id=4; 1 row updated. Instance 2: SQL> update test set id=id+1 where id=2; 1 row updated. SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0; STATE CR_SCN_BAS ---------- ---------- 1 0 3 1838166 ?Instance 1? ????,? ??instance 2? v$cr_block_server?? instance 1 SQL> select * from test; ID ---------- 10 3 instance 2: SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0; STATE CR_SCN_BAS ---------- ---------- 1 0 3 1883707 8 0 SQL> select * from test; ID ---------- 10 3 SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0; STATE CR_SCN_BAS ---------- ---------- 1 0 3 1883707 8 0 ................... SQL> / STATE CR_SCN_BAS ---------- ---------- 2 0 3 1883707 3 1883695 repeat cr request on Instance 1 SQL> / STATE CR_SCN_BAS ---------- ---------- 8 0 3 1883707 3 1883695 ??????_fairness_threshold????????,?????200 ????????CR serve??Downgrade?lock, ????data block? CR Request????Receive? Current Block?

    Read the article

< Previous Page | 480 481 482 483 484 485 486 487 488 489 490 491  | Next Page >