Search Results

Search found 30316 results on 1213 pages for 'sql doc'.


  • Finding Most Recent Order for a Particular Item

    - by visitor
    I'm trying to write a SQL Query for DB2 Version 8 which retrieves the most recent order of a specific part for a list of users. The query receives a parameter which contains a list of customerId numbers and the partId number. For example, Order Table OrderID PartID CustomerID OrderTime I initially wanted to try: Select * from Order where OrderId = ( Select orderId from Order where partId = #requestedPartId# and customerId = #customerId# Order by orderTime desc fetch first 1 rows only ); The problem with the above query is that it only works for a single user and my query needs to include multiple users. Does anyone have a suggestion about how I could expand the above query to work for multiple users? If I remove my "fetch first 1 rows only," then it will return all rows instead of the most recent. I also tried using Max(OrderTime), but I couldn't find a way to return the OrderId from the sub-select. Thanks! Note: DB2 Version 8 does not support the SQL "TOP" function.
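
    One pattern that sidesteps FETCH FIRST in the subquery is to correlate on MAX(OrderTime) per customer. The sketch below is only one possible shape (not from the original thread); it keeps the post's table and column names and its #parameter# placeholder style, and the #customerIdList# placeholder is an assumption:

      SELECT o.*
      FROM Order o                                   -- table name as in the post; ORDER is a reserved word and may need quoting
      WHERE o.PartID = #requestedPartId#
        AND o.CustomerID IN (#customerIdList#)       -- hypothetical placeholder for the list of customer IDs
        AND o.OrderTime = (SELECT MAX(o2.OrderTime)  -- most recent order for this customer/part pair
                           FROM Order o2
                           WHERE o2.PartID = o.PartID
                             AND o2.CustomerID = o.CustomerID)

    If two orders share the same latest OrderTime for a customer, this returns both of them.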

    Read the article

  • How to predict result set row count?

    - by Saurabh Kumar
    I have an application where I build a big SQL query dynamically for SQL Server 2008. The query is based on various search criteria the user might give, such as last name, first name, SSN, etc. The requirement is that if the user's criteria would cause the query to return a lot of rows (the maximum N rows is configurable), the application must instead send the user a message saying they need to refine their search, because the existing query would return too many rows. I would not want to bring back, say, 5000 rows to the client and then discard that data just to show the user an error. What is an efficient way to tackle this issue?
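
    One inexpensive way to enforce such a cap without pulling thousands of rows to the client is to probe with a TOP (N+1) count before running the real query. This is only a sketch under assumed names: the table dbo.Persons, its columns, and the @MaxRows/@LastName values are hypothetical stand-ins for the dynamically built criteria (SQL Server 2008 syntax):

      DECLARE @MaxRows  int = 500;              -- configurable cap (hypothetical value)
      DECLARE @LastName varchar(50) = 'Smith';  -- stands in for the user's criteria
      DECLARE @Hits     int;

      -- Probe: count at most @MaxRows + 1 rows so the scan can stop early
      SELECT @Hits = COUNT(*)
      FROM (SELECT TOP (@MaxRows + 1) PersonId
            FROM dbo.Persons
            WHERE LastName = @LastName) AS probe;

      IF @Hits > @MaxRows
          RAISERROR('Too many matches; please refine the search criteria.', 16, 1);
      ELSE
          SELECT PersonId, FirstName, LastName  -- the real query, same WHERE clause
          FROM dbo.Persons
          WHERE LastName = @LastName;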

    Read the article

  • Check If Stored Procedure Returns Value

    - by Eric
    Hello all, I am using Linq 2 Sql in VS 2010, and I have the following stored procedure to check a username and password ALTER PROCEDURE dbo.CheckUser ( @username varchar(50), @password varchar(50) ) AS SELECT * FROM Users Where UserName=@username AND Password=@password The problem I'm having is that it throws an exception if the username and password are incorrect. I'd like to perform a check to see if there is a return value, rather than using try/catch to determine whether the procedure returned a value. Should I do this check in code (C#)? Or is there a way to do it in SQL? Thanks.
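
    One way to do the check on the SQL side, sketched by reusing the procedure from the post, is to return a flag rather than relying on whether any row comes back:

      ALTER PROCEDURE dbo.CheckUser
      (
          @username varchar(50),
          @password varchar(50)
      )
      AS
          -- Return a 1/0 flag instead of a row set that may be empty
          IF EXISTS (SELECT 1 FROM Users
                     WHERE UserName = @username AND Password = @password)
              SELECT CAST(1 AS bit) AS IsValidUser;
          ELSE
              SELECT CAST(0 AS bit) AS IsValidUser;

    The calling code can then test the returned flag directly instead of wrapping the call in a try/catch.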

    Read the article

  • Issue with selecting 1 or 0 in a MySQL DB

    - by jason
    I am having trouble with the following SQL statement. I have had this issue before, but I can't remember how I fixed the problem; I am guessing the issue is that MySQL sees 0 as NULL? Note that I didn't show the first part of the SELECT statement as it's irrelevant. My SQL works, but it is showing rows with toffline equal to 1 as well as 0... FROM table1 AS tb1 LEFT JOIN table2 AS tb2 ON tb2.id = tb1.id2 LEFT JOIN table3 AS tb3 ON tb3.id = tb1.id3 AND tb1.toffline = 0 I have also tried AND NOT tb1.toffline = 1, and also WHERE tb1.toffline = 0, all with the same result...
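
    If toffline can be NULL (or is stored as a string), a plain equality test in the join condition will not restrict the tb1 rows at all, and NULL values will slip past it. One way to normalise the comparison, sketched against the fragment above (an assumption, not a confirmed fix for this schema):

      FROM table1 AS tb1
      LEFT JOIN table2 AS tb2 ON tb2.id = tb1.id2
      LEFT JOIN table3 AS tb3 ON tb3.id = tb1.id3
      WHERE COALESCE(tb1.toffline, 0) = 0   -- treat NULL as 0, and filter tb1 in WHERE rather than in a JOIN condition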

    Read the article

  • Using a database with C#

    - by Mike
    I have been trying to do something that I think should be really easy, but I have never used C# before and am having trouble with the details. I simply want to use a SQL database with Visual C# Express 2008. For testing purposes I have a datagrid on my form that can reflect changes to the DB. If I use this: codesTableAdapter.Fill(dataSet1.codes); the datagrid (dataset) fills with the info from the SQL database. If I then do something like this: codesTableAdapter.InsertQuery(txtCode.Text, txtName.Text); codesTableAdapter.Fill(dataSet1.codes); codesTableAdapter.Update(dataSet1); dataSet1.AcceptChanges(); the datagrid reflects the changes, but if I close the program and go to the database the changes are not there. When I open the program again the changes are not there either. I have a feeling this isn't too clear as my understanding here is very low, so please let me know what other info is needed. Thanks

    Read the article

  • Multiple AND conditions fail in query

    - by N e w B e e
    Here is my query: $sql = 'SELECT * FROM Orders INNER JOIN [Order Details] ON Orders.OrderNumber = [Order Details].OrderNumber WHERE Orders.CartID =2 AND [Order Details].Option10 Is Null AND [Order Details].Status="Shipped"'; This query, when entered in the MS Access SQL view, returns the correct results, but when I copy and paste the same query into my PHP script, it fails and gives the error "Too few parameters, expected 1"... although the data is there and the query works in Access... Please note that if I omit one AND condition, it works; e.g. if I remove the Shipped condition or the Is Null condition, it works then too. Any hint? What's wrong with it? Any help? Thanks
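
    One thing worth ruling out when a query runs in the Access SQL view but fails over ODBC with "Too few parameters" is the double-quoted literal, which the driver can read as an unknown identifier (and therefore a parameter). A sketch of the same statement with a single-quoted literal, everything else unchanged:

      SELECT *
      FROM Orders INNER JOIN [Order Details]
           ON Orders.OrderNumber = [Order Details].OrderNumber
      WHERE Orders.CartID = 2
        AND [Order Details].Option10 Is Null
        AND [Order Details].Status = 'Shipped'

    (In the PHP source the surrounding string would then need double quotes, or escaped single quotes, to hold this literal.)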

    Read the article

  • Answers to “What source control system do you use?” (and some winners)

    - by jamiet
    About a month ago I posed a question here on my blog SQL Server devs–what source control system do you use, if any? (answer and maybe win free stuff) in which I asked SQL Server developers to answer the following questions: Are you putting your SQL Server code into a source control system? If so, what source control server software (e.g. TFS, Git, SVN, Mercurial, SourceSafe, Perforce) are you using? What source control client software are you using (e.g. TFS Team Explorer, Tortoise, Red Gate SQL Source Control, Red Gate SQL Connect, Git Bash, etc…)? Why did you make those particular software choices? Any interesting anecdotes to share in regard to your use of source control and SQL Server? I had some really great responses (I highly recommend going and reading them). I promised that the five best, most thought-provoking, responses (as determined by me) would win one of five pairs of licenses for Red Gate SQL Source Control and Red Gate SQL Connect; here are the five that I chose (note that if you responded but did not leave a means of getting in touch then you weren’t considered for one of the prizes – sorry): In general, I don't think the management overhead and licensing cost associated with TFS is worthwhile if all you're doing is using source control. To get value from TFS, at a minimum you need to be using team build, and possibly other stuff as well, such as the sharepoint integration. If that's all you need, then svn with Tortoise would be my first choice. If you want to add build automation later, you can do this with cruisecontrol (is it still called that?), JetBrains, etc. For a long time I thought that Redgate's claims about "bridging the SSMS-VS divide" were a load of hot air, since in my experience anyone who knew what they were doing was using Visual Studio, in particular SSDT and its predecessors. However, on a recent client I was putting in source control for the first time, and I discovered that the "divide" really does exist. That client has ended up using svn with Redgate SQL Source Control, with no build automation, but with scope to add it in the future. Gavin Campbell I think putting the DB under source control is a great idea.  I have issues with the earlier versions of SQL Source Control in that it provides little help in versioning the DB. I think the latest version merges SQL Compare and SQL Source Control together.  Which is how it should have been all along. Sure I have the DB scripts in SVN, but I can't automate DB builds and changes without more tools.  Frankly I'm surprised databases don't have some sort of versioning built into them. Nick Portelli Source control has been immensely useful and saved me from a lot of rework on more than one occasion.  I have learned that you have to be extremely careful checking in data.  Our system is internal only so during the system production run once a week, if there is a problem that I can fix easily(for example, a control table points to a file in the wrong environment), I'll do it directly in production so the run can continue as soon as possible since we have a specified time window.  We do full test runs to minimize this but it has come up once or twice.  We use Red-Gate source control to "push" from the test environment to the production environment.  There have been a couple of occasions where the test environment with the wrong setting was pushed back over the production environment because the change was made only in production.  Gotta keep an eye on that. Alan Dykes Goodness is it manual.  
And can be extremely painful at times.  Not only are we running thin, we are constrained on the tools we can get ($$ must mean free).  Certainly no excuse, and a great opportunity to improve my skills by learning new things.  But...  Getting buy-in on a proven process or methodology is hard, takes time, and diverts us from development.  If SQL Source Control is easy to use and proven, oh boy could you get some serious fans around here!  Seriously though, as the "accidental dba" of this shop any new ideas / easy-to-implement tools can make a world of difference in productivity and most importantly accuracy.  Manual = bad. :) John Hennesey (who left his email address) The one thing I would love to know more about is the unique challenges of working with databases as source code - you can store scripts, but are they written as deployment scripts with all the logic about how to apply them to an existing DB? Where is that baseline DB? Where's the data? How does a team share the data and the code? It's a real challenge. Merrill Aldrich Congratulations to the five of you. Red Gate will be in touch with you soon about your free licenses. Thank you to all those that responded. And again, go and check out all the responses – those above are only a small proportion of what is a very interesting comment thread. @Jamiet

    Read the article

  • How do I DRY up my CouchDB views?

    - by James A. Rosen
    What can I do to share code among views in CouchDB? Example 1 -- utility methods Jesse Hallett has some good utility methods, including function dot(attr) { return function(obj) { return obj[attr]; } } Array.prototype.map = function(func) { var i, r = []; for (i = 0; i < this.length; i += 1) { r[i] = func(this[i]); } return r; }; ... Where can I put this code so every view can access it? Example 2 -- constants Similarly for constants I use in my application. Where do I put MyApp = { A_CONSTANT: "...", ANOTHER_CONSTANT: "..." }; Example 3 -- filter of a filter: What if I want one view that filters by "is this a rich person?": function(doc) { if (doc.type == 'person' && doc.net_worth > 1000000) { emit(doc.id, doc); } } and another that indexes by last name: function(doc) { if (doc.last_name) { emit(doc.last_name, doc); } } How can I combine them into a "rich people by last name" view? I sort of want the equivalent of the Ruby my_array.select { |x| x.person? }.select { |x| x.net_worth > 1,000,000 }.map { |x| [x.last_name, x] } How can I be DRYer?

    Read the article

  • Combining XmlWriter objects?

    - by Kevin
    The way my application is structured, each component generates output as XML and returns an XmlWriter object. Before rendering the final output to the page, I combine all XML and perform an XSL transformation on that object. Below, is a simplified code sample of the application structure. Does it make sense to combine XmlWriter objects like this? Is there a better way to structure my application? The optimal solution would be one where I didn't have to pass a single XmlWriter instance as a parameter to each component. function page1Xml() { $content = new XmlWriter(); $content->openMemory(); $content->startElement('content'); $content->text('Sample content'); $content->endElement(); return $content; } function generateSiteMap() { $sitemap = new XmlWriter(); $sitemap->openMemory(); $sitemap->startElement('sitemap'); $sitemap->startElement('page'); $sitemap->writeAttribute('href', 'page1.php'); $sitemap->text('Page 1'); $sitemap->endElement(); $sitemap->endElement(); return $sitemap; } function output($content) { $doc = new XmlWriter(); $doc->openMemory(); $doc->writePi('xml-stylesheet', 'type="text/xsl" href="template.xsl"'); $doc->startElement('document'); $doc->writeRaw( generateSiteMap()->outputMemory() ); $doc->writeRaw( $content->outputMemory() ); $doc->endElement(); $doc->endDocument(); $output = xslTransform($doc); return $output; } $content = page1Xml(); echo output($content); Update: I may abandon XmlWriter altogether and use DomDocument instead. It is more flexible and it also seemed to perform better (at least on the crude tests I created).

    Read the article

  • A training world nugget for being taught by the best

    - by Testas
    June represents an exciting time for the SQL Server community, with events all over the country in the next few months and plenty of knowledge to be gained from willing speakers enthusiastically sharing what they know. Furthermore, Paul Randal and Kimberly Tripp will be conducting their highly recommended immersion events at London Heathrow in June. There are other big names within SQL Server that will be teaching this year. The company I used to work for, QA, has excellent trainers teaching SQL Server whom I would always recommend. Occasionally a big-name speaker will teach a course, unbeknownst to the community. Solid Quality Mentors is such a company, whose staff will teach at QA offices from time to time. And I know from conversation with Itzik Ben-Gan that he will be teaching Advanced T-SQL within QA offices in London during the week of Oct 3-7. A link to the course details can be found here: http://www.qa.com/training-courses/technical-it-training/microsoft/microsoft-sql-server/microsoft-sql-server-2008-and-r2/advanced-t-sql-querying,-programming-and-tuning-for-sql-server-2005--2008 So if you want to be taught by the best experts, consider checking www.QA.com for their advanced SQL courses; you could find yourself being taught by the best in the business in their field. Chris

    Read the article

  • SSRS Export to Excel not working through VPN (Juniper SA4000)

    - by Veynom
    We have a SharePoint (MOSS 2007 on Win2003 R2) with SSRS reports (from SQL 2005) embedded in it. When we connect to the SharePoint portal through our VPN (firewall is Juniper SA4000) and using Internet Explorer (6, 7, and 8) and try to export any SSRS report under Excel, we get an error message: Internet Explorer cannot download . Internet Explorer was not able to open the internet site. The requested site is either unavailable or cannot be found. Please try again later. When not using the VPN (LAN from the office), everything (exporting under Excel) works fine. When using Firefox through the VPN, it works fine. When exporting to any other format (pdf or text or whatever), everything is fine under both IE and FF. Our firewall people suspect something in SSRS/MOSS/Office. Our MOSS consultants suspect something in the firewall Juniper SA4000. When using Fiddler and when not connected through VPN, I see the following traffic once i click on the "Export button": (Response was a request for client credentials) GET /ReportServer/Reserved.ReportViewerWebControl.axd?ExecutionID=j1pqbvbqkb34qf45fhlgnx55&ControlID=733607a7d607476abb1e6b8794202158&Culture=127&UICulture=9&ReportStack=1&OpType=Export&FileName=Product+Application+Report&ContentDisposition=OnlyHtmlInline&Format=EXCEL HTTP/1.1 Accept: */* Accept-Language: en-US,fr-be;q=0.5 User-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; GTB5; .NET CLR 2.0.50727; .NET CLR 1.1.4322; InfoPath.2; .NET CLR 3.0.04506.648; .NET CLR 3.5.21022; MS-RTC LM 8; OfficeLiveConnector.1.3; OfficeLivePatch.0.0; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729) Accept-Encoding: gzip, deflate Connection: Keep-Alive Host: r1frchcurdb01.r1.group.corp HTTP/1.1 401 Unauthorized Content-Length: 1656 Content-Type: text/html Server: Microsoft-IIS/6.0 WWW-Authenticate: Negotiate WWW-Authenticate: NTLM X-Powered-By: ASP.NET Date: Mon, 08 Jun 2009 09:25:21 GMT Proxy-Support: Session-Based-Authentication then (Generic Response successful): GET /ReportServer/Reserved.ReportViewerWebControl.axd?ExecutionID=j1pqbvbqkb34qf45fhlgnx55&ControlID=733607a7d607476abb1e6b8794202158&Culture=127&UICulture=9&ReportStack=1&OpType=Export&FileName=Product+Application+Report&ContentDisposition=OnlyHtmlInline&Format=EXCEL HTTP/1.1 Accept: */* Accept-Language: en-US,fr-be;q=0.5 User-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; GTB5; .NET CLR 2.0.50727; .NET CLR 1.1.4322; InfoPath.2; .NET CLR 3.0.04506.648; .NET CLR 3.5.21022; MS-RTC LM 8; OfficeLiveConnector.1.3; OfficeLivePatch.0.0; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729) Accept-Encoding: gzip, deflate Connection: Keep-Alive Host: r1frchcurdb01.r1.group.corp Authorization: Negotiate 
YIIIewYGKwYBBQUCoIIIbzCCCGugJDAiBgkqhkiC9xIBAgIGCSqGSIb3EgECAgYKKwYBBAGCNwICCqKCCEEEggg9YIIIOQYJKoZIhvcSAQICAQBugggoMIIIJKADAgEFoQMCAQ6iBwMFACAAAACjggdNYYIHSTCCB0WgAwIBBaEPGw1SMS5HUk9VUC5DT1JQoi4wLKADAgECoSUwIxsESFRUUBsbcjFmcmNoY3VyZGIwMS5yMS5ncm91cC5jb3Jwo4IG+zCCBvegAwIBF6EDAgEKooIG6QSCBuUaO7Wi3upEDz+lNA+GLwydOvR57pEzpN0hzKoBe4T6YCx8g05XW4A2cD3P29EuWz2fZiPeHFs8HYYCm7YLkXoKMayzZzgIG91s8Y/H9CZmfTH06rmJfWLyg47AvJ1gkgCLnOxqy1fHuTCeuUpDIkx2eG/bbMM1yYcgR94m+R59qeNwaNFxQJwFt8xpGFeZbfPSPK6mFRDKwbVuyW+c5BmL+SfGfPY0LBPo6wVLihS2vOgNLH/UVyoJZm5XZWTG4ou++o7AdCEoacxdKvXWWMMKweokjFVvrsffDKhu6ecgRxSpvgwNT35Wy6pvrw8hXD0QwhkJ0dB6CAfrpBtvhLThnjDZJdnP+Y+0IMAcDfPpNBteZUGCcJAdmcq/Fi+eSz6Y9T9xKJoTxdT1VssTDAlP0aHsXNmnwx7OcWo5Zgyh0Cucjd7G5LCQeodj0H65dMipYOurtCnL1K8DYwQBnxD8AtkSrPfL1LM1lWTiUCjMtNmX7oIU5mFzQzr3DqbB3TPN4tp+1xulSHuSjqGaiTmbrwaV6m0dtpGlmOO148Gzgclf5k+eZhdITje6mTZyFkunr2spXnEv7uzjghx73YKRGK/nXXQdfEWIrmUyDbc5qGxYBnXfhtg2xNdSIwTgZJTEIm7Mqnat3mpPtQiw8PUPUNfByPSlFMpvZcmy6tJoAnUyFrJfJ9NgXenVcj+YZOAo/uhFcpXgNHV0o2hi8j4vJ982HgRSB1zw+yEPQUp1P61lVIsl/M893lY2sgwFk+DCGmNwbumAKn5gDfq702yy1Drb9Vk6OwlhPXp0c98BFDvZj7sWxHE6e30f1RMOVTN1btngiWr+xO3Dnt+GpKPeKe3ISNcKyzU0zT+kaqx59IPOCRYFB/ZWBmW5+VIV3IxtdRf/rlc7b269JyUZx+xN8n5Stg/fq/XrAbKgXUlnwXGmUGh0deqzqciZc4uFFzrgA6Ba2ZG/3OIcAWPZck37YvIqiUzeustijlGr969ahw2ugHksGuKcihzCpdKLoEciphgpf+kGkXQfSnIuQCqD5N8Vx/uNs7IY87geu3+hwk1GpxUHmikb8/v6qiNL2W5LMIVrSXxuuT1Czoomqxrcic/F9PbBzk62pItS/fhWj3rj23GqwQOCzsQ+UeGTxf2JGtNZsS5nbfv9/7ZYxH0fqUdjHfJpZHd4f5Mk7L+5taNmsyY0BKU0QxIWJQZkLihpsXDInWQ67IUN5nsvvo7Ix1Rh1mZ3Wf/Z9a91O3QhUt3g1AsuzpZtWC+2oFs887KAwkGKr0ex/2pWZtzlCDCULk28f64XYcd+HYQ4Tv1l48okH43jXhSNZnLTTLM4zHcCBpBuBoiVuuKY+haHwuf50C0esn/3EdSf32P/Jvuc/Y12PlqhG8j8fPOmSfTAeT2Rh0aLGLlyOfOTdj/+Uy2RhYwck4mUr1BMJVyLCGCTsMxvxrmQj+bSQbzStj8YrDsQM6/vw3fMs+ZX3NKsADwmRL3PhNQXR1+HVD+oWC2XWITTMVO+SJjl937mvVMjNYsC1sdDUzVlyCLSJxNaa+xLEKymM33tp+4GixDar5QTaTntcFqGUZ/YSvrcBFjQOgXBS62OqIFxKWy6csu3K7eIoprDfhNS/la+4gC871TaRw6rwQNMLwsVjYDgq+e8MXACKXr7jEZyojpT5aOvd0QQte8ThCv82JYTrJvnBIiXWQSNt0MH0ynP7h0uG1eip0RH2VF5nQ7SVNmyVGDXJu/VpVAY2lgfFNIzJS9SxPXGHOn7EZ7cEcOoF7XiVV2baNwqloH3GntBTBFYZ+9N6mH6lRSH8HbOlzCXCQjZEV+qvBwzKVWuC5WbKR26KkjfQ3Ntsjo9cGV5I51Mhve7j7Hkttf1otbOzrj8+ErpFJd9/5UEC4qo6KmQ0Tar3rsfcvbBr18uzxEjV8LY+ZM6WTn5H/6hb/2eFwnVieWOhi1l+O0t+vGyUJpPwwPQQ6hzbxKIWPGjLK4lQnTbNRdQFvpmfIOKAMFui3w1SgxsWNxmMJJeGViJKBP592JzAbu3vFGPdAfnFEOb3ffTOid/hrTu/hAZRSjK+YQEKrju6sohPkHRoRyI+tlyYEWTyQ7oe3fR64J2lQJ8bCt8xtRVYIX8NY2gPvRCLTHbB3jSrK+Fnn7zqASM5yYC+JSiezdqEMkVsgc5HHH1mQxsMS8EgR6mj42QErDtKSKuGMExCPRW/8jvMK0eGepfA2U/WOiv2x2AgamQw1sI8sZtsBdD4jnP5WNkZSJ0i0SuybiCUiArPJgXjJTL+zgEX6kGOMrCFCtArunu6pDaXPQvAKngLFXc4CGdpMnDaVoeCyVO9J32tNfEauxEpIG9MIG6oAMCAReigbIEga8OSk5Xji0SAuy9NwxX1G4Avd5wZWj/OgZZ2VnSIiB7GpMJ4Dsnr/6jfLGj1V9V1zpHNoZG18ipW5R+OlF2rm7T4QKaZWsgwBvuyS5xtn9FGJqGYEmQLSqXbXsLfdT6xqPhB3poBbBtzPGCX/ZgW38iTyJOcbT6QW7TAj340+5EiZvy61jLFU1HtJ3wv41Hv6puSWCLotoS85SPerGz+SVYOZUbmKcx2lTXUTPinPhN HTTP/1.1 200 OK Date: Mon, 08 Jun 2009 09:25:21 GMT Server: Microsoft-IIS/6.0 X-Powered-By: ASP.NET WWW-Authenticate: Negotiate oYGgMIGdoAMKAQChCwYJKoZIgvcSAQICooGIBIGFYIGCBgkqhkiG9xIBAgICAG9zMHGgAwIBBaEDAgEPomUwY6ADAgEXolwEWm70xlMp4oj/PyvriNMeNDigow6/MX2DpaYQdBfGkiF0Dcc323tHLRBxBL03QpvwdGBxZGAJI6V1G8sc/lVBzhlCNsZkbJcNfnMNgOgc7UPrz+ZVav/EVm3sDQ== X-AspNet-Version: 2.0.50727 Content-Disposition: attachment; filename="Product Application Report.xls" Cache-Control: private Expires: Mon, 08 Jun 2009 09:24:21 GMT Content-Type: application/vnd.ms-excel Content-Length: 23012 When using the VPN, I see no traffic in Fiddler and the error message is 
displayed before anything else. Update 17/06/2009: I could get a hand on some logs from our SA4000. Maybe this could help more. Info PTR23232 2009/06/15 17:22:38 - <SA4000> - [<SA4000 IP>] - <user>[SA4000 group names] - Start Policy [WEBURL/PROTOCOL] evaluation for resource http://<DB server>:80/ReportServer/Reserved.ReportViewerWebControl.axd?ExecutionID=rua1g355tic24245f2e13lim&ControlID=44168efcd36e461493f7a69962580b91&Culture=127&UICulture=9&ReportStack=1&OpType=Export&FileName=Product+Application+Report&ContentDisposition=OnlyHtmlInline&Format=EXCEL Info PTR23233 2009/06/15 17:22:38 - <SA4000> - [<SA4000 IP>] - <user>[SA4000 group names] - Applying Policy [Enable HTTP 1.1]... Info PTR23240 2009/06/15 17:22:38 - <SA4000> - [<SA4000 IP>] - <user>[SA4000 group names] - Resource filter [http://nsrvnts2:80/*] does not match Info PTR23240 2009/06/15 17:22:38 - <SA4000> - [<SA4000 IP>] - <user>[SA4000 group names] - Resource filter [http://nsrvnts3:80/*] does not match Info PTR23233 2009/06/15 17:22:38 - <SA4000> - [<SA4000 IP>] - <user>[SA4000 group names] - Applying Policy [Disable HTTP 1.1]... Info PTR23239 2009/06/15 17:22:38 - <SA4000> - [<SA4000 IP>] - <user>[SA4000 group names] - Action [HTTP 1.0] is returned Info PTR23234 2009/06/15 17:22:38 - <SA4000> - [<SA4000 IP>] - <user>[SA4000 group names] - Policy [Disable HTTP 1.1] applies to resource Info PTR23308 2009/06/15 17:22:38 - <SA4000> - [<SA4000 IP>] - <user>[SA4000 group names] - Skip Policy [WEBURL/COMPRESSION] evaluation because Compression option is not enabled Info PTR23232 2009/06/15 17:22:38 - <SA4000> - [<SA4000 IP>] - <user>[SA4000 group names] - Start Policy [WEBURL/WEBPDSID] evaluation for resource http://<DB server>:80/ReportServer/Reserved.ReportViewerWebControl.axd?ExecutionID=rua1g355tic24245f2e13lim&ControlID=44168efcd36e461493f7a69962580b91&Culture=127&UICulture=9&ReportStack=1&OpType=Export&FileName=Product+Application+Report&ContentDisposition=OnlyHtmlInline&Format=EXCEL Info PTR23233 2009/06/15 17:22:38 - <SA4000> - [<SA4000 IP>] - <user>[SA4000 group names] - Applying Policy [Corporate BI Portal]... Info PTR23240 2009/06/15 17:22:38 - <SA4000> - [<SA4000 IP>] - <user>[SA4000 group names] - Resource filter [http://<SharePoint>:80/*] does not match Info PTR23240 2009/06/15 17:22:38 - <SA4000> - [<SA4000 IP>] - <user>[SA4000 group names] - Resource filter [http://<SharePoint>/*] does not match Info PTR23235 2009/06/15 17:22:38 - <SA4000> - [<SA4000 IP>] - <user>[SA4000 group names] - No Policy applies to resource Any tip welcome. :)

    Read the article

  • Getting SQL table row counts via sysindexes vs. sys.indexes

    - by Bill Osuch
    Among the many useful SQL snippets I regularly use is this little bit that will return row counts in a table: SELECT so.name as TableName, MAX(si.rows) as [RowCount] FROM sysobjects so JOIN sysindexes si ON si.id = OBJECT_ID(so.name) WHERE so.xtype = 'U' GROUP BY so.name ORDER BY [RowCount] DESC This is handy to find tables that have grown wildly, zero-row tables that could (possibly) be dropped, or other clues into the data. Right off the bat you may spot some "non-ideal" code - I'm using sysobjects rather than sys.objects. What's the difference? In SQL Server 2005 and later, sysobjects is no longer a table, but a "compatibility view", meant for backward compatibility only. SELECT * from each and you'll see the different data that each returns. Microsoft advises that sysindexes could be removed in a future version of SQL Server, but this has never really been an issue for me since my company is still using SQL 2000. However, there are murmurs that we may actually migrate to 2008 some year, so I might as well go ahead and start using an updated version of this snippet on the servers that can handle it: SELECT so.name as TableName, ddps.row_count as [RowCount] FROM sys.objects so JOIN sys.indexes si ON si.OBJECT_ID = so.OBJECT_ID JOIN sys.dm_db_partition_stats AS ddps ON si.OBJECT_ID = ddps.OBJECT_ID  AND si.index_id = ddps.index_id WHERE si.index_id < 2  AND so.is_ms_shipped = 0 ORDER BY ddps.row_count DESC

    Read the article

  • The emergence of Atlassian's Bamboo (and a free SQL Source Control license offer!)

    - by David Atkinson
    The rise in demand for database continuous integration has forced me to skill-up in various new tools and technologies, particularly build servers. We have been using JetBrains' TeamCity here at Red Gate for a couple of years now, having replaced the ageing CruiseControl.NET, so it was a natural choice for us to use this for our database CI demos. Most of our early adopter customers have also transitioned away from CruiseControl, the majority to TeamCity and Microsoft's TeamBuild. However, more recently, for reasons we've yet to fully comprehend, we've observed a significant surge in the number of evaluators for Atlassian's Bamboo. I installed this a couple of weeks back to satisfy myself that it works seamlessly with Red Gate tools. As you would expect, Bamboo's UI has the same clean feel found in any Atlassian tool (we use JIRA extensively here at Red Gate). In the coming weeks I will post a short step-by-step guide to setting up SQL Server continuous integration using the Red Gate command lines. To help us further optimize the integration between these tools I'd be very keen to hear from any Bamboo users who also use Red Gate tools and who might be willing to participate in usability tests and other similar research in exchange for Amazon vouchers. If you are interested in helping out, please contact me at David dot Atkinson at red-gate.com. I recently spoke with Sarah, the product marketing manager for Bamboo, and we ended up having a detailed conversation about database CI, which has been meticulously documented in the form of a blog post on Atlassian's website: http://blogs.atlassian.com/2012/05/database-continuous-integration-redgate/ We've also managed to persuade Red Gate marketing to provide a great free-tool offer: a free SQL Source Control or SQL Connect license to Atlassian users, provided it is claimed before the end of June! Full details are at the bottom of the post. Technorati Tags: sql server

    Read the article

  • Why is installing lame failing?

    - by Rahul Mehta
    I want to install ffmpeg with mp3lame enabled for this m following this tutorial , http://ubuntuforums.org/showpost.php?p=9868359&postcount=1289 and in step 2 error is libfaac is not found ? and in step 5 installing lame is giving this error , why it is getting failed , please advised what to do ? reach121@youngib:~/lame-3.98.4$ sudo checkinstall --pkgname=lame-ffmpeg --pkgversion="3.98.4" --backup=no --default --deldoc=yes checkinstall 1.6.2, Copyright 2009 Felipe Eduardo Sanchez Diaz Duran This software is released under the GNU GPL. ***************************************** **** Debian package creation selected *** ***************************************** This package will be built according to these values: 0 - Maintainer: [ root@youngib ] 1 - Summary: [ Package created with checkinstall 1.6.2 ] 2 - Name: [ lame-ffmpeg ] 3 - Version: [ 3.98.4 ] 4 - Release: [ 1 ] 5 - License: [ GPL ] 6 - Group: [ checkinstall ] 7 - Architecture: [ amd64 ] 8 - Source location: [ lame-3.98.4 ] 9 - Alternate source location: [ ] 10 - Requires: [ ] 11 - Provides: [ lame-ffmpeg ] 12 - Conflicts: [ ] 13 - Replaces: [ ] Enter a number to change any of them or press ENTER to continue: Installing with make install... ========================= Installation results =========================== Making install in mpglib make[1]: Entering directory `/home/reach121/lame-3.98.4/mpglib' make[2]: Entering directory `/home/reach121/lame-3.98.4/mpglib' make[2]: Nothing to be done for `install-exec-am'. make[2]: Nothing to be done for `install-data-am'. make[2]: Leaving directory `/home/reach121/lame-3.98.4/mpglib' make[1]: Leaving directory `/home/reach121/lame-3.98.4/mpglib' Making install in libmp3lame make[1]: Entering directory `/home/reach121/lame-3.98.4/libmp3lame' Making install in i386 make[2]: Entering directory `/home/reach121/lame-3.98.4/libmp3lame/i386' make[3]: Entering directory `/home/reach121/lame-3.98.4/libmp3lame/i386' make[3]: Nothing to be done for `install-exec-am'. make[3]: Nothing to be done for `install-data-am'. make[3]: Leaving directory `/home/reach121/lame-3.98.4/libmp3lame/i386' make[2]: Leaving directory `/home/reach121/lame-3.98.4/libmp3lame/i386' Making install in vector make[2]: Entering directory `/home/reach121/lame-3.98.4/libmp3lame/vector' make[3]: Entering directory `/home/reach121/lame-3.98.4/libmp3lame/vector' make[3]: Nothing to be done for `install-exec-am'. make[3]: Nothing to be done for `install-data-am'. 
make[3]: Leaving directory `/home/reach121/lame-3.98.4/libmp3lame/vector' make[2]: Leaving directory `/home/reach121/lame-3.98.4/libmp3lame/vector' make[2]: Entering directory `/home/reach121/lame-3.98.4/libmp3lame' make[3]: Entering directory `/home/reach121/lame-3.98.4/libmp3lame' test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib" /bin/bash ../libtool --mode=install /usr/bin/install -c 'libmp3lame.la' '/usr/local/lib/libmp3lame.la' /usr/bin/install -c .libs/libmp3lame.lai /usr/local/lib/libmp3lame.la /usr/bin/install -c .libs/libmp3lame.a /usr/local/lib/libmp3lame.a chmod 644 /usr/local/lib/libmp3lame.a ranlib /usr/local/lib/libmp3lame.a PATH="$PATH:/sbin" ldconfig -n /usr/local/lib ---------------------------------------------------------------------- Libraries have been installed in: /usr/local/lib If you ever happen to want to link against installed libraries in a given directory, LIBDIR, you must either use libtool, and specify the full pathname of the library, or use the `-LLIBDIR' flag during linking and do at least one of the following: - add LIBDIR to the `LD_LIBRARY_PATH' environment variable during execution - add LIBDIR to the `LD_RUN_PATH' environment variable during linking - use the `-Wl,--rpath -Wl,LIBDIR' linker flag - have your system administrator add LIBDIR to `/etc/ld.so.conf' See any operating system documentation about shared libraries for more information, such as the ld(1) and ld.so(8) manual pages. ---------------------------------------------------------------------- make[3]: Nothing to be done for `install-data-am'. make[3]: Leaving directory `/home/reach121/lame-3.98.4/libmp3lame' make[2]: Leaving directory `/home/reach121/lame-3.98.4/libmp3lame' make[1]: Leaving directory `/home/reach121/lame-3.98.4/libmp3lame' Making install in frontend make[1]: Entering directory `/home/reach121/lame-3.98.4/frontend' make[2]: Entering directory `/home/reach121/lame-3.98.4/frontend' test -z "/usr/local/bin" || /bin/mkdir -p "/usr/local/bin" /bin/bash ../libtool --mode=install /usr/bin/install -c 'lame' '/usr/local/bin/lame' /usr/bin/install -c lame /usr/local/bin/lame make[2]: Nothing to be done for `install-data-am'. make[2]: Leaving directory `/home/reach121/lame-3.98.4/frontend' make[1]: Leaving directory `/home/reach121/lame-3.98.4/frontend' Making install in Dll make[1]: Entering directory `/home/reach121/lame-3.98.4/Dll' make[2]: Entering directory `/home/reach121/lame-3.98.4/Dll' make[2]: Nothing to be done for `install-exec-am'. make[2]: Nothing to be done for `install-data-am'. make[2]: Leaving directory `/home/reach121/lame-3.98.4/Dll' make[1]: Leaving directory `/home/reach121/lame-3.98.4/Dll' Making install in debian make[1]: Entering directory `/home/reach121/lame-3.98.4/debian' make[2]: Entering directory `/home/reach121/lame-3.98.4/debian' make[2]: Nothing to be done for `install-exec-am'. make[2]: Nothing to be done for `install-data-am'. make[2]: Leaving directory `/home/reach121/lame-3.98.4/debian' make[1]: Leaving directory `/home/reach121/lame-3.98.4/debian' Making install in doc make[1]: Entering directory `/home/reach121/lame-3.98.4/doc' Making install in html make[2]: Entering directory `/home/reach121/lame-3.98.4/doc/html' make[3]: Entering directory `/home/reach121/lame-3.98.4/doc/html' make[3]: Nothing to be done for `install-exec-am'. 
test -z "/usr/local/share/doc/lame/html" || /bin/mkdir -p "/usr/local/share/doc/lame/html" /bin/mkdir: cannot create directory `/usr/local/share/doc': No such file or directory make[3]: *** [install-pkghtmlDATA] Error 1 make[3]: Leaving directory `/home/reach121/lame-3.98.4/doc/html' make[2]: *** [install-am] Error 2 make[2]: Leaving directory `/home/reach121/lame-3.98.4/doc/html' make[1]: *** [install-recursive] Error 1 make[1]: Leaving directory `/home/reach121/lame-3.98.4/doc' make: *** [install-recursive] Error 1 **** Installation failed. Aborting package creation. Cleaning up...OK Bye. reach121@youngib:~/lame-3.98.4$

    Read the article

  • Oracle Introduces a New Line of Defense for Databases

    - by roxana.bradescu
    Today at the 2011 RSA Conference, we announced the immediate availability of our new Oracle Database Firewall, the latest addition to a comprehensive portfolio of database security solutions. Oracle Database Firewall is a network-based software solution that monitors database traffic, and can detect and block SQL injection and other attacks from reaching Oracle and non-Oracle databases. According to the 2010 Verizon Data Breach Investigations Report, SQL injection attacks against databases are responsible for 89% of all breached data. SQL injection is a technique for controlling responses from the database server through applications, exploiting the inherent trust between the application layer and the back-end database. Previously, the only way organizations could safeguard against SQL injection attacks was a complete overhaul of their application code, obviously a very costly, complex, and often impossible undertaking for most organizations. Enter the new Oracle Database Firewall. It can help prevent SQL injection attacks by establishing a defensive perimeter around your databases. The Oracle Database Firewall uses innovative SQL grammar analysis to inspect database traffic against pre-defined policies. Normal, expected traffic is allowed to pass (and can optionally be logged to demonstrate regulatory compliance), ensuring no false positives or disruption to your business. SQL statements that are explicitly forbidden, as well as unknown SQL statements, can be allowed to pass, logged, alerted on, blocked, or substituted with pre-defined SQL statements. Being able to substitute an unknown, potentially harmful SQL statement with a harmless statement is especially powerful, since it foils an attack while allowing the application to operate normally, and it also helps prevent DoS attacks. So, if you're at RSA, stop by our booth or attend the session with Steve Moyle, Oracle Database Firewall CTO. Or if you want to learn more immediately, please watch our on-demand webcast and download the new Oracle Database Firewall Resource Kit with everything you need to get started today.
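
    As a minimal illustration of the class of statement such a firewall is meant to recognise (hypothetical table and column names, not taken from the announcement):

      -- What the application intends to send:
      SELECT * FROM users WHERE username = 'alice' AND password = 'secret';

      -- The same statement after a classic injection in the password field
      -- (input: anything' OR '1'='1 ), which now matches every row:
      SELECT * FROM users WHERE username = 'alice' AND password = 'anything' OR '1'='1';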

    Read the article

  • Why does apt-get install Skype easily while aptitude complains about MAJOR dependency errors?

    - by Prateek
    I was trying to install Skype on Ubuntu 13.04, from the Canonical repositories. With apt-get it worked easily, while aptitude had a huge problem with dependencies and proposed a complicated solution. Why is this so? Why doesn't aptitude offer whatever apt-get does as a potential solution? Here is the output of both: apt-get install skype: Reading package lists... Building dependency tree... Reading state information... The following extra packages will be installed: gcc-4.7-base:i386 libasound2 libasound2:i386 libasound2-plugins:i386 libasyncns0:i386 libaudio2:i386 libavahi-client3:i386 libavahi-common-data:i386 libavahi-common3:i386 libc6:i386 libcomerr2:i386 libcups2:i386 libdbus-1-3 libdbus-1-3:i386 libdbusmenu-qt2:i386 libdrm-intel1 libdrm-intel1:i386 libdrm-nouveau2 libdrm-nouveau2:i386 libdrm-radeon1 libdrm-radeon1:i386 libdrm2 libdrm2:i386 libexpat1:i386 libffi6:i386 libflac8:i386 libfontconfig1:i386 libfreetype6:i386 libgcc1:i386 libgcrypt11 libgcrypt11:i386 libgl1-mesa-dri libgl1-mesa-dri:i386 libgl1-mesa-glx:i386 libglapi-mesa:i386 libglib2.0-0:i386 libgnutls26 libgnutls26:i386 libgpg-error0:i386 libgssapi-krb5-2:i386 libgstreamer-plugins-base0.10-0:i386 libgstreamer0.10-0:i386 libice6:i386 libjack-jackd2-0:i386 libjbig0:i386 libjpeg-turbo8:i386 libjpeg8:i386 libjson0:i386 libk5crypto3:i386 libkeyutils1:i386 libkrb5-3:i386 libkrb5support0:i386 liblcms1:i386 libllvm3.2:i386 liblzma5:i386 libmng1:i386 libmysqlclient18:i386 libogg0:i386 liborc-0.4-0:i386 libp11-kit0:i386 libpciaccess0:i386 libpcre3:i386 libpng12-0:i386 libpulse0:i386 libqt4-dbus libqt4-dbus:i386 libqt4-declarative libqt4-declarative:i386 libqt4-designer libqt4-help libqt4-network libqt4-network:i386 libqt4-opengl libqt4-opengl:i386 libqt4-script libqt4-script:i386 libqt4-scripttools libqt4-sql libqt4-sql:i386 libqt4-sql-mysql:i386 libqt4-sql-sqlite libqt4-svg libqt4-test libqt4-xml libqt4-xml:i386 libqt4-xmlpatterns libqt4-xmlpatterns:i386 libqtcore4 libqtcore4:i386 libqtgui4 libqtgui4:i386 libqtwebkit4:i386 libsamplerate0:i386 libselinux1:i386 libsm6:i386 libsndfile1:i386 libspeexdsp1:i386 libsqlite3-0:i386 libssl1.0.0 libssl1.0.0:i386 libstdc++6:i386 libtasn1-3:i386 libtiff5 libtiff5:i386 libtxc-dxtn-s2tc0:i386 libuuid1:i386 libvorbis0a:i386 libvorbisenc2:i386 libwrap0:i386 libx11-6 libx11-6:i386 libx11-xcb1 libx11-xcb1:i386 libxau6:i386 libxcb-dri2-0 libxcb-dri2-0:i386 libxcb-glx0 libxcb-glx0:i386 libxcb1 libxcb1:i386 libxdamage1:i386 libxdmcp6:i386 libxext6 libxext6:i386 libxfixes3 libxfixes3:i386 libxi6 libxi6:i386 libxml2 libxml2:i386 libxrender1 libxrender1:i386 libxslt1.1:i386 libxss1:i386 libxt6 libxt6:i386 libxv1 libxv1:i386 libxxf86vm1 libxxf86vm1:i386 mysql-common qdbus skype-bin:i386 sni-qt:i386 zlib1g:i386 Suggested packages: nas:i386 glibc-doc:i386 locales:i386 rng-tools rng-tools:i386 libglide3 libglide3:i386 gnutls-bin gnutls-bin:i386 krb5-doc:i386 krb5-user:i386 libvisual-0.4-plugins:i386 gstreamer-codec-install:i386 gnome-codec-install:i386 gstreamer0.10-tools:i386 gstreamer0.10-plugins-base:i386 jackd2:i386 liblcms-utils:i386 pulseaudio:i386 libqt4-declarative-folderlistmodel libqt4-declarative-gestures libqt4-declarative-particles libqt4-declarative-shaders qt4-qmlviewer libqt4-declarative-folderlistmodel:i386 libqt4-declarative-gestures:i386 libqt4-declarative-particles:i386 libqt4-declarative-shaders:i386 qt4-qmlviewer:i386 libqt4-dev libqt4-dev:i386 libthai0:i386 libicu48:i386 qt4-qtconfig qt4-qtconfig:i386 Recommended packages: libtxc-dxtn0:i386 xml-core:i386 The following NEW packages 
will be installed gcc-4.7-base:i386 libasound2:i386 libasound2-plugins:i386 libasyncns0:i386 libaudio2:i386 libavahi-client3:i386 libavahi-common-data:i386 libavahi-common3:i386 libc6:i386 libcomerr2:i386 libcups2:i386 libdbus-1-3:i386 libdbusmenu-qt2:i386 libdrm-intel1:i386 libdrm-nouveau2:i386 libdrm-radeon1:i386 libdrm2:i386 libexpat1:i386 libffi6:i386 libflac8:i386 libfontconfig1:i386 libfreetype6:i386 libgcc1:i386 libgcrypt11:i386 libgl1-mesa-dri:i386 libgl1-mesa-glx:i386 libglapi-mesa:i386 libglib2.0-0:i386 libgnutls26:i386 libgpg-error0:i386 libgssapi-krb5-2:i386 libgstreamer-plugins-base0.10-0:i386 libgstreamer0.10-0:i386 libice6:i386 libjack-jackd2-0:i386 libjbig0:i386 libjpeg-turbo8:i386 libjpeg8:i386 libjson0:i386 libk5crypto3:i386 libkeyutils1:i386 libkrb5-3:i386 libkrb5support0:i386 liblcms1:i386 libllvm3.2:i386 liblzma5:i386 libmng1:i386 libmysqlclient18:i386 libogg0:i386 liborc-0.4-0:i386 libp11-kit0:i386 libpciaccess0:i386 libpcre3:i386 libpng12-0:i386 libpulse0:i386 libqt4-dbus:i386 libqt4-declarative:i386 libqt4-network:i386 libqt4-opengl:i386 libqt4-script:i386 libqt4-sql:i386 libqt4-sql-mysql:i386 libqt4-xml:i386 libqt4-xmlpatterns:i386 libqtcore4:i386 libqtgui4:i386 libqtwebkit4:i386 libsamplerate0:i386 libselinux1:i386 libsm6:i386 libsndfile1:i386 libspeexdsp1:i386 libsqlite3-0:i386 libssl1.0.0:i386 libstdc++6:i386 libtasn1-3:i386 libtiff5:i386 libtxc-dxtn-s2tc0:i386 libuuid1:i386 libvorbis0a:i386 libvorbisenc2:i386 libwrap0:i386 libx11-6:i386 libx11-xcb1:i386 libxau6:i386 libxcb-dri2-0:i386 libxcb-glx0:i386 libxcb1:i386 libxdamage1:i386 libxdmcp6:i386 libxext6:i386 libxfixes3:i386 libxi6:i386 libxml2:i386 libxrender1:i386 libxslt1.1:i386 libxss1:i386 libxt6:i386 libxv1:i386 libxxf86vm1:i386 mysql-common skype skype-bin:i386 sni-qt:i386 zlib1g:i386 The following packages will be upgraded: libasound2 libdbus-1-3 libdrm-intel1 libdrm-nouveau2 libdrm-radeon1 libdrm2 libgcrypt11 libgl1-mesa-dri libgnutls26 libqt4-dbus libqt4-declarative libqt4-designer libqt4-help libqt4-network libqt4-opengl libqt4-script libqt4-scripttools libqt4-sql libqt4-sql-sqlite libqt4-svg libqt4-test libqt4-xml libqt4-xmlpatterns libqtcore4 libqtgui4 libssl1.0.0 libtiff5 libx11-6 libx11-xcb1 libxcb-dri2-0 libxcb-glx0 libxcb1 libxext6 libxfixes3 libxi6 libxml2 libxrender1 libxt6 libxv1 libxxf86vm1 qdbus 41 upgraded, 105 newly installed, 0 to remove and 138 not upgraded. Need to get 85.9 MB/89.2 MB of archives. After this operation, 204 MB of additional disk space will be used. Do you want to continue [Y/n]? aptitude install skype: Reading package lists... Building dependency tree... Reading state information... Reading extended state information... Initialising package states... 
The following NEW packages will be installed: gcc-4.7-base:i386{a} libasound2:i386{a} libasound2-plugins:i386{a} libasyncns0:i386{a} libaudio2:i386{a} libavahi-client3:i386{a} libavahi-common-data:i386{a} libavahi-common3:i386{a} libc6:i386{a} libcomerr2:i386{a} libcups2:i386{a} libdbus-1-3:i386{a} libdbusmenu-qt2:i386{a} libdrm-intel1:i386{a} libdrm-nouveau2:i386{a} libdrm-radeon1:i386{a} libdrm2:i386{a} libexpat1:i386{a} libffi6:i386{a} libflac8:i386{a} libfontconfig1:i386{a} libfreetype6:i386{a} libgcc1:i386{a} libgcrypt11:i386{a} libgl1-mesa-dri:i386{a} libgl1-mesa-glx:i386{a} libglapi-mesa:i386{a} libglib2.0-0:i386{a} libgnutls26:i386{a} libgpg-error0:i386{a} libgssapi-krb5-2:i386{a} libgstreamer-plugins-base0.10-0:i386{a} libgstreamer0.10-0:i386{a} libice6:i386{a} libjack-jackd2-0:i386{a} libjbig0:i386{a} libjpeg-turbo8:i386{a} libjpeg8:i386{a} libjson0:i386{a} libk5crypto3:i386{a} libkeyutils1:i386{a} libkrb5-3:i386{a} libkrb5support0:i386{a} liblcms1:i386{a} libllvm3.2:i386{a} liblzma5:i386{a} libmng1:i386{a} libmysqlclient18:i386{a} libogg0:i386{a} liborc-0.4-0:i386{a} libp11-kit0:i386{a} libpciaccess0:i386{a} libpcre3:i386{a} libpng12-0:i386{a} libpulse0:i386{a} libqt4-dbus:i386{a} libqt4-declarative:i386{a} libqt4-network:i386{a} libqt4-opengl:i386{a} libqt4-script:i386{a} libqt4-sql:i386{a} libqt4-sql-mysql:i386{a} libqt4-xml:i386{a} libqt4-xmlpatterns:i386{a} libqtcore4:i386{a} libqtgui4:i386{a} libqtwebkit4:i386{a} libsamplerate0:i386{a} libselinux1:i386{a} libsm6:i386{a} libsndfile1:i386{a} libspeexdsp1:i386{a} libsqlite3-0:i386{a} libssl1.0.0:i386{a} libstdc++6:i386{a} libtasn1-3:i386{a} libtiff5:i386{a} libtxc-dxtn-s2tc0:i386{a} libuuid1:i386{a} libvorbis0a:i386{a} libvorbisenc2:i386{a} libwrap0:i386{a} libx11-6:i386{a} libx11-xcb1:i386{a} libxau6:i386{a} libxcb-dri2-0:i386{a} libxcb-glx0:i386{a} libxcb1:i386{a} libxdamage1:i386{a} libxdmcp6:i386{a} libxext6:i386{a} libxfixes3:i386{a} libxi6:i386{a} libxml2:i386{a} libxrender1:i386{a} libxslt1.1:i386{a} libxss1:i386{a} libxt6:i386{a} libxv1:i386{a} libxxf86vm1:i386{a} mysql-common{a} skype skype-bin:i386{a} sni-qt:i386{a} zlib1g:i386{a} The following packages will be upgraded: libasound2 libdbus-1-3 libdrm-intel1 libdrm-nouveau2 libdrm-radeon1 libdrm2 libgcrypt11 libgl1-mesa-dri libgnutls26 libqt4-dbus libqt4-declarative libqt4-network libqt4-opengl libqt4-script libqt4-sql libqt4-xml libqt4-xmlpatterns libqtcore4 libqtgui4 libssl1.0.0 libtiff5 libx11-6 libx11-xcb1 libxcb-dri2-0 libxcb-glx0 libxcb1 libxext6 libxfixes3 libxi6 libxml2 libxrender1 libxt6 libxv1 libxxf86vm1 qdbus 35 packages upgraded, 105 newly installed, 0 to remove and 144 not upgraded. Need to get 81.7 MB/85.0 MB of archives. After unpacking 204 MB will be used. The following packages have unmet dependencies: libqt4-test : Depends: libqtcore4 (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. libqt4-designer : Depends: libqt4-script (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. Depends: libqt4-xml (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. Depends: libqtcore4 (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. Depends: libqtgui4 (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. libqt4-sql-sqlite : Depends: libqt4-sql (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. Depends: libqtcore4 (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. 
libqt4-help : Depends: libqt4-network (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. Depends: libqt4-sql (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. Depends: libqtcore4 (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. Depends: libqtgui4 (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. libqt4-svg : Depends: libqtcore4 (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. Depends: libqtgui4 (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. libqt4-scripttools : Depends: libqt4-script (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. Depends: libqtcore4 (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. Depends: libqtgui4 (= 4:4.8.4+dfsg-0ubuntu9) but 4:4.8.4+dfsg-0ubuntu9.2 is to be installed. The following actions will resolve these dependencies: Remove the following packages: 1) account-plugin-aim 2) account-plugin-facebook 3) account-plugin-flickr 4) account-plugin-generic-oauth 5) account-plugin-google 6) account-plugin-jabber 7) account-plugin-salut 8) account-plugin-twitter 9) account-plugin-windows-live 10) account-plugin-yahoo 11) empathy 12) friends 13) friends-dispatcher 14) friends-facebook 15) friends-twitter 16) gir1.2-signon-1.0 17) gnome-control-center-signon 18) libaccount-plugin-1.0-0 19) libfriends0 20) libqt4-designer 21) libqt4-help 22) libqt4-scripttools 23) libqt4-sql-sqlite 24) libqt4-svg 25) libqt4-test 26) libsignon-glib1 27) mcp-account-manager-uoa 28) nautilus-sendto-empathy 29) python-qt4 30) shotwell 31) signon-plugin-oauth2 32) signon-plugin-password 33) signon-ui 34) signond 35) ubuntu-sso-client-qt 36) ubuntuone-control-panel-qt 37) unity-lens-friends 38) unity-lens-photos 39) unity-scope-gdrive 40) webaccounts-extension-common 41) xul-ext-webaccounts Leave the following dependencies unresolved: 42) mcp-account-manager-uoa recommends gnome-control-center-signon 43) mcp-account-manager-uoa recommends account-plugin-aim 44) mcp-account-manager-uoa recommends account-plugin-jabber 45) mcp-account-manager-uoa recommends account-plugin-google 46) mcp-account-manager-uoa recommends account-plugin-facebook 47) mcp-account-manager-uoa recommends account-plugin-windows-live 48) mcp-account-manager-uoa recommends account-plugin-yahoo 49) mcp-account-manager-uoa recommends account-plugin-salut 50) ubuntu-desktop recommends empathy 51) ubuntu-desktop recommends libqt4-sql-sqlite 52) ubuntu-desktop recommends shotwell 53) ubuntu-desktop recommends ubuntuone-control-panel-qt 54) ubuntu-desktop recommends xul-ext-webaccounts 55) unity recommends unity-lens-photos 56) unity recommends unity-lens-friends 57) unity-lens-files recommends unity-scope-gdrive 58) libqt4-sql recommends libqt4-sql-mysql | libqt4-sql-odbc | libqt4-sql-ps Accept this solution? [Y/n/q/?] And in case this helps, aptitude show skype: Package: skype State: not installed Version: 4.2.0.11-0ubuntu0.12.04.2 Priority: extra Section: net Maintainer: Steve Langasek <[email protected]> Architecture: amd64 Uncompressed Size: 62.5 k Depends: skype-bin Conflicts: skype Description: client for Skype VOIP and instant messaging service Skype is software that enables the world's conversations. Millions of individuals and businesses use Skype to make free video and voice calls, send instant messages and share files with other Skype users. Every day, people also use Skype to make low-cost calls to landlines and mobiles. 
* Make free Skype-to-Skype calls to anyone else, anywhere in the world. * Call to landlines and mobiles at great rates. * Group chat with up to 200 people or conference call with up to 25 others. * Free to download.

    Read the article

  • SQL IO and SAN troubles

    - by James
    We are running two servers with identical software setup but different hardware. The first one is a VM on VMWare on a normal tower server with dual core xeons, 16 GB RAM and a 7200 RPM drive. The second one is a VM on XenServer on a powerful brand new rack server, with 4 core xeons and shared storage. We are running Dynamics AX 2012 and SQL Server 2008 R2. When I insert 15 000 records into a table on the slow tower server (as a test), it does so in 13 seconds. On the fast server it takes 33 seconds. I re-ran these tests several times with the same results. I have a feeling it is some sort of IO bottleneck, so I ran SQLIO on both. Here are the results for the slow tower server: C:\Program Files (x86)\SQLIO>test.bat C:\Program Files (x86)\SQLIO>sqlio -kW -t8 -s120 -o8 -frandom -b8 -BH -LS C:\Tes tFile.dat sqlio v1.5.SG using system counter for latency timings, 14318180 counts per second 8 threads writing for 120 secs to file C:\TestFile.dat using 8KB random IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: C:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 226.97 MBs/sec: 1.77 latency metrics: Min_Latency(ms): 0 Avg_Latency(ms): 281 Max_Latency(ms): 467 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+ %: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 99 C:\Program Files (x86)\SQLIO>sqlio -kR -t8 -s120 -o8 -frandom -b8 -BH -LS C:\Tes tFile.dat sqlio v1.5.SG using system counter for latency timings, 14318180 counts per second 8 threads reading for 120 secs from file C:\TestFile.dat using 8KB random IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: C:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 91.34 MBs/sec: 0.71 latency metrics: Min_Latency(ms): 14 Avg_Latency(ms): 699 Max_Latency(ms): 1124 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+ %: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 100 C:\Program Files (x86)\SQLIO>sqlio -kW -t8 -s120 -o8 -fsequential -b64 -BH -LS C :\TestFile.dat sqlio v1.5.SG using system counter for latency timings, 14318180 counts per second 8 threads writing for 120 secs to file C:\TestFile.dat using 64KB sequential IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: C:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 1094.50 MBs/sec: 68.40 latency metrics: Min_Latency(ms): 0 Avg_Latency(ms): 58 Max_Latency(ms): 467 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+ %: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 100 C:\Program Files (x86)\SQLIO>sqlio -kR -t8 -s120 -o8 -fsequential -b64 -BH -LS C :\TestFile.dat sqlio v1.5.SG using system counter for latency timings, 14318180 counts per second 8 threads reading for 120 secs from file C:\TestFile.dat using 64KB sequential IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: C:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 1155.31 MBs/sec: 72.20 latency metrics: Min_Latency(ms): 17 Avg_Latency(ms): 55 Max_Latency(ms): 205 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 
17 18 19 20 21 22 23 24+ %: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 100 Here are the results of the fast rack server: C:\Program Files (x86)\SQLIO>test.bat C:\Program Files (x86)\SQLIO>sqlio -kW -t8 -s120 -o8 -frandom -b8 -BH -LS E:\Tes tFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads writing for 120 secs to file E:\TestFile.dat using 8KB random IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) open_file: CreateFile (E:\TestFile.dat for write): The system cannot find the pa th specified. exiting C:\Program Files (x86)\SQLIO>sqlio -kR -t8 -s120 -o8 -frandom -b8 -BH -LS E:\Tes tFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads reading for 120 secs from file E:\TestFile.dat using 8KB random IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) open_file: CreateFile (E:\TestFile.dat for read): The system cannot find the pat h specified. exiting C:\Program Files (x86)\SQLIO>sqlio -kW -t8 -s120 -o8 -fsequential -b64 -BH -LS E :\TestFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads writing for 120 secs to file E:\TestFile.dat using 64KB sequential IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) open_file: CreateFile (E:\TestFile.dat for write): The system cannot find the pa th specified. exiting C:\Program Files (x86)\SQLIO>sqlio -kR -t8 -s120 -o8 -fsequential -b64 -BH -LS E :\TestFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads reading for 120 secs from file E:\TestFile.dat using 64KB sequential IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) open_file: CreateFile (E:\TestFile.dat for read): The system cannot find the pat h specified. 
exiting C:\Program Files (x86)\SQLIO>test.bat C:\Program Files (x86)\SQLIO>sqlio -kW -t8 -s120 -o8 -frandom -b8 -BH -LS c:\Tes tFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads writing for 120 secs to file c:\TestFile.dat using 8KB random IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: c:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 2575.77 MBs/sec: 20.12 latency metrics: Min_Latency(ms): 1 Avg_Latency(ms): 24 Max_Latency(ms): 655 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+ %: 0 0 0 5 8 9 9 9 8 5 3 1 1 1 1 0 0 0 0 0 0 0 0 0 37 C:\Program Files (x86)\SQLIO>sqlio -kR -t8 -s120 -o8 -frandom -b8 -BH -LS c:\Tes tFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads reading for 120 secs from file c:\TestFile.dat using 8KB random IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: c:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 1141.39 MBs/sec: 8.91 latency metrics: Min_Latency(ms): 1 Avg_Latency(ms): 55 Max_Latency(ms): 652 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+ %: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 91 C:\Program Files (x86)\SQLIO>sqlio -kW -t8 -s120 -o8 -fsequential -b64 -BH -LS c :\TestFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads writing for 120 secs to file c:\TestFile.dat using 64KB sequential IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: c:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 341.37 MBs/sec: 21.33 latency metrics: Min_Latency(ms): 5 Avg_Latency(ms): 186 Max_Latency(ms): 120037 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+ %: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 100 C:\Program Files (x86)\SQLIO>sqlio -kR -t8 -s120 -o8 -fsequential -b64 -BH -LS c :\TestFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads reading for 120 secs from file c:\TestFile.dat using 64KB sequential IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: c:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 1024.07 MBs/sec: 64.00 latency metrics: Min_Latency(ms): 5 Avg_Latency(ms): 61 Max_Latency(ms): 81632 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+ %: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 100 Three of the four tests are, to my mind, within reasonable parameters for the rack server. However, the 64 write test is incredibly slow on the rack server. (68 mb/sec on the slow tower vs 21 mb/s on the rack). The read speed for 64k also seems slow. Is this enough to say there is some sort of bottleneck with the shared storage? I need to know if I can take this evidence and say we need to launch an investigation into this. Any help is appreciated.
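For reference, a minimal sketch of the kind of 15 000-row insert test described above, assuming a throwaway test table (all table and column names here are hypothetical, not from the poster's AX environment):

    -- run on both servers and compare elapsed time
    CREATE TABLE dbo.InsertTest (Id INT IDENTITY(1,1) PRIMARY KEY, Payload VARCHAR(100));
    SET NOCOUNT ON;
    DECLARE @i INT = 0;
    WHILE @i < 15000
    BEGIN
        -- each single-row insert commits on its own (no surrounding transaction)
        INSERT INTO dbo.InsertTest (Payload) VALUES (REPLICATE('x', 100));
        SET @i = @i + 1;
    END;
    DROP TABLE dbo.InsertTest;

Because every INSERT commits individually, a loop like this is dominated by transaction-log write latency, which is roughly the behaviour the 8 KB random write SQLIO run is probing.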

    Read the article

  • Statements of direction for EPM 11.1.1.x series products

    - by THE
    Some of the older parts of EPM that have been replaced by newer software will be phased out after January 2013. For most of these, the 11.1.1.x series will be the last release; after that they will only be covered by Sustaining Support (see policy). We have notes about: the Essbase Excel Add-in (replaced by SmartView, which nearly reached functional parity as of release 11.1.2.1.102), see Oracle Essbase Spreadsheet Add-in Statement of Direction (Doc ID 1466700.1); Hyperion Data Integration Management (replaced by Oracle Data Integrator (ODI)), see Hyperion Data Integration Management Statement of Direction (Doc ID 1267051.1); Hyperion Enterprise and Hyperion Enterprise Reporting (replaced by HFM), see Hyperion Enterprise and Hyperion Enterprise Reporting Statement of Direction (Doc ID 1396504.1); Hyperion Business Rules (replaced by Calculation Manager), see Hyperion Business Rules Statement of Direction (Doc ID 1448421.1); and Oracle Visual Explorer (this one already phased out in June 2011, just in case anyone missed it), see Oracle Essbase Visual Explorer Statement of Direction (Doc ID 1327945.1). For a complete list of supported lifetimes, please review the "Oracle Lifetime Support Policy for Applications".

    Read the article

  • Getting Windows Azure SDK 1.1 To Talk To A Local DB

    - by Richard Jones
    Just found this, if you're using the Azure SDK 1.1, which you probably will be if you've moved to Visual Studio 2010. To change the default database used by Development Storage to something other than SQLEXPRESS, see http://msdn.microsoft.com/en-us/library/dd203058.aspx. At the bottom it states:   Using Development Storage with SQL Server Express 2008. By default the local Windows group BUILTIN\Administrator is not included in the SQL Server sysadmin server role on new SQL Server Express 2008 installations. Add yourself to the sysadmin role in order to use the Development Storage services on SQL Server Express 2008. See SQL Server 2008 Security Changes for more information. Changing the SQL Server instance used by Development Storage: by default, Development Storage will use the SQL Express instance. This can be changed by calling "DSInit.exe /sqlinstance:<SQL Server instance>" from the Windows Azure SDK command prompt.
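For example, to point Development Storage at a named instance (the instance name below is only a placeholder, substitute your own):

    DSInit.exe /sqlinstance:MyNamedInstance

Run this from the Windows Azure SDK command prompt; DSInit then re-initializes the development storage database on that instance.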

    Read the article

  • Managing common code on Windows 7 (.NET) and Windows 8 (WinRT)

    - by ryanabr
    Recent announcements regarding Windows Phone 8 and the fact that it will have WinRT behind it might make some of this less painful, but I discovered that the "XmlDocument" object is in a new location in WinRT and is almost the same as its brother in .NET: System.Xml.XmlDocument (.NET) and Windows.Data.Xml.Dom.XmlDocument (WinRT). The problem I am trying to solve is how to work with both types in code that performs the same task on both the Windows Phone 7 and Windows 8 platforms. The first thing I did was define my own XmlNode and XmlNodeList classes that wrap the actual Microsoft objects, so that by using the "#if" compiler directive the calling code can easily work with either the WinRT version of the type or the .NET version. public class XmlNode { #if WIN8 public Windows.Data.Xml.Dom.IXmlNode Node { get; set; } public XmlNode(Windows.Data.Xml.Dom.IXmlNode xmlNode) { Node = xmlNode; } #endif #if !WIN8 public System.Xml.XmlNode Node { get; set; } public XmlNode(System.Xml.XmlNode xmlNode) { Node = xmlNode; } #endif } public class XmlNodeList { #if WIN8 public Windows.Data.Xml.Dom.XmlNodeList List { get; set; } public int Count { get { return (int)List.Count; } } public XmlNodeList(Windows.Data.Xml.Dom.XmlNodeList list) { List = list; } #endif #if !WIN8 public System.Xml.XmlNodeList List { get; set; } public int Count { get { return List.Count; } } public XmlNodeList(System.Xml.XmlNodeList list) { List = list; } #endif } From there I can use my XmlNode and XmlNodeList in the calling code without having to clutter the code with all of the additional #if switches. The next challenge was that the code that worked directly with the XmlDocument object needed to be separate for the two platforms, since the method for populating the XmlDocument object is completely different on each. To solve this issue I made partial classes, one partial class for .NET and one for WinRT. Both projects link to the partial class that contains the code that is the same for the majority of the class, while a platform-specific partial class contains the code that is unique to that version of the XmlDocument. The files with the little arrow in the lower left corner denote 'linked files' and are shared by multiple projects but only exist in one location in source control. You can see that the _Win7 partial class is included directly in the project since it includes code that is only for the .NET platform, whereas its cousin the _Win8 (not pictured above) has all of the code specific to the Win8 platform. In the _Win7 partial class is this code: public partial class WUndergroundViewModel { public static WUndergroundData GetWeatherData(double lat, double lng) { WUndergroundData data = new WUndergroundData(); System.Net.WebClient c = new System.Net.
WebClient(); string req = "http://api.wunderground.com/api/xxx/yesterday/conditions/forecast/q/[LAT],[LNG].xml" ;             req = req.Replace( "[LAT]" , lat.ToString());             req = req.Replace( "[LNG]" , lng.ToString()); XmlDocument doc = new XmlDocument();             doc.Load(c.OpenRead(req)); foreach (XmlNode item in doc.SelectNodes("/response/features/feature" ))             { switch (item.Node.InnerText)                 { case "yesterday" :                         ParseForecast( new FishingControls.XmlNodeList (doc.SelectNodes( "/response/forecast/txt_forecast/forecastdays/forecastday" )), new FishingControls.XmlNodeList (doc.SelectNodes( "/response/forecast/simpleforecast/forecastdays/forecastday" )), data); break ; case "conditions" :                         ParseCurrent( new FishingControls.XmlNode (doc.SelectSingleNode("/response/current_observation" )), data); break ; case "forecast" :                         ParseYesterday( new FishingControls.XmlNodeList (doc.SelectNodes( "/response/history/observations/observation" )),data); break ;                 }             } return data;         }     } in _win8 partial class is this code: public partial class WUndergroundViewModel     { public async static Task< WUndergroundData > GetWeatherData(double lat, double lng)         { WUndergroundData data = new WUndergroundData (); HttpClient c = new HttpClient (); string req = "http://api.wunderground.com/api/xxxx/yesterday/conditions/forecast/q/[LAT],[LNG].xml" ;             req = req.Replace( "[LAT]" , lat.ToString());             req = req.Replace( "[LNG]" , lng.ToString()); HttpResponseMessage msg = await c.GetAsync(req); string stream = await msg.Content.ReadAsStringAsync(); XmlDocument doc = new XmlDocument ();             doc.LoadXml(stream, null); foreach ( IXmlNode item in doc.SelectNodes("/response/features/feature" ))             { switch (item.InnerText)                 { case "yesterday" :                         ParseForecast( new FishingControls.XmlNodeList (doc.SelectNodes( "/response/forecast/txt_forecast/forecastdays/forecastday" )), new FishingControls.XmlNodeList (doc.SelectNodes( "/response/forecast/simpleforecast/forecastdays/forecastday" )), data); break; case "conditions" :                         ParseCurrent( new FishingControls.XmlNode (doc.SelectSingleNode("/response/current_observation" )), data); break; case "forecast" :                         ParseYesterday( new FishingControls.XmlNodeList (doc.SelectNodes( "/response/history/observations/observation")), data); break;                 }             } return data;         }     } Summary: This method allows me to have common 'business' code for both platforms that is pretty clean, and I manage the technology differences separately. Thank you tostringtheory for your suggestion, I was considering that approach.
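Not in the original post, but for illustration, the shared calling code might consume these partial classes like this (the coordinates are placeholders, and the surrounding async method is assumed):

    // WIN8 build: GetWeatherData is async
    WUndergroundData weather = await WUndergroundViewModel.GetWeatherData(44.98, -93.26);
    // .NET / Windows Phone 7 build: the equivalent call is synchronous
    // WUndergroundData weather = WUndergroundViewModel.GetWeatherData(44.98, -93.26);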

    Read the article

  • New and Noteworthy Fixed Assets Notes

    - by Oracle_EBS
    A new white paper, Integrating Oracle Inventory Transactions Into Oracle Projects To Generate Asset Lines & Interface Assets To Fixed Assets (Doc ID 1392743.1). A listing of the available Oracle E-Business Fixed Assets Diagnostics (Doc ID 1362875.1). Information on the knowledge management enhancements made in the My Oracle Support Knowledge Management Version 6.0 release (Doc ID 1393516.1). The new Period Close Advisor for the Release 12 E-Business Suite (Doc ID 335.1).  What is the Period Close Advisor?  The Period Close Advisor provides guidance on recommended period-end procedures for E-Business Suite Release 12.x.  It is intended to be generic and does not relate to a specific organization or industry.  Step-by-step best practices with tips and troubleshooting references are provided to assist you through each phase.  The EBS R12 Period Close Advisor data for Assets can also be found in a standalone note (Doc ID 1359475.1).

    Read the article

  • How to input data into user defined variables into MySql query

    - by user292791
    Simple Shell script echo "Enter 1 for month of March" echo "Enter 2 for month of April" echo "Enter 3 for month of May" read Month case "$Month" in 1) echo "enter establishment name" read a; mysql -u root -p $a < "March.sql";; 2) echo "enter establishment name" read b; mysql -u root -p $b < "April.sql";; 3) echo "enter establishment name" read c; mysql -u root -p $c < "May.sql";; esac done In this i have three other query files March.sql, April.sql, May.sql. i'm linking this in shell script . Example of .sql file: SELECT DISTINCT substr( a.case_no, 3, 2 ), b.case_type, b.type_name, a.case_no into outfile '/tmp/April.csv' FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n' FROM Civil_t AS a, Case_type_t AS b, disposal_proc AS c WHERE substr( a.case_no, 3, 2 ) = b.case_type AND a.date_of_decision BETWEEN '2014-04-01' AND '2014-04-30' AND a.case_no = c.case_no AND a.court_no =1; I have to alter the .sql script every time. Is there any method to read the variables from shell script and use it in mysql. For example:- echo "enter date" read a #input date Now i have read a "date" and i want to use it in March.sql query in where clause. Is there is any method of using this variable in .sql query.
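One possible approach (a sketch, not from the original post; the @start_date/@end_date names and the date prompts are assumptions) is to set MySQL user-defined variables on the way in and reference them inside the .sql file instead of hard-coded literals:

    echo "enter start date"; read d1
    echo "enter end date"; read d2
    # prepend the variable assignments to the script and pipe everything to mysql
    { echo "SET @start_date = '$d1'; SET @end_date = '$d2';"; cat March.sql; } | mysql -u root -p $a

Inside March.sql the date filter then becomes ... AND a.date_of_decision BETWEEN @start_date AND @end_date ..., so the same file works for any month without editing it.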

    Read the article

  • ??Database Replay Capture????

    - by Liu Maclean(???)
    Database Replay?11g??????,??workload capture??????????????,???????? ??Workload Capture???????: ???????????????,???????2????,??????,???????????OLTP???????capture 10????1G???? ?????: ????????????????????? ??startup restrict????,?????????? ??capture???restrict?? ????????????? ???????????????: ??scn???????? ???????? ???????? Capture???????????workload????? ???????SYSDBA?SYSOPER????OS?? ????: ?TPCC???capture??????4.5% ????session????64KB??? ???Workload Capture?????????? ????????2?, ??RAC????workload capture  file??????????????,??start_capture????? ????session????64KB???,??????????????workload  capture file????Server Process??????,?????????parse???execution????,Server Process??LOGON?LOGOFF?SQL??????????PGA?,???WCR Capture PG?WCR Capture PGA?,?PGA?????????????????,Server Process???????????WCR???,?????WCR???Server Process??’WCR: capture file IO write’????? ?WCR?????????: SQL> select * from v$version; BANNER -------------------------------------------------------------------------------- Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production PL/SQL Release 11.2.0.3.0 - Production CORE 11.2.0.3.0 Production TNS for Linux: Version 11.2.0.3.0 - Production NLSRTL Version 11.2.0.3.0 - Production SQL> select name from v$event_name where name like '%WCR%'; NAME ---------------------------------------------------------------- WCR: replay client notify WCR: replay clock WCR: replay lock order WCR: replay paused WCR: RAC message context busy WCR: capture file IO write WCR: Sync context busy latch: WCR: sync latch: WCR: processes HT 11g????????WCR???LATCH 1* select name,gets from v$latch where name like '%WCR%' SQL> / NAME GETS ------------------------------ ---------- WCR: kecu cas mem 3 WCR: kecr File Count 37 WCR: MMON Create dir 1 WCR: ticker cache 0 WCR: sync 495 WCR: processes HT 0 WCR: MTS VC queue 0 7 rows selected. ????????????Database Replay Capture????? 1. ????capture dbms_workload_capture.start_capture CREATE OR REPLACE DIRECTORY dbcapture AS '/home/oracle/dbcapture'; execute dbms_workload_capture.start_capture('CAPTURE','DBCAPTURE',default_action=>'INCLUDE'); SQL> select id,name,status,start_time,end_time,connects,user_calls,dir_path from dba_workload_captures where id = (select max(id) from dba_workload_captures) ; ID ---------- NAME -------------------------------------------------------------------------------- STATUS START_TIM END_TIME CONNECTS ---------------------------------------- --------- --------- ---------- USER_CALLS ---------- DIR_PATH -------------------------------------------------------------------------------- 1 CAPTURE IN PROGRESS 08-DEC-12 11 ID ---------- NAME -------------------------------------------------------------------------------- STATUS START_TIM END_TIME CONNECTS ---------------------------------------- --------- --------- ---------- USER_CALLS ---------- DIR_PATH -------------------------------------------------------------------------------- 167 /home/oracle/dbcapture 2. ?? capture file?? 
[oracle@mlab2 dbcapture]$ ls -lR .: total 8 drwxr-xr-x 2 oracle oinstall 4096 Dec 8 07:24 cap drwxr-xr-x 3 oracle oinstall 4096 Dec 8 07:24 capfiles -rw-r--r-- 1 oracle oinstall 0 Dec 8 07:24 wcr_cap_00001.start ./cap: total 4 -rw-r--r-- 1 oracle oinstall 91 Dec 8 07:24 wcr_scapture.wmd ./capfiles: total 4 drwxr-xr-x 12 oracle oinstall 4096 Dec 8 07:24 inst1 ./capfiles/inst1: total 40 drwxr-xr-x 2 oracle oinstall 4096 Dec 8 08:31 aa drwxr-xr-x 2 oracle oinstall 4096 Dec 8 07:24 ab drwxr-xr-x 2 oracle oinstall 4096 Dec 8 07:24 ac drwxr-xr-x 2 oracle oinstall 4096 Dec 8 07:24 ad drwxr-xr-x 2 oracle oinstall 4096 Dec 8 07:24 ae drwxr-xr-x 2 oracle oinstall 4096 Dec 8 07:24 af drwxr-xr-x 2 oracle oinstall 4096 Dec 8 07:24 ag drwxr-xr-x 2 oracle oinstall 4096 Dec 8 07:24 ah drwxr-xr-x 2 oracle oinstall 4096 Dec 8 07:24 ai drwxr-xr-x 2 oracle oinstall 4096 Dec 8 07:24 aj ./capfiles/inst1/aa: total 316 -rw-r--r-- 1 oracle oinstall 1762 Dec 8 07:25 wcr_c6cdah0000001.rec -rw-r--r-- 1 oracle oinstall 16478 Dec 8 07:28 wcr_c6cf1h0000002.rec -rw-r--r-- 1 oracle oinstall 1772 Dec 8 07:29 wcr_c6cjdh0000004.rec -rw-r--r-- 1 oracle oinstall 1535 Dec 8 07:29 wcr_c6cnah0000005.rec -rw-r--r-- 1 oracle oinstall 1821 Dec 8 07:41 wcr_c6cpfh0000007.rec -rw-r--r-- 1 oracle oinstall 1815 Dec 8 07:33 wcr_c6cq6h000000a.rec -rw-r--r-- 1 oracle oinstall 1535 Dec 8 07:34 wcr_c6cxmh000000h.rec -rw-r--r-- 1 oracle oinstall 1427 Dec 8 07:41 wcr_c6cxvh000000j.rec -rw-r--r-- 1 oracle oinstall 1425 Dec 8 07:41 wcr_c6czph000000k.rec -rw-r--r-- 1 oracle oinstall 2398 Dec 8 07:49 wcr_c6dqfh000000q.rec -rw-r--r-- 1 oracle oinstall 259321 Dec 8 08:35 wcr_c6du7h000000r.rec -rw-r--r-- 1 oracle oinstall 0 Dec 8 07:55 wcr_c6f6yh000000t.rec -rw-r--r-- 1 oracle oinstall 0 Dec 8 08:28 wcr_c6h3qh0000013.rec ./capfiles/inst1/ab: total 0 ./capfiles/inst1/ac: total 0 ./capfiles/inst1/ad: total 0 ./capfiles/inst1/ae: total 0 ./capfiles/inst1/af: total 0 ./capfiles/inst1/ag: total 0 ./capfiles/inst1/ah: total 0 ./capfiles/inst1/ai: total 0 ./capfiles/inst1/aj: total 0 [oracle@mlab2 dbcapture]$ cd ./capfiles/inst1/aa [oracle@mlab2 aa]$ ls -l total 316 -rw-r--r-- 1 oracle oinstall 1762 Dec 8 07:25 wcr_c6cdah0000001.rec -rw-r--r-- 1 oracle oinstall 16478 Dec 8 07:28 wcr_c6cf1h0000002.rec -rw-r--r-- 1 oracle oinstall 1772 Dec 8 07:29 wcr_c6cjdh0000004.rec -rw-r--r-- 1 oracle oinstall 1535 Dec 8 07:29 wcr_c6cnah0000005.rec -rw-r--r-- 1 oracle oinstall 1821 Dec 8 07:41 wcr_c6cpfh0000007.rec -rw-r--r-- 1 oracle oinstall 1815 Dec 8 07:33 wcr_c6cq6h000000a.rec -rw-r--r-- 1 oracle oinstall 1535 Dec 8 07:34 wcr_c6cxmh000000h.rec -rw-r--r-- 1 oracle oinstall 1427 Dec 8 07:41 wcr_c6cxvh000000j.rec -rw-r--r-- 1 oracle oinstall 1425 Dec 8 07:41 wcr_c6czph000000k.rec -rw-r--r-- 1 oracle oinstall 2398 Dec 8 07:49 wcr_c6dqfh000000q.rec -rw-r--r-- 1 oracle oinstall 259321 Dec 8 08:35 wcr_c6du7h000000r.rec -rw-r--r-- 1 oracle oinstall 0 Dec 8 07:55 wcr_c6f6yh000000t.rec -rw-r--r-- 1 oracle oinstall 0 Dec 8 08:28 wcr_c6h3qh0000013.rec [oracle@mlab2 aa]$ ls -l |wc -l 14 ???????14??? 3. ??LOGON????Server Process [oracle@mlab2 ~]$ sqlplus / as sysdba SQL*Plus: Release 11.2.0.3.0 Production on Sat Dec 8 08:37:40 2012 Copyright (c) 1982, 2011, Oracle. All rights reserved. Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production With the Partitioning, Automatic Storage Management, OLAP, Data Mining and Real Application Testing options ?????wcr?? 
[oracle@mlab2 aa]$ ls -ltr total 316 -rw-r--r-- 1 oracle oinstall 1762 Dec 8 07:25 wcr_c6cdah0000001.rec -rw-r--r-- 1 oracle oinstall 16478 Dec 8 07:28 wcr_c6cf1h0000002.rec -rw-r--r-- 1 oracle oinstall 1772 Dec 8 07:29 wcr_c6cjdh0000004.rec -rw-r--r-- 1 oracle oinstall 1535 Dec 8 07:29 wcr_c6cnah0000005.rec -rw-r--r-- 1 oracle oinstall 1815 Dec 8 07:33 wcr_c6cq6h000000a.rec -rw-r--r-- 1 oracle oinstall 1535 Dec 8 07:34 wcr_c6cxmh000000h.rec -rw-r--r-- 1 oracle oinstall 1425 Dec 8 07:41 wcr_c6czph000000k.rec -rw-r--r-- 1 oracle oinstall 1427 Dec 8 07:41 wcr_c6cxvh000000j.rec -rw-r--r-- 1 oracle oinstall 1821 Dec 8 07:41 wcr_c6cpfh0000007.rec -rw-r--r-- 1 oracle oinstall 2398 Dec 8 07:49 wcr_c6dqfh000000q.rec -rw-r--r-- 1 oracle oinstall 0 Dec 8 07:55 wcr_c6f6yh000000t.rec -rw-r--r-- 1 oracle oinstall 0 Dec 8 08:28 wcr_c6h3qh0000013.rec -rw-r--r-- 1 oracle oinstall 259321 Dec 8 08:35 wcr_c6du7h000000r.rec -rw-r--r-- 1 oracle oinstall 0 Dec 8 08:37 wcr_c6hp4h0000018.rec ??????wcr_c6hp4h0000018.rec ??? SQL> select spid from v$process where addr = ( select paddr from v$session where sid=(select distinct sid from v$mystat)); SPID ------------------------ 14293 ????????????????14293, ???????????????,??????wcr_c6hp4h0000018.rec [oracle@mlab2 ~]$ ls -l /proc/14293/fd total 0 lr-x------ 1 oracle oinstall 64 Dec 8 08:39 0 -> /dev/null l-wx------ 1 oracle oinstall 64 Dec 8 08:39 1 -> /dev/null lrwx------ 1 oracle oinstall 64 Dec 8 08:39 10 -> /u01/app/oracle/product/11201/db_1/rdbms/audit/CRMV_ora_14293_1.aud l-wx------ 1 oracle oinstall 64 Dec 8 08:39 11 -> /u01/app/oracle/diag/rdbms/crmv/CRMV/trace/CRMV_ora_14293.trc l-wx------ 1 oracle oinstall 64 Dec 8 08:39 12 -> pipe:[34585895] l-wx------ 1 oracle oinstall 64 Dec 8 08:39 13 -> /u01/app/oracle/diag/rdbms/crmv/CRMV/trace/CRMV_ora_14293.trm l-wx------ 1 oracle oinstall 64 Dec 8 08:39 2 -> /dev/null lr-x------ 1 oracle oinstall 64 Dec 8 08:39 3 -> /dev/null lr-x------ 1 oracle oinstall 64 Dec 8 08:39 4 -> /dev/null lr-x------ 1 oracle oinstall 64 Dec 8 08:39 5 -> /u01/app/oracle/product/11201/db_1/rdbms/mesg/oraus.msb lr-x------ 1 oracle oinstall 64 Dec 8 08:39 6 -> /proc/14293/fd lr-x------ 1 oracle oinstall 64 Dec 8 08:39 7 -> /dev/zero lrwx------ 1 oracle oinstall 64 Dec 8 08:39 8 -> /home/oracle/dbcapture/capfiles/inst1/aa/wcr_c6hp4h0000018.rec lr-x------ 1 oracle oinstall 64 Dec 8 08:39 9 -> pipe:[34585894] ?????lsof?? [root@mlab2 ~]# lsof|grep wcr_c6hp4h0000018.rec oracle 14293 oracle 8u REG 8,1 0 17629644 /home/oracle/dbcapture/capfiles/inst1/aa/wcr_c6hp4h0000018.rec ????????,??Server Process????WCR REC??,?Server Process LOGON?????? 3.????SQL??: SQL> select 1 from dual; 1 ---------- 1 SQL> / 1 ---------- 1 [oracle@mlab2 aa]$ strings wcr_c6hp4h0000018.rec ==»????SQL????, ??????? ??????SQL???,???????????????WCR??????,LOGON???????????SQL????,????????? [oracle@mlab2 aa]$ strings wcr_c6hp4h0000018.rec 11.2.0.3.0 *File header info. (Shadow process='14293') D0576B5D710A34F4E043B201A8C0ECFE SYS; NLS_LANGUAGE? AMERICAN> NLS_TERRITORY? AMERICA> NLS_CURRENCY? NLS_ISO_CURRENCY? AMERICA> NLS_NUMERIC_CHARACTERS? NLS_CALENDAR? GREGORIAN> NLS_DATE_FORMAT? DD-MON-RR> NLS_DATE_LANGUAGE? AMERICAN> NLS_CHARACTERSET? AL32UTF8> NLS_SORT? BINARY> NLS_TIME_FORMAT? HH.MI.SSXFF AM> NLS_TIMESTAMP_FORMAT? DD-MON-RR HH.MI.SSXFF AM> NLS_TIME_TZ_FORMAT? HH.MI.SSXFF AM TZR> NLS_TIMESTAMP_TZ_FORMAT? DD-MON-RR HH.MI.SSXFF AM TZR> NLS_DUAL_CURRENCY? NLS_SPECIAL_CHARS? NLS_NCHAR_CHARACTERSET? UTF8> NLS_COMP? BINARY> NLS_LENGTH_SEMANTICS? BYTE> NLS_NCHAR_CONV_EXCP? 
FALSE (DESCRIPTION=(ADDRESS=(PROTOCOL=beq)(PROGRAM=/u01/app/oracle/product/11201/db_1/bin/oracle)(ARGV0=oracleCRMV)(ARGS='(DESCRIPTION=(LOCAL=YES)(ADDRESS=(PROTOCOL=beq)))')(DETACH=NO))(CONNECT_DATA=(CID=(PROGRAM=sqlplus)(HOST=mlab2.oracle.com)(USER=oracle)))) ,[email protected] (TNS V1-V3)U tselect spid from v$process where addr = ( select paddr from v$session where sid=(select distinct sid from v$mystat)) ` _ select 1 from dual select 1 from dual ??????????????????? [oracle@mlab2 aa]$ strings wcr_c6hp4h0000018.rec 9`9_^B create table vva(t1 int) `:_i :`:_iB `;_^ ;`;_^B create table vva(t1 int) `_i >`>_iB FusC `?_^ ?`?_^B FvWC _begin for i in 1..50000 loop execute immediate 'select 1 from dual where 2='||i; end loop; end; ?SERVER PROCESS LOGOFF ??????? C`E_ B k^2C ????Server Process????parse?execution???WCR??,??????????PGA?,????????????,????????,?????WCR???????????,???????? 4. ?????? SQL> oradebug setmypid Statement processed. SQL> oradebug dump processstate 10; Statement processed. SQL> oradebug tracefile_name /u01/app/oracle/diag/rdbms/crmv/CRMV/trace/CRMV_ora_14293.trc ?processstate ??????????????? WCR: capture file IO write,??Server process??WCR ?? 3: waited for 'SQL*Net message to client' driver id=0x62657100, #bytes=0x1, =0x0 wait_id=139 seq_num=140 snap_id=1 wait times: snap=0.000007 sec, exc=0.000007 sec, total=0.000007 sec wait times: max=infinite wait counts: calls=0 os=0 occurred after 0.934091 sec of elapsed time 4: waited for 'latch: shared pool' address=0x60106b20, number=0x133, tries=0x0 wait_id=138 seq_num=139 snap_id=1 wait times: snap=0.000066 sec, exc=0.000066 sec, total=0.000066 sec wait times: max=infinite wait counts: calls=0 os=0 occurred after 1.180690 sec of elapsed time 5: waited for 'WCR: capture file IO write' =0x0, =0x0, =0x0 wait_id=137 seq_num=138 snap_id=1 wait times: snap=0.000189 sec, exc=0.000189 sec, total=0.000189 sec wait times: max=infinite wait counts: calls=0 os=0 occurred after 3.122783 sec of elapsed time 6: waited for 'WCR: capture file IO write' =0x0, =0x0, =0x0 wait_id=136 seq_num=137 snap_id=1 wait times: snap=0.000191 sec, exc=0.000191 sec, total=0.000191 sec wait times: max=infinite wait counts: calls=0 os=0 occurred after 3.053132 sec of elapsed time 7: waited for 'WCR: capture file IO write' 5.??PGA???? SQL> oradebug dump heapdump 536870917; Statement processed. grep WCR /u01/app/oracle/diag/rdbms/crmv/CRMV/trace/CRMV_ora_14293.trc Chunk 7fb1b606bfc0 sz= 65600 freeable "WCR Capture PG " ds=0x7fb1b6115f90 Chunk 7fb1b6111e18 sz= 4224 freeable "WCR Capture PG " ds=0x7fb1b6115f90 Chunk 7fb1b6112e98 sz= 4184 freeable "WCR Capture PG " ds=0x7fb1b6115f90 Chunk 7fb1b6113ef0 sz= 4224 freeable "WCR Capture PG " ds=0x7fb1b6115f90 Chunk 7fb1b6114f70 sz= 4104 recreate "WCR Capture PG " latch=(nil) Chunk 7fb1b6115f78 sz= 160 freeable "WCR Capture PGA" Chunk 7fb1b6116018 sz= 3248 freeable "WCR Capture PGA" Subheap ds=0x7fb1b6115f90 heap name= WCR Capture PG size= 82336 HEAP DUMP heap name="WCR Capture PG" desc=0x7fb1b6115f90 FIVE LARGEST SUB HEAPS for heap name="WCR Capture PG" desc=0x7fb1b6115f9 PGA???WCR Capture PG ?WCR Capture PGA?freeable or recreate??chunk,???????Server Process???OS Chunk 7fb1b606bfc0 sz= 65600 freeable "WCR Capture PG " ds=0x7fb1b6115f90 sz= 65600=» 64k ??????????64k??,???????????????64k WCR????????????:)! 6.???? ??WCR CAPTURE????????2? 
SQL> SELECT x.ksppinm NAME, y.ksppstvl VALUE, x.ksppdesc describ 2 FROM SYS.x$ksppi x, SYS.x$ksppcv y 3 WHERE x.inst_id = USERENV ('Instance') 4 AND y.inst_id = USERENV ('Instance') 5 AND x.indx = y.indx 6 AND x.ksppinm in ('_capture_buffer_size','_wcr_control'); NAME VALUE DESCRIB -------------------- -------------------- ------------------------------------------------------------ _wcr_control 0 Oracle internal test WCR parameter used ONLY for testing! _capture_buffer_size 65536 To set the size of the PGA I/O recording buffers ??_capture_buffer_size ??PGA?WCR BUFFER?SIZE,???64k _wcr_control ??WCR?????,?????? ????,??????: 1. ???WCR WORKLOAD CAPTURE???????????,??Server Process????(????)2. ???server process????WCR??3. Server Proess???LOGON?LOGOFF?SQL?????????WCR???4. Server Process????????Immediate mode,????????PGA?(WCR Capture) subheap?,??????????????(timeout?????)5. ????, Server Process????????Immediate mode,?capture????parse??execution??(?????capture???parse?????????????,parse????capture???),?????LOGON?SQL??(???????)??PGA?WCR Capture?????,???????,????????,??tpcc??????4.5%6. ????_capture_buffer_size ??PGA?WCR BUFFER?SIZE,???64k7. WCR Capture?????binrary 2????,?????,????????????????WCR capture file8. WCR: capture file IO write?????Server Process??WCR??
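As a closing note (not part of the original capture transcript): once the test workload is done, the documented companion call DBMS_WORKLOAD_CAPTURE.FINISH_CAPTURE stops the capture, after which DBA_WORKLOAD_CAPTURES shows the status change from IN PROGRESS to COMPLETED, e.g.:

    SQL> execute dbms_workload_capture.finish_capture;
    SQL> select id, name, status, start_time, end_time from dba_workload_captures where id = (select max(id) from dba_workload_captures);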

    Read the article
