Search Results

Search found 10345 results on 414 pages for 'ms excel'.


  • MS SQL 2005: Why would a delete from a temp table hang forever?

    - by Dr. Zim
        DELETE FROM #tItem_ID
        WHERE #tItem_ID.Item_ID NOT IN (
            SELECT DISTINCT Item_ID
            FROM Item_Keyword
            JOIN Keyword ON Item_Keyword.Keyword_ID = Keyword.Record_ID
            WHERE Keyword LIKE @tmpKW)

    The keyword is %macaroni%. Both temp tables are 3300 items long. The inner select executes in under a second. All strings are nvarchar(249). All IDs are int. Any ideas? I executed it (it's in a stored proc) for over 12 minutes without it finishing.
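
    A common workaround worth sketching here (not a confirmed fix for this poster's case) is to replace NOT IN with a correlated NOT EXISTS, which sidesteps the row-by-row re-evaluation and null-handling pitfalls NOT IN can hit; table and column names below are taken from the question:

        -- Hypothetical rewrite of the query above using NOT EXISTS.
        -- Equivalent behavior when Item_ID is non-nullable.
        DELETE FROM #tItem_ID
        WHERE NOT EXISTS (
            SELECT 1
            FROM Item_Keyword
            JOIN Keyword ON Item_Keyword.Keyword_ID = Keyword.Record_ID
            WHERE Keyword LIKE @tmpKW
              AND Item_Keyword.Item_ID = #tItem_ID.Item_ID)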

    Read the article

  • Which MS technologies would be suited for a data intensive application?

    - by steve.tse
    I'm a junior VB.net developer with little application design knowledge. I've been reading a lot of material online regarding different design patterns, frameworks, and methodologies. It's become a bit confusing for me. Right now I'm trying to decide what technology would be best suited for converting an existing VB6 application (with a SQL Server backend). I need to update the UI and add more user functionality and reporting capabilities. Initially I was thinking of using WPF and attempting the MVVM model for this big project. Reports would be generated from SSRS. A peer suggested using ASP.net, and I don't have enough experience to determine what would be better. The senior programmers here are stuck on using VB6 and don't have any input on what to use, but they are encouraging me to use the latest technologies. This application would be for ~20 users in a central location. Ideally I would stick to a Microsoft .NET language. The current interface is similar to a datagrid table where the user clicks in to see the detail of each record. Users would need to have multiple records open at any given time. I look forward to all the advice I can get.

    EDIT 2010/04/22 2:47 PM EST

    • What is your audience? Internal clients within an intranet.
    • How complex are the interactions you expect to implement? Not very... displaying data from SQL Server in the UI and allowing user updates to said data. Typically just one user modifying a record.
    • Do you require near real-time data updates? No.
    • How often do you expect to update the application after the first release? Twice a year.
    • Do you expect a well-defined set of client platforms? Yes, a Windows XP environment, potentially upgrading to Win7. Currently on IE6, moving to IE7 or 8 within a couple of months.
    • Do users need access from anywhere? No, just from their PCs.

    Read the article

  • MS 70-536 .NET Framework Foundation - info on Exam? Especially regex?

    - by Sebastian P.R. Gingter
    Hi, I know there are already some questions about this, but not that specific. I have the self-paced training kit and worked through the test exam tool that is on the CD that comes with the book. I constantly fail on the test tool, mostly on the regex questions. I'm not a regex guru. In fact, my regex-fu is more than weak. I know what regexes are, how I can use them, and where my 'Regular expressions - kurz und gut' book is in my drawer in case I really need them. And to be honest, I feel like learning regex is a total waste of time, because if I need them I have either a colleague who is fluent and can do them in just a few seconds, or I need my book and get them right in a still fair amount of time. And from my experience I can tell that I need regex about once in two or three years. So putting a lot of time into learning just the expressions to pass the exam is... something I'd rather not have to do. Can you tell me something about the real exam vs. the test exam tool in the book, and about the need to know regex to pass it? Thank you for your time. Marked as Community Wiki. Hope that fits?

    Read the article

  • Is there a way to customise messages produced by statements in MS SQL Query Analyzer?

    - by Scott Leis
    If I run a simple query in SQL Query Analyzer, like:

        SELECT * FROM TableName

    the Messages pane always produces a message like:

        (30 row(s) affected)

    If I run a stored procedure with many statements, the messages are useless because there's no indication of what each one relates to. So firstly: is there a way to customise the default messages on a per-query basis? E.g. I'd like a specific query to produce a message like:

        TableName query produced [numRowsAffected] results.

    replacing [numRowsAffected] with the number that would have appeared in the default message. Secondly, is there a way to suppress the default messages on a per-query basis? E.g. I have a local variable of type TABLE, used in several statements. I don't want any message to appear for statements where I'm just deleting data from that variable before re-using it. I'm seeking solutions that work in SQL Server 8.0 (SQL Server 2000).
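
    For what it's worth, both behaviors are available with standard SQL Server 2000 features; here is a minimal sketch of the PRINT/@@ROWCOUNT pattern for custom messages and SET NOCOUNT for suppression (whether this fits the poster's procedures is an assumption):

        -- Custom per-query message: read @@ROWCOUNT right after the statement
        DECLARE @n int
        SELECT * FROM TableName
        SET @n = @@ROWCOUNT
        PRINT 'TableName query produced ' + CAST(@n AS varchar(10)) + ' results.'

        -- Suppress the default "(n row(s) affected)" messages for a stretch of code
        SET NOCOUNT ON
        DELETE FROM @myTableVar   -- no rowcount message emitted
        SET NOCOUNT OFF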

    Read the article

  • How to synchronize two (or n) replication processes for MS SQL databases?

    - by Yauheni Sivukha
    There are two master databases and two read-only copies updated by standard transactional replication. We need to map an entity from both read-only databases; let's say database A contains orders and database B contains order lines. The problem is that replication to one database can lag behind replication to the other, so at the moment of mapping the read-only databases will hold inconsistent data. For example: we stored two orders with lines at 19:00 and 19:03. The mapping process started at 19:05, but by that moment replication to database A had processed all changes up to 19:03, while replication to database B had processed changes only up to 19:00. After mapping we will have an order entity whose order is as of 19:03 but whose lines are as of 19:00. Trouble is guaranteed :) In my particular case both databases have a temporal model, so it is possible to fetch data for any time slice; the problem is identifying the time of the latest replication. Question: how do you synchronize replication processes for several databases to avoid the situation described above?
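
    One common pattern, offered purely as a sketch (the table and database names here are hypothetical, and a scheduled job to stamp the heartbeat is assumed), is a replicated heartbeat table: the publisher updates a timestamp row every few seconds, each subscriber's copy lags exactly as far as its replication does, and the mapper queries both databases as of the older of the two timestamps:

        -- On the publisher: a one-row heartbeat table, included in both publications.
        CREATE TABLE dbo.ReplicationHeartbeat (ID int PRIMARY KEY, LastStamp datetime)
        INSERT INTO dbo.ReplicationHeartbeat VALUES (1, GETDATE())
        -- scheduled job step, run every few seconds:
        UPDATE dbo.ReplicationHeartbeat SET LastStamp = GETDATE() WHERE ID = 1

        -- On the mapping side: find the common "safe" time slice
        -- (assumes both read-only databases are reachable by three-part names).
        DECLARE @safeTime datetime
        SELECT @safeTime = CASE WHEN a.LastStamp < b.LastStamp
                                THEN a.LastStamp ELSE b.LastStamp END
        FROM ReadOnlyA.dbo.ReplicationHeartbeat a
        CROSS JOIN ReadOnlyB.dbo.ReplicationHeartbeat b
        -- ...then run the temporal queries in both databases as of @safeTime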

    Read the article

  • Using parameterized function calls in SELECT statements. MS SQL Server

    - by geekzlla
    I have taken over some code from a previous developer and have come across this SQL statement that calls several SQL functions. As you can see, the function calls in the select statement pass a parameter to the function. How does the SQL statement know what value to replace the variable with? For the sample below, how does the query engine know what to replace nDeptID with when it calls fn_SelDeptName_DeptID(nDeptID)? nDeptID IS a column in table Note.

    SELECT statement:

        SELECT nCustomerID AS [Customer ID],
               nJobID AS [Job ID],
               dbo.fn_SelDeptName_DeptID(nDeptID) AS Department,
               nJobTaskID AS JobTaskID,
               dbo.fn_SelDeptTaskDesc_OpenTask(nJobID, nJobTaskID) AS Task,
               nStandardNoteID AS StandardNoteID,
               dbo.fn_SelNoteTypeDesc(nNoteID) AS [Note Type],
               dbo.fn_SelGPAStandardNote(nStandardNoteID) AS [Standard Note],
               nEntryDate AS [Entry Date],
               nUserName AS [Added By],
               nType AS Type,
               nNote AS Note
        FROM Note
        WHERE nJobID = 844261
        ORDER BY nJobID, Task, [Entry Date]

    Function fn_SelDeptName_DeptID:

        ALTER FUNCTION [dbo].[fn_SelDeptName_DeptID] (@iDeptID int)
        RETURNS varchar(25)
        -- Used by DataCollection for Job Tracking
        -- if the Department isn't found, return an empty string
        BEGIN
            -- Return the Department name for the given DeptID.
            DECLARE @strDeptName varchar(25)
            IF @iDeptID = 0
                SET @strDeptName = ''
            ELSE
            BEGIN
                SET @strDeptName = (SELECT dName FROM Department WHERE dDeptID = @iDeptID)
                IF (@strDeptName IS NULL) SET @strDeptName = ''
            END
            RETURN @strDeptName
        END

    Thanks in advance.
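
    On the question itself: nDeptID is not replaced once. It is a column reference, so SQL Server evaluates the scalar function once per row, binding that row's nDeptID value to @iDeptID, just as it evaluates any other expression in the select list. A tiny self-contained illustration (table and function names invented for the example):

        CREATE FUNCTION dbo.fn_Double (@x int)
        RETURNS int
        AS
        BEGIN
            RETURN @x * 2
        END
        GO
        CREATE TABLE dbo.Demo (n int)
        INSERT INTO dbo.Demo VALUES (1)
        INSERT INTO dbo.Demo VALUES (2)
        -- The function runs once per row; @x is bound to that row's n.
        SELECT n, dbo.fn_Double(n) AS doubled FROM dbo.Demo
        -- returns (1, 2) and (2, 4)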

    Read the article

  • I have data about deadlocks, but I can't understand why they occur (MS SQL/ASP.NET MVC)

    - by Alex
    I am receiving a lot of deadlocks in my big web application. http://stackoverflow.com/questions/2941233/how-to-automatically-re-run-deadlocked-transaction-asp-net-mvc-sql-server Here I wanted to re-run deadlocked transactions, but I was told to get rid of the deadlocks instead - that's much better than trying to catch them. So I spent the whole day with SQL Profiler, setting the tracing keys etc., and this is what I got. There's a Users table. I have a very heavily used page with the following query (it's not the only query, but it's the one that causes trouble):

        UPDATE Users SET views = views + 1
        WHERE ID IN (SELECT AuthorID FROM Articles WHERE ArticleID = @ArticleID)

    And then there's the following query in ALL pages:

        User = DB.Users.SingleOrDefault(u => u.Password == password && u.Name == username);

    That's where I get the User from cookies. Very often a deadlock occurs and this second LINQ to SQL query is chosen as the victim, so it's not run, and users of my site see an error screen. I read a lot about deadlocks... and I don't understand why this is causing one. Obviously both of these queries run very often, at least once a second, maybe even more often (300-400 users online). So they can easily run at the same time, but why does that cause a deadlock? Please help. Thank you
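
    Not the poster's confirmed fix, but two standard mitigations for this reader/writer pattern are worth sketching: an index that lets the login lookup seek rather than scan past rows the UPDATE has locked, and row versioning so readers stop taking shared locks (the database name below is hypothetical, and snapshot-style reads must be acceptable to the app):

        -- Let the login lookup seek instead of scanning the Users table.
        CREATE INDEX IX_Users_Name_Password ON Users (Name, Password)

        -- Or make readers use row versioning instead of shared locks
        -- (run while no other connections are active in the database).
        ALTER DATABASE MySiteDb SET READ_COMMITTED_SNAPSHOT ON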

    Read the article

  • What is the best way to sync multiple client SqlServers to one MS SqlServer 2005?

    - by user605055
    I have several client databases that use my Windows application, and I want to push their data to an online web site. The client database and server database structures are different, because we need to add a client ID column to some tables in the server database. Today I sync the databases with a separate application that uses C# bulk copy inside a transaction. But my server's SQL Server is too busy, and the sync tasks cannot run in parallel. I am working on this solution: use AFTER INSERT/UPDATE/DELETE triggers to record changes in one table, and build SQL queries to send to a web service that syncs the data. But I would have to send all the existing data first - a huge data set (bigger than 16 MB). I don't think I can use replication because the structures and primary keys are different.
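
    A minimal sketch of the trigger-based change log the poster describes (the Orders table and all column names are invented for illustration):

        CREATE TABLE dbo.ChangeLog (
            LogID     int IDENTITY PRIMARY KEY,
            TableName sysname  NOT NULL,
            KeyValue  int      NOT NULL,
            Action    char(1)  NOT NULL,   -- I/U/D
            LoggedAt  datetime NOT NULL DEFAULT GETDATE()
        )
        GO
        CREATE TRIGGER trg_Orders_Log ON dbo.Orders
        AFTER INSERT, UPDATE, DELETE
        AS
        BEGIN
            SET NOCOUNT ON
            -- Classify each affected row by comparing inserted/deleted.
            INSERT INTO dbo.ChangeLog (TableName, KeyValue, Action)
            SELECT 'Orders', i.OrderID, 'I' FROM inserted i
                LEFT JOIN deleted d ON d.OrderID = i.OrderID WHERE d.OrderID IS NULL
            UNION ALL
            SELECT 'Orders', i.OrderID, 'U' FROM inserted i
                JOIN deleted d ON d.OrderID = i.OrderID
            UNION ALL
            SELECT 'Orders', d.OrderID, 'D' FROM deleted d
                LEFT JOIN inserted i ON i.OrderID = d.OrderID WHERE i.OrderID IS NULL
        END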

    Read the article

  • Does the API call ExitProcess() work OK in VB6 if you follow the MS caveats?

    - by Clay Nichols
    Microsoft indicates that VB6 doesn't support ExitProcess (to exit and return a value). However, it indicates that this call can fail under certain circumstances (if a thread hasn't completed, etc.), so I'm wondering whether this call will work OK (consistently :-) as long as you obey the caveats in the article. I could go a step further and call ExitProcess() from the Sub Main or Form which started the app. Update: after some more reading (I really did research this a bit before asking) I found a suggestion to use the TerminateProcess API instead. I'm investigating that option. Any thoughts?

    Read the article

  • CodePlex Daily Summary for Tuesday, October 15, 2013

    CodePlex Daily Summary for Tuesday, October 15, 2013

    Popular Releases

    • FFXIV Crafting Simulator: Crafting Simulator 2.4.1: Fixed the offset for the new patch (Auto Loading function).
    • iBoxDB.EX - Fast Transactional NoSQL Database Resources: iBoxDB.net fast transactional nosql database 1.5.2: Easily process objects and documents, zero configuration. Fast embeddable transactional nosql document database; includes CRUD, QueryLanguage, Master-Master-Slave Replication, MVCC, etc. Supports .net2, .net4, windows phone, mono, unity3d, node.js; copy and run. http://download-codeplex.sec.s-msft.com/Download?ProjectName=iboxdb&DownloadId=737783 Benchmark with MongoDB. Compatibility: more platforms for the java version.
    • neurogoody: slicebox: this is the slice box js.
    • Event-Based Components AppBuilder: AB3.AppDesigner.55: Iteration 55 (Feature): moving of TargetEdge (simple wires only) by mouse.
    • Sandcastle Help File Builder: SHFB v1.9.8.0 with Visual Studio Package: General information: IMPORTANT: on some systems, the content of the ZIP file is blocked and the installer may fail to run. Before extracting it, right-click on the ZIP file, select Properties, and click on the Unblock button if it is present in the lower right corner of the General tab in the properties dialog. This new release contains bug fixes and feature enhancements. There are some potential breaking changes in this release as some features of the Help File Builder have been moved into...
    • SharpConfig: SharpConfig 1.2: Implemented comment parsing. Comments are now part of settings and setting categories. New properties: Setting: Comment, PreComments; SettingCategory: Comment, PreComments.
    • C++ REST SDK (codename "Casablanca"): C++ REST SDK 1.3.0: This release fixes multiple customer-reported issues as well as the following: full support for Dev12 binaries and project files; full support for Windows XP; new sample highlighting the Client and Server APIs: BlackJack; expose underlying native handle to set custom options on http_client; improvements to the Listener Library. Note: Dev10 binaries have been dropped as of this release; however, the Dev10 project files are still available in the Source Code.
    • AD ACL Scanner: 1.3.2: Minor bug fixed: PowerShell 4.0 will report: Select-Object: Parameter cannot be processed because the parameter name p is ambiguous.
    • Json.NET: Json.NET 5.0 Release 7: New feature - added support for Immutable Collections. New feature - added WriteData and ReadData settings to DataExtensionAttribute. New feature - added reference and type name handling support to extension data. New feature - added default value and required support to constructor deserialization. Change - extension data is now written when serializing. Fix - added missing casts to JToken. Fix - fixed parsing large floating point numbers. Fix - fixed not parsing some ISO date ...
    • RESX Manager: ResxManager 0.2.1: FIXED: many critical bugs have been fixed. New features: error logging for improved exception handling, new toolbar, improvements to the user interface.
    • Fast YouTube Downloader: YouTube Downloader 2.2.0: YouTube Downloader 2.2.0.
    • VidCoder: 1.5.8 Beta: Added hardware acceleration options: Bicubic OpenCL scaling algorithm, QSV decoding/encoding and DXVA decoding. Updated HandBrake core to SVN 5834. Updated VidCoder setup icon. Fixed crash when choosing the mp4v2 container on x86 and opening on x64. Warning: the hardware acceleration features require specific hardware or file types to work correctly: QSV: need an Intel processor that supports Quick Sync Video encoding, with a monitor hooked up to the Intel HD Graphics output and the lat...
    • ASP.net MVC Awesome - jQuery Ajax Helpers: 3.5.2: Version 3.5.2 - fix for setting a single value on multivalue controls; datepicker min/max date offset fix; html encoding for keys fix; enable Column.ClientFormatFunc to be a function call that will return a function. Version 3.5.1 - fixed html attributes rendering; fixed loading animation rendering; css improvements. Version 3.5 - autosize for all popups (can be turned off by calling awe.autoSize = false in js); added Parent, Paremeter extensions ...
    • Wsus Package Publisher: Release v1.3.1310.12: Allow the Update Creation Wizard to be set in full-screen mode. Fix a bug which prevented WPP from resetting the Remote Sus Client ID. Change the behavior of links in the Update Detail Viewer: left-click to open, right-click to copy to the clipboard.
    • TerrariViewer: TerrariViewer v7 [Terraria Inventory Editor]: This is a complete overhaul but has the same core style. I hope you enjoy it. This version is compatible with 1.2.0.3. Please send issues to my Twitter or https://github.com/TJChap2840
    • WDTVHubGen - Adds Metadata, thumbnails and subtitles to WDTV Live Hubs: WDTVHubGen.v2.1.6.maint: I think this covers all of the issues. New additions: fixed the thumbnail problem for backgrounds; general clean-up and error checking. Need to get this put through the wringer, and all feedback is welcome.
    • BIDS Helper: BIDS Helper 1.6.4: This BIDS Helper release brings the following new features and fixes. New features: a new Bus Matrix style report option when you run the Printer Friendly Dimension Usage report for an SSAS cube; the Biml engine is now fully in sync with the supported subset of Varigence Mist 3.4, including a large number of language enhancements, bug fixes, and project deployment support. Fixed issues: fixed Biml execution for project connections, fixing a bug with Tabular Translations Editor not a...
    • MoreTerra (Terraria World Viewer): MoreTerra 1.11.3: New features: new markers added for Plantera's Bulb, Heart Fruits and Gold Cache; markers now correctly display for the gems found in rock debris on the floor. Compatibility: fixed header changes found in Terraria 1.0.3.1.
    • Media Companion: Media Companion MC3.581b: Fix in place for TVDB xml issue. New: Movie - General Preferences, allow saving of ignored 'The' or 'A' to end of movie title, stored in sorttitle field; Movie - new way for cropping posters. Fixed: Movie - rename of folders/filename, caught error message; Movie - fixed bug in Save Cropped image, only saving in Pre-Frodo format if Both model selected; Movie - fixed cropped image not taking zoomed ratio into effect; Movie - separated folder renaming and file renaming functions durin...
    • SmartStore.NET - Free ASP.NET MVC Ecommerce Shopping Cart Solution: SmartStore.NET 1.2.0: Highlights: multi-store support, "Trusted Shops" plugins, highly improved SmartStore.biz Importer plugin, add custom HTML content to pages, performance optimization. New features: multi-store support - now multiple stores can be managed within a single application instance (e.g. for building different catalogs, brands, landing pages etc.); added 3 new Trusted Shops plugins: Seal, Buyer Protection, Store Reviews; added Display as HTML Widget to CMS Topics (store owner can now add arbitrary HT...

    New Projects

    • Artezio SharePoint 2013 Workflow Activities: SharePoint Workflow 2013 doesn't provide activities to work with permissions; we've fixed that using an HttpSend activity that makes REST API calls.
    • Dependency.Injection: An attempt to write a really simple dependency injection framework. Does property-based and recursive dependency injection. Handles singletons. Yay!
    • DHGMS SUO Killer: SUO Killer is a Visual Studio extension to deal with the removal of SUO files to mitigate SUO-related issues in Visual Studio. This project is written in C#.
    • dynamicsheet: dynamicsheet
    • Excel Comparator: Excel Comparator is an add-in for Microsoft Excel that allows the user to compare a range between two sheets.
    • FetchAIP: FetchAIP is a utility to download the various sections of the Aeronautical Information Publication (AIP) for New Zealand.
    • Fluent Method and Type Builder: Still working on the summary.
    • getboost: NuGet package for the Boost framework.
    • Goldstone Forum: WebForms forum - TelerikAcademy team project.
    • GroupMe Software Development Kit: .NET Software Development Kit for the http://groupme.com/ chat service.
    • GSLMS: ----
    • Import Excel Files Into SQL Server: Load Excel files into a SQL database without schema changes.
    • Inaction: ??????????
    • jBegin: Learning ASP.net MVC from the beginning; this will be the source code for jbegin.com.
    • KDG's Statistical Quality Control Solver: This tool will include methods that can solve sample standard deviation, sample variance, median, mode, moving average, percentiles, margin of error, etc.
    • kpi: Key Performance Indicator (KPI) ????; Visual Studio 2010 with the .NET 4.0 runtime.
    • LECO Remote Control Client Application: Sample code and binaries are provided to demonstrate the remote control capability of a LECO Cornerstone instrument.
    • LinkPad: My first Windows Store app, intended for students to sketch up thoughts and concepts in quick diagrams.
    • Modler.NET - Automating Graphical Data Model Co-Evolution: Modler.NET was the tool created for a Master's thesis project, which automates the co-evolution of graphical data models and the database that they represent.
    • MyFileManager1: Summary
    • Never Lotto: Korean 465 Lotto Analyzer and Simulator. The real purpose of this project is to show that this kind of lotto thing is just shit.
    • NHibernate: The purpose of this project is to demo CRUD operations using NHibernate with Mono in Visual Studio 2012, using the C# language.
    • OAuth2 Authorizer: OAuth2 Authorizer helps you get the access code for a standard OAuth2 REST service that implements 3-legged authentication.
    • Regular Expression for Excel: Regular Expression For Excel is an Excel plugin. It provides regular expression support in Excel, usable from Excel functions.
    • Service Tester: Service Tester is an Azure Cloud based load-testing application targeted at SOAP web services, which allows you to invoke your web service with random parameters.
    • Simple TypeScript and C# Class Generator: Simple GUI application to generate compatible class source code for C# and TypeScript, for communication between C# and TypeScript.
    • Soccer team management: ---
    • Spanner: No more stringly-typed web development! Build statically typed single page web applications in C#, automatically generating all HTML, JavaScript, and Knockout.

    Read the article

  • How to manually patch Blogger template to use Disqus

    - by user317944
    I'm trying to add disqus to my blog and I tried following this guide to do so: http://disqus.com/docs/patch-blogger/ However their instructions are completely off with what I have on my custom template. Here is the template: <b:skin><![CDATA[/*----------------------------------------------- Blogger Template Style Name: Picture Window Designer: Josh Peterson URL: www.noaesthetic.com ----------------------------------------------- */ /* Variable definitions ==================== */ /* Content ----------------------------------------------- */ body { font: $(body.font); color: $(body.text.color); } html body .region-inner { min-width: 0; max-width: 100%; width: auto; } .content-outer { font-size: 90%; } a:link { text-decoration:none; color: $(link.color); } a:visited { text-decoration:none; color: $(link.visited.color); } a:hover { text-decoration:underline; color: $(link.hover.color); } .body-fauxcolumn-outer { background: $(body.background); } .content-outer { background: $(content.background); -moz-border-radius: $(content.border.radius); -webkit-border-radius: $(content.border.radius); -goog-ms-border-radius: $(content.border.radius); border-radius: $(content.border.radius); -moz-box-shadow: 0 0 $(content.shadow.spread) rgba(0, 0, 0, .15); -webkit-box-shadow: 0 0 $(content.shadow.spread) rgba(0, 0, 0, .15); -goog-ms-box-shadow: 0 0 $(content.shadow.spread) rgba(0, 0, 0, .15); box-shadow: 0 0 $(content.shadow.spread) rgba(0, 0, 0, .15); margin: $(content.margin) auto; } .content-inner { padding: $(content.padding); } /* Header ----------------------------------------------- */ .header-outer { background: $(header.background.color) $(header.background.gradient) repeat-x scroll top left; _background-image: none; color: $(header.text.color); -moz-border-radius: $(header.border.radius); -webkit-border-radius: $(header.border.radius); -goog-ms-border-radius: $(header.border.radius); border-radius: $(header.border.radius); } .Header img, .Header #header-inner { -moz-border-radius: $(header.border.radius); -webkit-border-radius: $(header.border.radius); -goog-ms-border-radius: $(header.border.radius); border-radius: $(header.border.radius); } .header-inner .Header .titlewrapper, .header-inner .Header .descriptionwrapper { padding-left: $(header.padding); padding-right: $(header.padding); } .Header h1 { font: $(header.font); text-shadow: 1px 1px 3px rgba(0, 0, 0, 0.3); } .Header h1 a { color: $(header.text.color); } .Header .description { font-size: 130%; } /* Tabs ----------------------------------------------- */ .tabs-inner { margin: .5em $(tabs.margin.sides) $(tabs.margin.bottom); padding: 0; } .tabs-inner .section { margin: 0; } .tabs-inner .widget ul { padding: 0; background: $(tabs.background.color) $(tabs.background.gradient) repeat scroll bottom; -moz-border-radius: $(tabs.border.radius); -webkit-border-radius: $(tabs.border.radius); -goog-ms-border-radius: $(tabs.border.radius); border-radius: $(tabs.border.radius); } .tabs-inner .widget li { border: none; } .tabs-inner .widget li a { display: block; padding: .5em 1em; margin-$endSide: $(tabs.spacing); color: $(tabs.text.color); font: $(tabs.font); -moz-border-radius: $(tab.border.radius) $(tab.border.radius) 0 0; -webkit-border-top-left-radius: $(tab.border.radius); -webkit-border-top-right-radius: $(tab.border.radius); -goog-ms-border-radius: $(tab.border.radius) $(tab.border.radius) 0 0; border-radius: $(tab.border.radius) $(tab.border.radius) 0 0; background: $(tab.background); border-$endSide: 1px solid $(tabs.separator.color); } 
.tabs-inner .widget li:first-child a { padding-$startSide: 1.25em; -moz-border-radius-top$startSide: $(tab.first.border.radius); -moz-border-radius-bottom$startSide: $(tabs.border.radius); -webkit-border-top-$startSide-radius: $(tab.first.border.radius); -webkit-border-bottom-$startSide-radius: $(tabs.border.radius); -goog-ms-border-top-$startSide-radius: $(tab.first.border.radius); -goog-ms-border-bottom-$startSide-radius: $(tabs.border.radius); border-top-$startSide-radius: $(tab.first.border.radius); border-bottom-$startSide-radius: $(tabs.border.radius); } .tabs-inner .widget li.selected a, .tabs-inner .widget li a:hover { position: relative; z-index: 1; background: $(tabs.selected.background.color) $(tab.selected.background.gradient) repeat scroll bottom; color: $(tabs.selected.text.color); -moz-box-shadow: 0 0 $(region.shadow.spread) rgba(0, 0, 0, .15); -webkit-box-shadow: 0 0 $(region.shadow.spread) rgba(0, 0, 0, .15); -goog-ms-box-shadow: 0 0 $(region.shadow.spread) rgba(0, 0, 0, .15); box-shadow: 0 0 $(region.shadow.spread) rgba(0, 0, 0, .15); } /* Headings ----------------------------------------------- */ h2 { font: $(widget.title.font); text-transform: $(widget.title.text.transform); color: $(widget.title.text.color); margin: .5em 0; } /* Main ----------------------------------------------- */ .main-outer { background: $(main.background); -moz-border-radius: $(main.border.radius.top) $(main.border.radius.top) 0 0; -webkit-border-top-left-radius: $(main.border.radius.top); -webkit-border-top-right-radius: $(main.border.radius.top); -webkit-border-bottom-left-radius: 0; -webkit-border-bottom-right-radius: 0; -goog-ms-border-radius: $(main.border.radius.top) $(main.border.radius.top) 0 0; border-radius: $(main.border.radius.top) $(main.border.radius.top) 0 0; -moz-box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); -webkit-box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); -goog-ms-box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); } .main-inner { padding: 15px $(main.padding.sides) 20px; } .main-inner .column-center-inner { padding: 0 0; } .main-inner .column-left-inner { padding-left: 0; } .main-inner .column-right-inner { padding-right: 0; } /* Posts ----------------------------------------------- */ h3.post-title { margin: 0; font: $(post.title.font); } .comments h4 { margin: 1em 0 0; font: $(post.title.font); } .post-outer { background-color: $(post.background.color); border: solid 1px $(post.border.color); -moz-border-radius: $(post.border.radius); -webkit-border-radius: $(post.border.radius); border-radius: $(post.border.radius); -goog-ms-border-radius: $(post.border.radius); padding: 15px 20px; margin: 0 $(post.margin.sides) 20px; } .post-body { line-height: 1.4; font-size: 110%; } .post-header { margin: 0 0 1.5em; color: $(post.footer.text.color); line-height: 1.6; } .post-footer { margin: .5em 0 0; color: $(post.footer.text.color); line-height: 1.6; } blog-pager { font-size: 140% } comments .comment-author { padding-top: 1.5em; border-top: dashed 1px #ccc; border-top: dashed 1px rgba(128, 128, 128, .5); background-position: 0 1.5em; } comments .comment-author:first-child { padding-top: 0; border-top: none; } .avatar-image-container { margin: .2em 0 0; } /* Widgets ----------------------------------------------- */ .widget ul, .widget #ArchiveList ul.flat { padding: 0; list-style: none; } .widget ul 
li, .widget #ArchiveList ul.flat li { border-top: dashed 1px #ccc; border-top: dashed 1px rgba(128, 128, 128, .5); } .widget ul li:first-child, .widget #ArchiveList ul.flat li:first-child { border-top: none; } .widget .post-body ul { list-style: disc; } .widget .post-body ul li { border: none; } /* Footer ----------------------------------------------- */ .footer-outer { color:$(footer.text.color); background: $(footer.background); -moz-border-radius: $(footer.border.radius.top) $(footer.border.radius.top) $(footer.border.radius.bottom) $(footer.border.radius.bottom); -webkit-border-top-left-radius: $(footer.border.radius.top); -webkit-border-top-right-radius: $(footer.border.radius.top); -webkit-border-bottom-left-radius: $(footer.border.radius.bottom); -webkit-border-bottom-right-radius: $(footer.border.radius.bottom); -goog-ms-border-radius: $(footer.border.radius.top) $(footer.border.radius.top) $(footer.border.radius.bottom) $(footer.border.radius.bottom); border-radius: $(footer.border.radius.top) $(footer.border.radius.top) $(footer.border.radius.bottom) $(footer.border.radius.bottom); -moz-box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); -webkit-box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); -goog-ms-box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); } .footer-inner { padding: 10px $(main.padding.sides) 20px; } .footer-outer a { color: $(footer.link.color); } .footer-outer a:visited { color: $(footer.link.visited.color); } .footer-outer a:hover { color: $(footer.link.hover.color); } .footer-outer .widget h2 { color: $(footer.widget.title.text.color); } ]] <b:template-skin> <b:variable default='930px' name='content.width' type='length' value='930px'/> <b:variable default='0' name='main.column.left.width' type='length' value='180px'/> <b:variable default='360px' name='main.column.right.width' type='length' value='180px'/> <![CDATA[ body { min-width: $(content.width); } .content-outer, .region-inner { min-width: $(content.width); max-width: $(content.width); _width: $(content.width); } .main-inner .columns { padding-left: $(main.column.left.width); padding-right: $(main.column.right.width); } .main-inner .fauxcolumn-center-outer { left: $(main.column.left.width); right: $(main.column.right.width); /* IE6 does not respect left and right together */ _width: expression(this.parentNode.offsetWidth - parseInt("$(main.column.left.width)") - parseInt("$(main.column.right.width)") + 'px'); } .main-inner .fauxcolumn-left-outer { width: $(main.column.left.width); } .main-inner .fauxcolumn-right-outer { width: $(main.column.right.width); } .main-inner .column-left-outer { width: $(main.column.left.width); right: $(main.column.left.width); margin-right: -$(main.column.left.width); } .main-inner .column-right-outer { width: $(main.column.right.width); margin-right: -$(main.column.right.width); } #layout { min-width: 0; } #layout .content-outer { min-width: 0; width: 800px; } #layout .region-inner { min-width: 0; width: auto; } ]]> </b:template-skin> <div class='main-cap-bottom cap-bottom'> <div class='cap-left'/> <div class='cap-right'/> </div> </div> <footer> <div class='footer-outer'> <div class='footer-cap-top cap-top'> <div class='cap-left'/> <div class='cap-right'/> </div> <div class='fauxborder-left footer-fauxborder-left'> <div class='fauxborder-right footer-fauxborder-right'/> <div class='region-inner 
footer-inner'> <macro:include id='footer-sections' name='sections'> <macro:param default='2' name='num' value='3'/> <macro:param default='footer' name='idPrefix'/> <macro:param default='foot' name='class'/> <macro:param default='false' name='includeBottom'/> </macro:include> <!-- outside of the include in order to lock Attribution widget --> <b:section class='foot' id='footer-3' showaddelement='no'> document.body.className = document.body.className.replace('loading', ''); <macro:if cond='data:col.num &gt;= 2'> <table border='0' cellpadding='0' cellspacing='0' mexpr:class='&quot;section-columns columns-&quot; + data:col.num'> <tbody> <tr> <td class='first columns-cell'> <b:section mexpr:class='data:col.class' mexpr:id='data:col.idPrefix + &quot;-2-1&quot;'/> </td> <td class='columns-cell'> <b:section mexpr:class='data:col.class' mexpr:id='data:col.idPrefix + &quot;-2-2&quot;'/> </td> <macro:if cond='data:col.num &gt;= 3'> <td class='columns-cell'> <b:section mexpr:class='data:col.class' mexpr:id='data:col.idPrefix + &quot;-2-3&quot;'/> </td> </macro:if> <macro:if cond='data:col.num &gt;= 4'> <td class='columns-cell'> <b:section mexpr:class='data:col.class' mexpr:id='data:col.idPrefix + &quot;-2-4&quot;'/> </td> </macro:if> </tr> </tbody> </table> <macro:if cond='data:col.includeBottom'> <b:section mexpr:class='data:col.class' mexpr:id='data:col.idPrefix + &quot;-3&quot;' showaddelement='no'/> </macro:if> </macro

    Read the article

  • C#: LINQ vs foreach - Round 1.

    - by James Michael Hare
    So I was reading Peter Kellner's blog entry on Resharper 5.0 and its LINQ refactoring and thought that was very cool. But that raised a point I had always been curious about in my head -- which is a better choice: manual foreach loops or LINQ?

    The answer is not really clear-cut. There are two sides to any code cost argument: performance and maintainability. The first of these is obvious and quantifiable. Given any two pieces of code that perform the same function, you can run them side-by-side and see which piece of code performs better.

    Unfortunately, this is not always a good measure. Well-written assembly language outperforms well-written C++ code, but you lose a lot in maintainability, which creates a big technical debt load that is hard to offset as the application ages. In contrast, higher-level constructs make the code more brief and easier to understand, hence reducing technical cost.

    Now, obviously in this case we're not talking two separate languages; we're comparing doing something manually in the language versus using a higher-order set of IEnumerable extensions that are in the System.Linq library.

    Well, before we discuss any further, let's look at some sample code and the numbers. First, let's take a look at the for loop and the LINQ expression. This is just a simple find comparison:

        // find implemented via LINQ
        public static bool FindViaLinq(IEnumerable<int> list, int target)
        {
            return list.Any(item => item == target);
        }

        // find implemented via standard iteration
        public static bool FindViaIteration(IEnumerable<int> list, int target)
        {
            foreach (var i in list)
            {
                if (i == target)
                {
                    return true;
                }
            }

            return false;
        }

    Okay, looking at this from a maintainability point of view, the LINQ expression is definitely more concise (8 lines down to 1) and is very readable in intention. You don't have to actually analyze the behavior of the loop to determine what it's doing.

    So let's take a look at performance metrics from 100,000 iterations of these methods on a List<int> of varying sizes filled with random data. For this test, we fill a target array with 100,000 random integers and then run the exact same pseudo-random targets through both searches.

        List<T> On 100,000 Iterations
        Method      Size     Total (ms)  Per Iteration (ms)  % Slower
        Any         10       26          0.00046             30.00%
        Iteration   10       20          0.00023             -
        Any         100      116         0.00201             18.37%
        Iteration   100      98          0.00118             -
        Any         1000     1058        0.01853             16.78%
        Iteration   1000     906         0.01155             -
        Any         10,000   10,383      0.18189             17.41%
        Iteration   10,000   8843        0.11362             -
        Any         100,000  104,004     1.8297              18.27%
        Iteration   100,000  87,941      1.13163             -

    The LINQ expression is running about 17% slower for average-size collections and worse for smaller collections. Presumably, this is due to the overhead of the state machine used to track the iterators for the yield returns in the LINQ expressions, which seems about right in a tight loop such as this.

    So what about other LINQ expressions? After all, Any() is one of the more trivial ones. I decided to try the TakeWhile() algorithm using a Count() to get the position stopped, like the sample Pete was using in his blog that Resharper refactored for him into LINQ:

        // Linq form
        public static int GetTargetPosition1(IEnumerable<int> list, int target)
        {
            return list.TakeWhile(item => item != target).Count();
        }

        // traditionally iterative form
        public static int GetTargetPosition2(IEnumerable<int> list, int target)
        {
            int count = 0;

            foreach (var i in list)
            {
                if (i == target)
                {
                    break;
                }

                ++count;
            }

            return count;
        }

    Once again, the LINQ expression is much shorter, easier to read, and should be easier to maintain over time, reducing the cost of technical debt. So I ran these through the same test data:

        List<T> On 100,000 Iterations
        Method      Size     Total (ms)  Per Iteration (ms)  % Slower
        TakeWhile   10       41          0.00041             128%
        Iteration   10       18          0.00018             -
        TakeWhile   100      171         0.00171             88%
        Iteration   100      91          0.00091             -
        TakeWhile   1000     1604        0.01604             94%
        Iteration   1000     825         0.00825             -
        TakeWhile   10,000   15765       0.15765             92%
        Iteration   10,000   8204        0.08204             -
        TakeWhile   100,000  156950      1.5695              92%
        Iteration   100,000  81635       0.81635             -

    Wow! I expected some overhead due to the state machines iterators produce, but 90% slower? That seems a little heavy to me. So then I thought, well, what if TakeWhile() is not the right tool for the job? The problem is TakeWhile returns each item for processing using yield return, whereas our for-loop really doesn't care about the item beyond using it as a stop condition to evaluate.

    So what if that back and forth with the iterator state machine is the problem? Well, we can quickly create an (albeit ugly) lambda that uses Any() along with a count in a closure (if a LINQ guru knows a better way PLEASE let me know!). After all, this is more consistent with what we're trying to do: we're trying to find the first occurrence of an item and halt once we find it, we just happen to be counting on the way. This mostly matches Any().

        // a new method that uses linq but evaluates the count in a closure.
        public static int TakeWhileViaLinq2(IEnumerable<int> list, int target)
        {
            int count = 0;
            list.Any(item =>
                {
                    if (item == target)
                    {
                        return true;
                    }

                    ++count;
                    return false;
                });
            return count;
        }

    Now how does this one compare?

        List<T> On 100,000 Iterations
        Method         Size     Total (ms)  Per Iteration (ms)  % Slower
        TakeWhile      10       41          0.00041             128%
        Any w/Closure  10       23          0.00023             28%
        Iteration      10       18          0.00018             -
        TakeWhile      100      171         0.00171             88%
        Any w/Closure  100      116         0.00116             27%
        Iteration      100      91          0.00091             -
        TakeWhile      1000     1604        0.01604             94%
        Any w/Closure  1000     1101        0.01101             33%
        Iteration      1000     825         0.00825             -
        TakeWhile      10,000   15765       0.15765             92%
        Any w/Closure  10,000   10802       0.10802             32%
        Iteration      10,000   8204        0.08204             -
        TakeWhile      100,000  156950      1.5695              92%
        Any w/Closure  100,000  108378      1.08378             33%
        Iteration      100,000  81635       0.81635             -

    Much better! It seems that the overhead of TakeWhile() returning each item and updating the state in the state machine is drastically reduced by using Any(), since Any() simply iterates forward until it finds the value we're looking for -- for the task we're attempting to do.

    So the lesson there is: make sure when you use a LINQ expression you're choosing the best expression for the job, because if you're doing more work than you really need, you'll have a slower algorithm. But this is true of any choice of algorithm or collection in general.

    Even with the Any() with the count in the closure it is still about 30% slower, but let's consider that angle carefully. For a list of 100,000 items, it was the difference between roughly 1.08 ms and 0.82 ms in a List<T>. That's really not that bad at all in the grand scheme of things. Even running at 90% slower with TakeWhile(), for the vast majority of my projects, an extra millisecond to save potential errors in the long term and improve maintainability is a small price to pay. And if your typical list is 1000 items or less, we're talking only microseconds worth of difference.

    It's like they say: 90% of your performance bottlenecks are in 2% of your code, so over-optimizing almost never pays off. So personally, I'll take the LINQ expression wherever I can, because it will be easier to read and maintain (thus reducing technical debt) and I can rely on Microsoft's developers to have coded and unit tested those algorithms fully for me, instead of relying on a developer to code the loop logic correctly.

    If something's 90% slower, yes, it's worth keeping in mind, but it's really not until you start getting orders of magnitude slower (10x, 100x, 1000x) that alarm bells should really go off. And if I ever do need that last millisecond of performance? Well, then I'll optimize JUST THAT problem spot. To me it's worth it for the readability, speed-to-market, and maintainability.

    Read the article

  • Data Mining Resources

    - by Dejan Sarka
    There are many different types of analyses, each one with its own pros and cons.

    Relational reports have a predefined structure, and end users cannot change it. They are simple to use for end users. Reports can use real-time data and snapshots of data to show the state of a report at specific points in time. One of the drawbacks is that report authoring is limited to IT pros and advanced users, and any kind of dynamic restructuring is very limited. If real-time data is used for a report, the report has a negative impact on the performance of the source system. Processing of the reports might be slow because the data comes from relational database management systems, which are not optimized for reporting only. If you create a semantic model of your data, your end users can create ad-hoc report structures; however, the development is more complex because a developer is needed to create these semantic models.

    For OLAP, you typically use specialized database management systems. You get lightning speed of analyses. End users can use rich and thin clients to interactively change the structure of the report, typically graphically. However, the development of an OLAP system is often quite complex: it involves the preparation and maintenance of an enterprise data warehouse and OLAP cubes. In order to exploit the possibility of real-time restructuring of reports, the users must be both active and educated. The data is usually stale, as it is loaded into data warehouses and OLAP cubes with a scheduled process.

    With data mining, a structure is not selected in advance; it searches for the structure. As a result, data mining can give you the most valuable results, because you can discover patterns you did not expect. A data mining model structure is limited only by the attributes that you use to train the model. One of the drawbacks is that a lot of knowledge is needed for a successful data mining project: end users have to understand the results, and subject matter experts and IT professionals need to understand the business problem thoroughly. The development might sometimes be even more complex than the development of OLAP cubes.

    Each type of analysis has its own place in an enterprise system. SQL Server has tools for all kinds of analyses. However, data mining is the most advanced way of analyzing the data; this is the "I" in BI. In order to get the most out of it, you need to learn quite a lot. In this blog post, I am gathering together resources for learning, including forthcoming events.

    Books

    • Multiple authors: SQL Server MVP Deep Dives – I wrote an introductory data mining chapter there.
    • Erik Veerman, Teo Lachev and Dejan Sarka: MCTS Self-Paced Training Kit (Exam 70-448): Microsoft SQL Server 2008 - Business Intelligence Development and Maintenance – you can find a good overview of a complete BI solution, including data mining, in this book.
    • Jamie MacLennan, ZhaoHui Tang, and Bogdan Crivat: Data Mining with Microsoft SQL Server 2008 – can't miss this book if you want to mine your data with SQL Server tools.
    • Michael Berry, Gordon Linoff: Mastering Data Mining: The Art and Science of Customer Relationship Management – data mining from both business and technical perspectives.
    • Dorian Pyle: Data Preparation for Data Mining – an in-depth book about data preparation.
    • Thomas and Ronald Wonnacott: Introductory Statistics – if you thought that you could get away without statistics, then you are not serious about data mining.
    • Jiawei Han and Micheline Kamber: Data Mining Concepts and Techniques – in-depth explanation of the most popular data mining algorithms.
    • Michael Berry and Gordon Linoff: Data Mining Techniques – another book that explains data mining algorithms, more from a business perspective.
    • Paolo Guidici: Applied Data Mining – a very mathematical book, only if you enjoy statistics and mathematics in general.

    Forthcoming presentations

    I am presenting two data mining related sessions during the PASS Summit in Charlotte, NC:

    • Wednesday, October 16th, 2013 - Fraud Detection: Notes from the Field – I am showing how to use data mining for a specific business problem. The presentation is based on real-life projects.
    • Friday, October 18th: Excel 2013 Advanced Analytics – I am focusing on Excel Data Mining Add-ins, and how to use them together with Power Pivot and other add-ins. This is the most you can get out of Excel.

    Sinergija 2013, Belgrade, Serbia:

    • Tuesday, October 22nd: Excel 2013 Analytics to the Max – another presentation focusing on the most advanced analytics you can get in Excel.

    SQL Rally Amsterdam, Netherlands:

    • Thursday, November 7th: Advanced Analytics in Excel 2013 – and again I am presenting about data mining in Excel.

    Why three different titles for the same presentation? I don't know; I guess I forgot the name I proposed every time right after I sent the proposal.

    Courses

    Data Mining with SQL Server 2012 – I wrote a 3-day course for SolidQ. If you are interested in this course, which I could also deliver as a shorter seminar, you can contact your closest SolidQ subsidiary, or, of course, me directly at [email protected] or [email protected]. This course could also complement the existing courseware portfolio of training providers, which are welcome to contact me as well.

    OK, now you know: no more excuses. Start learning data mining and get the most out of your data.

    Read the article

  • Using openrowset to read an Excel file into a temp table; how do I reference that table?

    - by mattstuehler
    I'm trying to write a stored procedure that will read an Excel file into a temp table, then massage some of the data in that table, then insert selected rows from that table into a permanent table. So, it starts like this:

        SET @SQL = "select * into #mytemptable FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
            'Excel 8.0;Database=" + @file + ";HDR=YES', 'SELECT * FROM [Sheet1$]')"
        EXEC (@SQL)

    That much seems to work. However, if I then try something like this:

        Select * from #mytemptable

    I get an error:

        Invalid object name '#mytemptable'

    Why isn't #mytemptable recognized? Is there a way to have #mytemptable accessible to the rest of the stored procedure? Many thanks in advance!
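
    The usual explanation: EXEC runs the dynamic SQL in its own scope, so a local temp table created inside it is dropped when that scope ends. A common workaround, sketched here (the column list must be adjusted to match the sheet, which is an assumption), is to create the temp table in the outer scope and fill it with INSERT ... EXEC:

        -- The outer scope owns the temp table, so it survives the EXEC.
        CREATE TABLE #mytemptable (Col1 nvarchar(255), Col2 nvarchar(255))

        DECLARE @SQL nvarchar(1000)
        SET @SQL = 'SELECT * FROM OPENROWSET(''Microsoft.Jet.OLEDB.4.0'',
            ''Excel 8.0;Database=' + @file + ';HDR=YES'', ''SELECT * FROM [Sheet1$]'')'

        INSERT INTO #mytemptable
        EXEC (@SQL)

        SELECT * FROM #mytemptable   -- now visible for the rest of the procedure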

    Read the article

  • xaml nested class path designer issue

    - by Vlad Bezden
    Hi, I have a nested class:

        public class Enums
        {
            public enum WindowModeEnum
            {
                Edit,
                New
            }
        }

    In my xaml I reference the code:

        <Style.Triggers>
            <DataTrigger Binding="{Binding WindowMode}" Value="{x:Static Types1:Enums+WindowModeEnum.Edit}">
                <Setter Property="Visibility" Value="Collapsed" />
            </DataTrigger>
        </Style.Triggers>

    The code compiles and runs properly; however, I can't open the xaml in the design window. I am getting the following error:

        Type 'Types1:Enums+WindowModeEnum' was not found.
        at MS.Internal.Metadata.ExposedTypes.ValueSerializers.StaticMemberDocumentValueSerializer.ConvertToDocumentValue(ITypeMetadata type, String value, IServiceProvider documentServices)
        at MS.Internal.Design.DocumentModel.DocumentTrees.Markup.XamlMarkupExtensionPropertyBase.get_Value()
        at MS.Internal.Design.DocumentModel.DocumentTrees.DocumentPropertyWrapper.get_Value()
        at MS.Internal.Design.DocumentModel.DocumentTrees.InMemory.InMemoryDocumentProperty..ctor(DocumentProperty property, InMemoryDocumentItem item)
        at MS.Internal.Design.DocumentModel.DocumentTrees.InMemory.InMemoryDocumentItem.SetUpItem(DocumentItem item)

    The same error exists in VS2008 and VS2010. Does anybody have any idea how to deal with this so I can open the window in design mode? Thanks a lot. Sincerely, Vlad.

    Read the article

  • Microsoft Sql Server 2008 R2 System Databases

    For a majority of software developers, little time is spent understanding the inner workings of the database management systems (DBMS) they use to store data for their applications. I personally place myself in this grouping. In my case, I have used various versions of Microsoft's SQL Server (2000, 2005, and 2008 R2) and just recently learned how valuable they really are when I was preparing to deliver a lecture on "SQL Server 2008 R2, System Databases".

    Microsoft SQL Server 2008 R2 System Databases

    So what are system databases in MS SQL Server, and why should I know them? Microsoft uses system databases to support the SQL Server DBMS, much like a developer uses config files or database tables to support an application. These system databases individually provide specific functionality that allows MS SQL Server to function.

        Name          Database File              Log File
        Master        master.mdf                 mastlog.ldf
        Resource      mssqlsystemresource.mdf    mssqlsystemresource.ldf
        Model         model.mdf                  modellog.ldf
        MSDB          msdbdata.mdf               msdblog.ldf
        Distribution  distmdl.mdf                distmdl.ldf
        TempDB        tempdb.mdf                 templog.ldf

    Master Database

    If you have used MS SQL Server then you should recognize the Master database, especially if you have used SQL Server Management Studio (SSMS) to connect to a user-created database. MS SQL Server requires the Master database in order for the DBMS to start, due to the information that it stores. Examples of data stored in the Master database:

    • User logins
    • Linked servers
    • Configuration information
    • Information on user databases

    Resource Database

    Honestly, I never knew this database even existed until I started to research SQL Server system databases. The reason for this is largely that the Resource database is hidden from users; in fact, its database files are stored within the Binn folder instead of the standard MS SQL Server database folder path. This database contains all system objects that can be accessed by all other databases. In short, it contains all the system views and stored procedures for system information that appear in every other user database. One of the many benefits of storing system views and stored procedures in a single hidden database is that it simplifies upgrading a SQL Server database, not to mention that maintenance is decreased, since only one code base has to be maintained for all of the system views and procedures.

    Model Database

    The Model database, as the name implies, is the model for all new databases created by users. This allows for predefining default database objects for all new databases within a MS SQL Server instance. For example, if every database created by a user needs to have an "Audit" table when it is created, then defining the "Audit" table in the model guarantees that the table will be present in every new database created after the model is altered.

    MSDB Database

    The MSDB database is used by SQL Server Agent, SQL Server Database Mail, and SQL Server Service Broker, along with SQL Server itself. The SQL Server Agent uses this database to store job configurations and SQL job schedules, along with SQL alerts and operators. In addition, this database also stores all SQL job parameters along with each job's execution history. Finally, this database is used to store database backup and maintenance plans, as well as details pertaining to SQL log shipping if it is being used.

    Distribution Database

    The Distribution database is only used during replication and stores metadata and history information pertaining to the act of replicating data. Furthermore, when transactional replication is used, this database also stores information regarding each transaction. It is important to note that replication is not turned on by default in MS SQL Server and that the Distribution database is hidden from SSMS.

    Tempdb Database

    The Tempdb, as the name implies, is used to store temporary data and data objects; examples include temp tables and temp stored procedures. It is important to note that all data and data objects in this database are cleared when SQL Server restarts. This database is also used by SQL Server when it is performing some internal operations; typically, SQL Server uses it for large sort and index operations. Finally, this database is used to store row versions if row versioning or snapshot isolation transactions are being used by SQL Server.

    Additionally, I would love to hear from others about their experiences using system databases, tables, and objects in real-world environments.
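
    As a quick illustration (a sketch only; the hidden Resource database will not appear here), you can list the four visible system databases on an instance with a catalog query:

        -- database_id 1-4 are master, tempdb, model, and msdb respectively.
        SELECT name, database_id
        FROM sys.databases
        WHERE database_id <= 4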

    Read the article

  • Microsoft Sql Server 2008 R2 System Databases

    For a majority of software developers, little time is spent understanding the inner workings of the database management systems (DBMS) they use to store data for their applications. I personally place myself in this group. In my case, I have used various versions of Microsoft's SQL Server (2000, 2005, and 2008 R2) and only recently learned how valuable the system databases really are when I was preparing to deliver a lecture on "SQL Server 2008 R2, System Databases".

    So what are system databases in MS SQL Server, and why should I know them? Microsoft uses system databases to support the SQL Server DBMS, much like a developer uses config files or database tables to support an application. Each of these system databases provides specific functionality that allows MS SQL Server to function:

        Name          Database File              Log File
        Master        master.mdf                 mastlog.ldf
        Resource      mssqlsystemresource.mdf    mssqlsystemresource.ldf
        Model         model.mdf                  modellog.ldf
        MSDB          msdbdata.mdf               msdblog.ldf
        Distribution  distmdl.mdf                distmdl.ldf
        TempDB        tempdb.mdf                 templog.ldf

    Master Database
    If you have used MS SQL Server then you should recognize the Master database, especially if you have used SQL Server Management Studio (SSMS) to connect to a user-created database. MS SQL Server requires the Master database in order for the DBMS to start, because of the information it stores: user logins, linked servers, configuration information, and information on user databases.

    Resource Database
    Honestly, until recently I never knew this database existed, because the Resource database is hidden from users; in fact, its database files are stored within the Binn folder instead of the standard MS SQL Server database folder path. This database contains all the system objects that can be accessed by all other databases. In short, it holds the system views and stored procedures that appear in every user database for exposing system information. One of the many benefits of storing the system views and stored procedures in a single hidden database is that it makes upgrading a SQL Server instance easier; maintenance is also decreased, since only one code base has to be maintained for all of the system views and procedures.

    Model Database
    The Model database, as the name implies, is the model for all new databases created by users. This allows default database objects to be predefined for all new databases within an MS SQL Server instance. For example, if every database created by a user needs an "Audit" table, then defining the "Audit" table in the model guarantees that the table will exist in every new database created after the model is altered.

    MSDB Database
    The MSDB database is used by SQL Server Agent, SQL Server Database Mail, and SQL Server Service Broker, along with SQL Server itself. The SQL Server Agent uses this database to store job configurations and SQL job schedules, along with SQL alerts and operators. In addition, this database stores all SQL job parameters along with each job's execution history. Finally, it also stores database backup and maintenance plans, as well as details pertaining to SQL log shipping, if it is being used.

    Distribution Database
    The Distribution database is only used during replication; it stores metadata and history information pertaining to the act of replicating data. Furthermore, when transactional replication is used, this database also stores information regarding each transaction. It is important to note that replication is not turned on by default in MS SQL Server and that the Distribution database is hidden from SSMS.

    Tempdb Database
    The Tempdb, as the name implies, is used to store temporary data and data objects, such as temp tables and temp stored procedures. It is important to note that all data and data objects in this database are cleared when SQL Server restarts. SQL Server also uses this database for some of its internal operations, typically large sort and index operations. Finally, this database stores row versions if row versioning or snapshot isolation transactions are being used by SQL Server.

    Additionally, I would love to hear from others about their experiences using system databases, tables, and objects in real-world environments.
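    If you want to verify the file names and paths in the table above on your own instance, they can be read straight out of the catalog views. Here is a minimal sketch in C# using ADO.NET; the connection string is a placeholder, and the hidden Resource database will not appear, since it is not exposed through sys.master_files:

        using System;
        using System.Data.SqlClient;

        class ListSystemDatabases
        {
            static void Main()
            {
                // Placeholder connection string: point it at your own instance.
                using (var conn = new SqlConnection(@"Server=.\SQLEXPRESS;Integrated Security=true"))
                {
                    conn.Open();

                    // database_id 1 through 4 are master, tempdb, model, and msdb.
                    var cmd = new SqlCommand(
                        "SELECT DB_NAME(database_id), name, physical_name " +
                        "FROM sys.master_files WHERE database_id <= 4 " +
                        "ORDER BY database_id", conn);

                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                            Console.WriteLine("{0}: {1} -> {2}",
                                reader.GetString(0), reader.GetString(1), reader.GetString(2));
                    }
                }
            }
        }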

    Read the article

  • Parameter is not valid when getting image from stream

    - by duka1
    Hi guys, I have this code:

        MemoryStream ms = new MemoryStream(newbytes, 0, newbytes.Length);
        ms.Position = 0;
        ms.Write(newbytes, 0, newbytes.Length);
        Image img = Image.FromStream(ms);
        img.Save(@"C:\Users\gsira\Pictures\Blue hills5.jpg");

    I get this error at the Image.FromStream(ms) call:

        System.ArgumentException: Parameter is not valid.
        at System.Drawing.Image.FromStream(Stream stream, Boolean useEmbeddedColorManagement, Boolean validateIma

    How can I resolve this? A couple of links which solve this problem (one on an MSDN thread) are broken, so I am lost.
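    For anyone hitting the same error: "Parameter is not valid" from Image.FromStream usually means GDI+ could not parse the stream as a complete image, and GDI+ also needs the stream to stay open for the lifetime of the Image. A hedged sketch of the usual pattern, assuming newbytes holds a complete, valid image file (the extra Write call is dropped, since the MemoryStream constructor already wraps the byte array):

        using System.Drawing;
        using System.Drawing.Imaging;
        using System.IO;

        // A truncated or corrupted byte array is the most common cause of
        // "Parameter is not valid", so validate the source of newbytes first.
        using (var ms = new MemoryStream(newbytes))
        using (var img = Image.FromStream(ms))   // keep ms open while img is in use
        {
            img.Save(@"C:\Users\gsira\Pictures\Blue hills5.jpg", ImageFormat.Jpeg);
        }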

    Read the article

  • Why are cookies unrecognized when a link is clicked from an external source (e.g. Excel, Word, etc.)?

    - by Nick
    I noticed that when a link is clicked externally from the web browser, such as from Excel or Word, that my session cookie is initially unrecognized, even if the link opens up in a new tab of the same browser window. The browser ends up recognizing its cookie eventually, but I am puzzled as to why that initial link from Excel or Word doesn't work. To make it even more challenging, clicking a link works fine from Outlook. Does anybody know why this might be happening? I'm using the Zend Framework with PHP 5.3.

    Read the article

  • Converting Openfire IM datetime values in SQL Server to / from VARCHAR(15) and DATETIME data types

    - by Brian Biales
    A client is using Openfire IM for their users, and would like some custom queries to audit user conversations (which are stored by Openfire in tables in the SQL Server database). Because Openfire supports multiple database servers and multiple platforms, the designers chose to store all date/time stamps in the database as 15 character strings, which get converted to Java Date objects in their code (Openfire is written in Java). I did some digging around and, so I don't forget (and in case someone else finds this useful), I will put the simple algorithms here for converting back and forth between SQL DATETIME and the Java string representation.

    The Java string representation is the number of milliseconds since 1/1/1970. SQL Server's DATETIME is actually represented as a float, the value being the number of days since 1/1/1900, with the portion after the decimal point representing the hours/minutes/seconds/milliseconds as a fractional part of a day. Try this and you will see it is true:

        SELECT CAST(0 AS DATETIME)

    It returns the date 1/1/1900. The difference in days between SQL Server's 0 date of 1/1/1900 and the Java representation's 0 date of 1/1/1970 is found easily using the following SQL:

        SELECT DATEDIFF(D, '1900-01-01', '1970-01-01')

    which returns 25567. There are 25567 days between these dates.

    So to convert from the Java string to SQL Server's DATETIME, we need to convert the number of milliseconds to a floating point representation of the number of days since 1/1/1970, then add 25567 to change this to the number of days since 1/1/1900. To convert to days, you need to divide the number by 1000 ms/s, then by 60 seconds/minute, then by 60 minutes/hour, then by 24 hours/day; or simply divide by 1000*60*60*24, which is 86400000. So, to summarize, we need to cast the string as a float, divide by 86400000 milliseconds/day, add 25567 days, and cast the resulting value to a DATETIME. Here is an example:

        DECLARE @tmp as VARCHAR(15)
        SET @tmp = '1268231722123'
        SELECT @tmp as JavaTime, CAST((CAST(@tmp AS FLOAT) / 86400000) + 25567 AS DATETIME) as SQLTime

    Converting from SQL DATETIME back to the Java time format is not quite as simple, I found, because floats of that size do not convert nicely to strings; they end up in scientific notation when using the CONVERT or CAST functions. But I found a couple of ways around that problem. You can convert a date to the number of seconds since 1/1/1970 very easily using the DATEDIFF function, as this value fits in an Int. If you don't need to worry about the milliseconds, simply cast this integer as a string and concatenate '000' at the end, essentially multiplying the number by 1000 and making it milliseconds since 1/1/1970. If, however, you do care about the milliseconds, you will need to use DATEPART to get the milliseconds part of the date, cast this integer to a string, pad zeros on the left to make sure it is three digits, and concatenate these three digits to the number-of-seconds string above. And finally, I discovered that by casting to DECIMAL(15,0) and then to VARCHAR(15), I avoid the scientific notation issue. So here are all my examples, pick the one you like best...
    First, here is the simple approach if you don't care about the milliseconds:

        DECLARE @tmp as VARCHAR(15)
        DECLARE @dt as DATETIME
        SET @dt = '2010-03-10 14:35:22.123'
        SET @tmp = CAST(DATEDIFF(s, '1970-01-01 00:00:00', @dt) AS VARCHAR(15)) + '000'
        SELECT @tmp as JavaTime, @dt as SQLTime

    If you want to keep the milliseconds:

        DECLARE @tmp as VARCHAR(15)
        DECLARE @dt as DATETIME
        DECLARE @ms as int
        SET @dt = '2010-03-10 14:35:22.123'
        SET @ms = DATEPART(ms, @dt)
        SET @tmp = CAST(DATEDIFF(s, '1970-01-01 00:00:00', @dt) AS VARCHAR(15))
                 + RIGHT('000' + CAST(@ms AS VARCHAR(3)), 3)
        SELECT @tmp as JavaTime, @dt as SQLTime

    Or, in one fell swoop:

        DECLARE @dt as DATETIME
        SET @dt = '2010-03-10 14:35:22.123'
        SELECT @dt as SQLTime
             , CAST(DATEDIFF(s, '1970-01-01 00:00:00', @dt) AS VARCHAR(15))
             + RIGHT('000' + CAST(DATEPART(ms, @dt) AS VARCHAR(3)), 3) as JavaTime

    And finally, a way to simply reverse the math used when converting from Java date to SQL date. Note the parentheses; watch out for operator precedence, since you want to subtract first, then multiply:

        DECLARE @dt as DATETIME
        SET @dt = '2010-03-10 14:35:22.123'
        SELECT @dt as SQLTime
             , CAST(CAST((CAST(@dt as Float) - 25567.0) * 86400000.0 as DECIMAL(15,0)) as VARCHAR(15)) as JavaTime

    Interestingly, I found that converting to SQL DATETIME can lose some accuracy: when I converted the time above to Java time and then converted that back to DATETIME, the number of milliseconds was 120, not 123. As I am not interested in the milliseconds, this is OK for me. But you may want to look into using DATETIME2 in SQL Server 2008 for more accuracy.
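    If you would rather do the round trip in client code instead of T-SQL, the same arithmetic is simple in .NET. A hedged sketch in C#; time zones are deliberately ignored here, matching the SQL examples above, and the zero-padding to 15 characters is my assumption about the VARCHAR(15) column format:

        using System;

        class JavaTimeRoundTrip
        {
            static void Main()
            {
                var epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Unspecified);

                // Openfire string -> DateTime: the string is milliseconds since 1/1/1970.
                long javaMillis = long.Parse("1268231722123");
                DateTime dt = epoch.AddMilliseconds(javaMillis);

                // DateTime -> Openfire string, left-padded to fill the 15-char column.
                string back = ((long)(dt - epoch).TotalMilliseconds)
                    .ToString().PadLeft(15, '0');

                Console.WriteLine("{0:yyyy-MM-dd HH:mm:ss.fff} -> {1}", dt, back);
            }
        }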

    Read the article

  • SharePoint 2010 Sandboxed solution SPGridView

    - by Steve Clements
    If you didn't know, you probably will soon: the SPGridView is not available in Sandboxed solutions. To be honest there doesn't seem to be a great deal of information out there about the whys and what nots; basically it is not part of the Sandbox SharePoint API. Of course the error message from SharePoint is about as useful as a punch in the face…

        An unexpected error has been encountered in this Web Part.  Error: A Web Part or Web Form Control on this Page cannot be displayed or imported. You don't have Add and Customize Pages permissions required to perform this action

    …that's if you have debug=true set; if not, the classic "This webpart cannot be added"!! Love that one! But with a little digging you should find something like this…

        [TypeLoadException: Could not load type Microsoft.SharePoint.WebControls.SPGridView from assembly 'Microsoft.SharePoint, Version=14.900.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c'.]

    Depending on what you want to do with the SPGridView, this may not help at all. But I'm looking to inherit the theme of the site and style it accordingly. After spending a bit of time with Chrome's FireBug I was able to get the required CSS classes. I created my own class inheriting from GridView (note the lack of a preceding SP!) and simply set the styles in there.

    Inherit from the standard GridView:

        public class PSGridView : GridView

    Set the styles in the constructor:

        public PSGridView()
        {
            this.CellPadding = 2;
            this.CellSpacing = 0;
            this.GridLines = GridLines.None;
            this.CssClass = "ms-listviewtable";
            this.Attributes.Add("style", "border-bottom-style: none; border-right-style: none; width: 100%; border-collapse: collapse; border-top-style: none; border-left-style: none;");
            this.HeaderStyle.CssClass = "ms-viewheadertr";
            this.RowStyle.CssClass = "ms-itmhover";
            this.SelectedRowStyle.CssClass = "s4-itm-selected";
            this.RowStyle.Height = new Unit(25);
        }

    Then, as you can't override the Columns property setter, a custom method is needed to add each column and set its style (a sketch follows below). And that should be enough to get a nicely styled GridView without the need for the SPGridView, but seriously… get the SPGridView in the Sandbox!!!

    Technorati Tags: Sharepoint 2010,SPGridView,Sandbox Solutions,Sandbox
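    A hedged sketch of what that column-adding helper might look like inside the PSGridView class; the method name, the BoundField usage, and the ms-vh2/ms-vb2 cell classes are my assumptions, not the original author's code:

        // Inside the PSGridView class (BoundField lives in System.Web.UI.WebControls).
        // Adds a data-bound column and styles it with standard SharePoint list
        // cell classes; the class names here are assumptions.
        public void AddBoundColumn(string headerText, string dataField)
        {
            var field = new BoundField
            {
                HeaderText = headerText,
                DataField = dataField
            };
            field.HeaderStyle.CssClass = "ms-vh2";
            field.ItemStyle.CssClass = "ms-vb2";
            this.Columns.Add(field);
        }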

    Read the article

  • Wireless connection works but the internet is too slow to use in Ubuntu 11.04

    - by Garrin
    The internet is so slow as to be unusable. And I'm not being picky: even after minutes I can't get my Google home page to load. I tried installing a package through apt-get and was getting rates between 0 and a few hundred bytes/s. That's bytes, not kilobytes! Mostly 0, however (no exaggeration, it spends large amounts of time stalled). And I would go to a speed test web site of some kind, but I can't, since nothing will load.

    Briefly put, the laptop I am using was connected to two wireless networks while using Ubuntu 11.04 without any issues before this. It was also connected to a wired network without any issues. It dual boots Windows 7, which has never had any issues, not even with the current wireless network. Just to be clear: on the current wi-fi network, Windows 7 encounters no issues (speedtest.net puts the network speed at 1mb/s), but my network connection in Ubuntu 11.04 is so slow as to literally be unusable.

    I am unfamiliar with the router except for the fact that it boasts a Rogers logo (that's a large ISP/cable provider in Canada, for those not familiar with the land of igloos and polar bears). I am far from the router, and some desktop widget I use tells me the signal strength is at 58% (it seems fairly reliable and this would appear to match up with the filled bars in the network icon). I should also mention I'm just renting a room in this house, so I'm not the network administrator, and while I can access the 192.168.0.1 router page, the password wasn't set to 'password', so it's not much use to me.

    Here are a bunch of commands I ran which don't tell me a whole lot, but I thought might be more instructive to the wise around here.

    lspci (just showing my network card):

        05:00.0 Network controller: Atheros Communications Inc. AR928X Wireless Network Adapter (PCI-Express) (rev 01)

    This one is self explanatory:

        PING www.googele.com (216.65.41.185) 56(84) bytes of data.
        64 bytes from nnw.net (216.65.41.185): icmp_req=1 ttl=51 time=267 ms
        64 bytes from nnw.net (216.65.41.185): icmp_req=2 ttl=51 time=190 ms
        64 bytes from nnw.net (216.65.41.185): icmp_req=3 ttl=51 time=212 ms
        64 bytes from nnw.net (216.65.41.185): icmp_req=4 ttl=51 time=207 ms
        64 bytes from nnw.net (216.65.41.185): icmp_req=5 ttl=51 time=220 ms

        --- www.googele.com ping statistics ---
        5 packets transmitted, 5 received, 0% packet loss, time 4003ms
        rtt min/avg/max/mdev = 190.079/219.699/267.963/26.121 ms

    ifconfig:

        eth0      Link encap:Ethernet  HWaddr 20:6a:8a:02:20:da
                  UP BROADCAST MULTICAST  MTU:1500  Metric:1
                  RX packets:0 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:0 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)
                  Interrupt:42

        lo        Link encap:Local Loopback
                  inet addr:127.0.0.1  Mask:255.0.0.0
                  inet6 addr: ::1/128 Scope:Host
                  UP LOOPBACK RUNNING  MTU:16436  Metric:1
                  RX packets:16 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:16 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:0
                  RX bytes:960 (960.0 B)  TX bytes:960 (960.0 B)

        wlan0     Link encap:Ethernet  HWaddr 20:7c:8f:05:c6:bf
                  inet addr:192.168.0.16  Bcast:192.168.0.255  Mask:255.255.255.0
                  inet6 addr: fe80::227c:8fff:fe05:c6bf/64 Scope:Link
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  RX packets:982 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:658 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:497250 (497.2 KB)  TX bytes:95076 (95.0 KB)

    Thank you

    Read the article

  • Understanding WordprocessingML tags and avoiding unnecessary tags

    - by rithanyalaxmi
    Hi, I am using the MS Word API to generate .docx files which contain data fetched from the DB, to which I apply the respective styles, fonts, symbols, etc. If the data fetched from the DB is quite huge, then there is a problem in displaying that data in the .docx file. I found that internally MS Word 2007 writes some content through tags which may not be needed to display the data. Hence I am figuring out which MS Word tags are actually necessary when converting into a .xml file, so that I can avoid unnecessary tags and build only the tags which are needed to display the data. Hence I am planning to write my own .xml with the MS Word tags that are needed, rather than generating a .xml from a .docx file.

    My queries are:

    1) Is it right that MS Word generates some tags which may not be needed during the conversion of .docx to document.xml, and that this makes it heavy? If so, what are those tags, so that I can avoid them when writing my own .xml file?

    2) Please send links to help understand the MS Word tags and their advantages: which tags are needed and which are not?

    3) Is my approach of writing a new .xml similar to document.xml (the .docx conversion) a worthy one to go forward with, so that I can build the .xml with only the tags I need and improve the performance of the data display?

    Please shed some light on this; thanks in advance.

    Thanks, Rithu
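    For reference, the skeleton that WordprocessingML actually requires is small: a w:document root, a w:body, and for each paragraph a w:p element holding a w:r (run) which holds a w:t (text). A hedged C# sketch using the Open XML SDK (DocumentFormat.OpenXml) that emits exactly that skeleton and nothing else; the file name and text are placeholders:

        using DocumentFormat.OpenXml;
        using DocumentFormat.OpenXml.Packaging;
        using DocumentFormat.OpenXml.Wordprocessing;

        class MinimalDocx
        {
            static void Main()
            {
                // Creates a .docx whose document.xml contains only the required
                // w:document/w:body/w:p/w:r/w:t skeleton, with no formatting tags.
                using (var doc = WordprocessingDocument.Create(
                    "minimal.docx", WordprocessingDocumentType.Document))
                {
                    var mainPart = doc.AddMainDocumentPart();
                    mainPart.Document = new Document(
                        new Body(
                            new Paragraph(
                                new Run(
                                    new Text("Data fetched from the DB")))));
                    mainPart.Document.Save();
                }
            }
        }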

    Read the article

< Previous Page | 186 187 188 189 190 191 192 193 194 195 196 197  | Next Page >