Search Results

Search found 6516 results on 261 pages for 'sanal ms'.

Page 89 of 261

  • How to synchronize two (or n) replication processes for MS SQL databases?

    - by Yauheni Sivukha
    There are two master databases and two read-only copies updated by standard transactional replication. I need to map an entity that spans both read-only databases; let's say database A contains orders and database B contains order lines. The problem is that replication to one database can lag behind replication to the other, so at the moment of mapping the read-only databases may hold inconsistent data. For example: we store two orders with their lines at 19:00 and 19:03, and the mapping process starts at 19:05, but by that moment replication to database A has processed all changes up to 19:03 while replication to database B has only processed changes up to 19:00. After mapping we end up with an order as of 19:03 joined to lines as of 19:00. Trouble is guaranteed :) In my particular case both databases use a temporal model, so it is possible to fetch data for any time slice; the problem is identifying the time of the latest replicated change. Question: how can I synchronize the replication processes for several databases to avoid the situation described above?

    Read the article

  • Is there a way to customise messages produced by statements in MS SQL Query Analyzer?

    - by Scott Leis
    If I run a simple query in SQL Query Analyzer, like: SELECT * FROM TableName the Messages pane always produces a message like: (30 row(s) affected) If I run a stored procedure with many statements, the messages are useless because there's no indication of what each one relates to. So firstly: Is there a way to customise the default messages on a per-query basis? E.g. I'd like a specific query to produce a message like: TableName query produced [numRowsAffected] results. replacing [numRowsAffected] with the number that would have appeared in the default message. Secondly, is there a way to suppress the default messages on a per-query basis? E.g. I have a local variable of type TABLE, used in several statements. I don't want any message to appear for statements where I'm just deleting data from that variable before re-using it. I'm seeking solutions that work in SQL Server 8.0.

    Read the article

  • Using parameterized function calls in SELECT statements. MS SQL Server

    - by geekzlla
    I have taken over some code from a previous developer and have come across this SQL statement that calls several SQL functions. As you can see, the function calls in the SELECT statement pass a parameter to the function. How does the SQL statement know what value to substitute for the variable? For the sample below, how does the query engine know what to replace nDeptID with when it calls fn_SelDeptName_DeptID(nDeptID)? nDeptID IS a column in table Note.

    SELECT statement:

        SELECT nCustomerID AS [Customer ID],
               nJobID AS [Job ID],
               dbo.fn_SelDeptName_DeptID(nDeptID) AS Department,
               nJobTaskID AS JobTaskID,
               dbo.fn_SelDeptTaskDesc_OpenTask(nJobID, nJobTaskID) AS Task,
               nStandardNoteID AS StandardNoteID,
               dbo.fn_SelNoteTypeDesc(nNoteID) AS [Note Type],
               dbo.fn_SelGPAStandardNote(nStandardNoteID) AS [Standard Note],
               nEntryDate AS [Entry Date],
               nUserName AS [Added By],
               nType AS Type,
               nNote AS Note
        FROM Note
        WHERE nJobID = 844261
        ORDER BY nJobID, Task, [Entry Date]

    Function fn_SelDeptName_DeptID:

        ALTER FUNCTION [dbo].[fn_SelDeptName_DeptID] (@iDeptID int)
        RETURNS varchar(25)
        -- Used by DataCollection for Job Tracking
        -- if the Department isn't found, return an empty string
        BEGIN
            -- Return the Department name for the given DeptID.
            DECLARE @strDeptName varchar(25)
            IF @iDeptID = 0
                SET @strDeptName = ''
            ELSE
            BEGIN
                SET @strDeptName = (SELECT dName FROM Department WHERE dDeptID = @iDeptID)
                IF (@strDeptName IS NULL) SET @strDeptName = ''
            END
            RETURN @strDeptName
        END

    Thanks in advance.

    Read the article

  • I have data about deadlocks, but I can't understand why they occur (MS SQL/ASP.NET MVC)

    - by Alex
    I am getting a lot of deadlocks in my large web application. In http://stackoverflow.com/questions/2941233/how-to-automatically-re-run-deadlocked-transaction-asp-net-mvc-sql-server I asked how to re-run deadlocked transactions, but I was told it is much better to get rid of the deadlocks than to try to catch them. So I spent the whole day with SQL Profiler, setting up the deadlock tracing options and so on, and this is what I found. There is a Users table. A very heavily used page runs the following query (it's not the only query, but it's the one causing trouble): UPDATE Users SET views = views + 1 WHERE ID IN (SELECT AuthorID FROM Articles WHERE ArticleID = @ArticleID) And then every page runs the following query, which loads the User identified by the cookies: User = DB.Users.SingleOrDefault(u => u.Password == password && u.Name == username); Very often a deadlock occurs and this second LINQ to SQL query is chosen as the victim, so it is not run and users of my site see an error screen. I have read a lot about deadlocks, and I still don't understand why these two statements deadlock. Both queries obviously run very often, at least once a second, probably more (300-400 users online), so they can easily run at the same time - but why does that cause a deadlock? Please help. Thank you.

    Read the article

  • What is the best way to sync multiple client SqlServers to one MS SqlServer 2005?

    - by user605055
    I have several client databases that are used by my Windows application, and I want to push their data to an online web site. The client database and server database structures are different, because we need to add a client ID column to some tables in the server database. Currently I sync the databases with a separate application that uses C# bulk copy inside a transaction, but the server's SQL Server is too busy and the tasks cannot run in parallel. I am working on another solution: AFTER UPDATE/DELETE/INSERT triggers save the changes into one table, and a SQL query builds the payload that is sent to a web service to sync the data. But I would have to send all of the data first, and it is a huge data set (bigger than 16 MB). I don't think I can use replication because the structures and primary keys are different.

    Read the article

  • Does the API call ExitProcess() work OK in VB6 if you follow the MS caveats?

    - by Clay Nichols
    Microsoft indicates that VB6 doesn't support ExitProcess (to exit and return a value). However, it also indicates that this call can fail under certain circumstances (if a thread hasn't completed, etc.), so I'm wondering whether the call will work OK (consistently :-) as long as you obey the caveats in the article. I could go a step further and call ExitProcess() from the Sub Main or the Form which started the app. Update: after some more reading (I really did research this a bit before asking) I found a suggestion to use the TerminateProcess API instead. I'm investigating that option. Any thoughts?

    Read the article

  • How to manually patch Blogger template to use Disqus

    - by user317944
    I'm trying to add disqus to my blog and I tried following this guide to do so: http://disqus.com/docs/patch-blogger/ However their instructions are completely off with what I have on my custom template. Here is the template: <b:skin><![CDATA[/*----------------------------------------------- Blogger Template Style Name: Picture Window Designer: Josh Peterson URL: www.noaesthetic.com ----------------------------------------------- */ /* Variable definitions ==================== */ /* Content ----------------------------------------------- */ body { font: $(body.font); color: $(body.text.color); } html body .region-inner { min-width: 0; max-width: 100%; width: auto; } .content-outer { font-size: 90%; } a:link { text-decoration:none; color: $(link.color); } a:visited { text-decoration:none; color: $(link.visited.color); } a:hover { text-decoration:underline; color: $(link.hover.color); } .body-fauxcolumn-outer { background: $(body.background); } .content-outer { background: $(content.background); -moz-border-radius: $(content.border.radius); -webkit-border-radius: $(content.border.radius); -goog-ms-border-radius: $(content.border.radius); border-radius: $(content.border.radius); -moz-box-shadow: 0 0 $(content.shadow.spread) rgba(0, 0, 0, .15); -webkit-box-shadow: 0 0 $(content.shadow.spread) rgba(0, 0, 0, .15); -goog-ms-box-shadow: 0 0 $(content.shadow.spread) rgba(0, 0, 0, .15); box-shadow: 0 0 $(content.shadow.spread) rgba(0, 0, 0, .15); margin: $(content.margin) auto; } .content-inner { padding: $(content.padding); } /* Header ----------------------------------------------- */ .header-outer { background: $(header.background.color) $(header.background.gradient) repeat-x scroll top left; _background-image: none; color: $(header.text.color); -moz-border-radius: $(header.border.radius); -webkit-border-radius: $(header.border.radius); -goog-ms-border-radius: $(header.border.radius); border-radius: $(header.border.radius); } .Header img, .Header #header-inner { -moz-border-radius: $(header.border.radius); -webkit-border-radius: $(header.border.radius); -goog-ms-border-radius: $(header.border.radius); border-radius: $(header.border.radius); } .header-inner .Header .titlewrapper, .header-inner .Header .descriptionwrapper { padding-left: $(header.padding); padding-right: $(header.padding); } .Header h1 { font: $(header.font); text-shadow: 1px 1px 3px rgba(0, 0, 0, 0.3); } .Header h1 a { color: $(header.text.color); } .Header .description { font-size: 130%; } /* Tabs ----------------------------------------------- */ .tabs-inner { margin: .5em $(tabs.margin.sides) $(tabs.margin.bottom); padding: 0; } .tabs-inner .section { margin: 0; } .tabs-inner .widget ul { padding: 0; background: $(tabs.background.color) $(tabs.background.gradient) repeat scroll bottom; -moz-border-radius: $(tabs.border.radius); -webkit-border-radius: $(tabs.border.radius); -goog-ms-border-radius: $(tabs.border.radius); border-radius: $(tabs.border.radius); } .tabs-inner .widget li { border: none; } .tabs-inner .widget li a { display: block; padding: .5em 1em; margin-$endSide: $(tabs.spacing); color: $(tabs.text.color); font: $(tabs.font); -moz-border-radius: $(tab.border.radius) $(tab.border.radius) 0 0; -webkit-border-top-left-radius: $(tab.border.radius); -webkit-border-top-right-radius: $(tab.border.radius); -goog-ms-border-radius: $(tab.border.radius) $(tab.border.radius) 0 0; border-radius: $(tab.border.radius) $(tab.border.radius) 0 0; background: $(tab.background); border-$endSide: 1px solid $(tabs.separator.color); } 
.tabs-inner .widget li:first-child a { padding-$startSide: 1.25em; -moz-border-radius-top$startSide: $(tab.first.border.radius); -moz-border-radius-bottom$startSide: $(tabs.border.radius); -webkit-border-top-$startSide-radius: $(tab.first.border.radius); -webkit-border-bottom-$startSide-radius: $(tabs.border.radius); -goog-ms-border-top-$startSide-radius: $(tab.first.border.radius); -goog-ms-border-bottom-$startSide-radius: $(tabs.border.radius); border-top-$startSide-radius: $(tab.first.border.radius); border-bottom-$startSide-radius: $(tabs.border.radius); } .tabs-inner .widget li.selected a, .tabs-inner .widget li a:hover { position: relative; z-index: 1; background: $(tabs.selected.background.color) $(tab.selected.background.gradient) repeat scroll bottom; color: $(tabs.selected.text.color); -moz-box-shadow: 0 0 $(region.shadow.spread) rgba(0, 0, 0, .15); -webkit-box-shadow: 0 0 $(region.shadow.spread) rgba(0, 0, 0, .15); -goog-ms-box-shadow: 0 0 $(region.shadow.spread) rgba(0, 0, 0, .15); box-shadow: 0 0 $(region.shadow.spread) rgba(0, 0, 0, .15); } /* Headings ----------------------------------------------- */ h2 { font: $(widget.title.font); text-transform: $(widget.title.text.transform); color: $(widget.title.text.color); margin: .5em 0; } /* Main ----------------------------------------------- */ .main-outer { background: $(main.background); -moz-border-radius: $(main.border.radius.top) $(main.border.radius.top) 0 0; -webkit-border-top-left-radius: $(main.border.radius.top); -webkit-border-top-right-radius: $(main.border.radius.top); -webkit-border-bottom-left-radius: 0; -webkit-border-bottom-right-radius: 0; -goog-ms-border-radius: $(main.border.radius.top) $(main.border.radius.top) 0 0; border-radius: $(main.border.radius.top) $(main.border.radius.top) 0 0; -moz-box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); -webkit-box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); -goog-ms-box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); } .main-inner { padding: 15px $(main.padding.sides) 20px; } .main-inner .column-center-inner { padding: 0 0; } .main-inner .column-left-inner { padding-left: 0; } .main-inner .column-right-inner { padding-right: 0; } /* Posts ----------------------------------------------- */ h3.post-title { margin: 0; font: $(post.title.font); } .comments h4 { margin: 1em 0 0; font: $(post.title.font); } .post-outer { background-color: $(post.background.color); border: solid 1px $(post.border.color); -moz-border-radius: $(post.border.radius); -webkit-border-radius: $(post.border.radius); border-radius: $(post.border.radius); -goog-ms-border-radius: $(post.border.radius); padding: 15px 20px; margin: 0 $(post.margin.sides) 20px; } .post-body { line-height: 1.4; font-size: 110%; } .post-header { margin: 0 0 1.5em; color: $(post.footer.text.color); line-height: 1.6; } .post-footer { margin: .5em 0 0; color: $(post.footer.text.color); line-height: 1.6; } blog-pager { font-size: 140% } comments .comment-author { padding-top: 1.5em; border-top: dashed 1px #ccc; border-top: dashed 1px rgba(128, 128, 128, .5); background-position: 0 1.5em; } comments .comment-author:first-child { padding-top: 0; border-top: none; } .avatar-image-container { margin: .2em 0 0; } /* Widgets ----------------------------------------------- */ .widget ul, .widget #ArchiveList ul.flat { padding: 0; list-style: none; } .widget ul 
li, .widget #ArchiveList ul.flat li { border-top: dashed 1px #ccc; border-top: dashed 1px rgba(128, 128, 128, .5); } .widget ul li:first-child, .widget #ArchiveList ul.flat li:first-child { border-top: none; } .widget .post-body ul { list-style: disc; } .widget .post-body ul li { border: none; } /* Footer ----------------------------------------------- */ .footer-outer { color:$(footer.text.color); background: $(footer.background); -moz-border-radius: $(footer.border.radius.top) $(footer.border.radius.top) $(footer.border.radius.bottom) $(footer.border.radius.bottom); -webkit-border-top-left-radius: $(footer.border.radius.top); -webkit-border-top-right-radius: $(footer.border.radius.top); -webkit-border-bottom-left-radius: $(footer.border.radius.bottom); -webkit-border-bottom-right-radius: $(footer.border.radius.bottom); -goog-ms-border-radius: $(footer.border.radius.top) $(footer.border.radius.top) $(footer.border.radius.bottom) $(footer.border.radius.bottom); border-radius: $(footer.border.radius.top) $(footer.border.radius.top) $(footer.border.radius.bottom) $(footer.border.radius.bottom); -moz-box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); -webkit-box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); -goog-ms-box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); box-shadow: 0 $(region.shadow.offset) $(region.shadow.spread) rgba(0, 0, 0, .15); } .footer-inner { padding: 10px $(main.padding.sides) 20px; } .footer-outer a { color: $(footer.link.color); } .footer-outer a:visited { color: $(footer.link.visited.color); } .footer-outer a:hover { color: $(footer.link.hover.color); } .footer-outer .widget h2 { color: $(footer.widget.title.text.color); } ]] <b:template-skin> <b:variable default='930px' name='content.width' type='length' value='930px'/> <b:variable default='0' name='main.column.left.width' type='length' value='180px'/> <b:variable default='360px' name='main.column.right.width' type='length' value='180px'/> <![CDATA[ body { min-width: $(content.width); } .content-outer, .region-inner { min-width: $(content.width); max-width: $(content.width); _width: $(content.width); } .main-inner .columns { padding-left: $(main.column.left.width); padding-right: $(main.column.right.width); } .main-inner .fauxcolumn-center-outer { left: $(main.column.left.width); right: $(main.column.right.width); /* IE6 does not respect left and right together */ _width: expression(this.parentNode.offsetWidth - parseInt("$(main.column.left.width)") - parseInt("$(main.column.right.width)") + 'px'); } .main-inner .fauxcolumn-left-outer { width: $(main.column.left.width); } .main-inner .fauxcolumn-right-outer { width: $(main.column.right.width); } .main-inner .column-left-outer { width: $(main.column.left.width); right: $(main.column.left.width); margin-right: -$(main.column.left.width); } .main-inner .column-right-outer { width: $(main.column.right.width); margin-right: -$(main.column.right.width); } #layout { min-width: 0; } #layout .content-outer { min-width: 0; width: 800px; } #layout .region-inner { min-width: 0; width: auto; } ]]> </b:template-skin> <div class='main-cap-bottom cap-bottom'> <div class='cap-left'/> <div class='cap-right'/> </div> </div> <footer> <div class='footer-outer'> <div class='footer-cap-top cap-top'> <div class='cap-left'/> <div class='cap-right'/> </div> <div class='fauxborder-left footer-fauxborder-left'> <div class='fauxborder-right footer-fauxborder-right'/> <div class='region-inner 
footer-inner'> <macro:include id='footer-sections' name='sections'> <macro:param default='2' name='num' value='3'/> <macro:param default='footer' name='idPrefix'/> <macro:param default='foot' name='class'/> <macro:param default='false' name='includeBottom'/> </macro:include> <!-- outside of the include in order to lock Attribution widget --> <b:section class='foot' id='footer-3' showaddelement='no'> document.body.className = document.body.className.replace('loading', ''); <macro:if cond='data:col.num &gt;= 2'> <table border='0' cellpadding='0' cellspacing='0' mexpr:class='&quot;section-columns columns-&quot; + data:col.num'> <tbody> <tr> <td class='first columns-cell'> <b:section mexpr:class='data:col.class' mexpr:id='data:col.idPrefix + &quot;-2-1&quot;'/> </td> <td class='columns-cell'> <b:section mexpr:class='data:col.class' mexpr:id='data:col.idPrefix + &quot;-2-2&quot;'/> </td> <macro:if cond='data:col.num &gt;= 3'> <td class='columns-cell'> <b:section mexpr:class='data:col.class' mexpr:id='data:col.idPrefix + &quot;-2-3&quot;'/> </td> </macro:if> <macro:if cond='data:col.num &gt;= 4'> <td class='columns-cell'> <b:section mexpr:class='data:col.class' mexpr:id='data:col.idPrefix + &quot;-2-4&quot;'/> </td> </macro:if> </tr> </tbody> </table> <macro:if cond='data:col.includeBottom'> <b:section mexpr:class='data:col.class' mexpr:id='data:col.idPrefix + &quot;-3&quot;' showaddelement='no'/> </macro:if> </macro

    Read the article

  • C#: LINQ vs foreach - Round 1.

    - by James Michael Hare
    So I was reading Peter Kellner's blog entry on Resharper 5.0 and its LINQ refactoring and thought that was very cool. But that raised a point I had always been curious about in my head -- which is the better choice: manual foreach loops or LINQ?

    The answer is not really clear-cut. There are two sides to any code cost argument: performance and maintainability. The first of these is obvious and quantifiable. Given any two pieces of code that perform the same function, you can run them side-by-side and see which piece of code performs better. Unfortunately, this is not always a good measure. Well written assembly language outperforms well written C++ code, but you lose a lot in maintainability, which creates a big technical debt load that is hard to offset as the application ages. In contrast, higher level constructs make the code more brief and easier to understand, hence reducing technical cost.

    Now, obviously in this case we're not talking about two separate languages; we're comparing doing something manually in the language versus using a higher-order set of IEnumerable extensions that are in the System.Linq library.

    Well, before we discuss any further, let's look at some sample code and the numbers. First, let's take a look at the foreach loop and the LINQ expression. This is just a simple find comparison:

        // find implemented via LINQ
        public static bool FindViaLinq(IEnumerable<int> list, int target)
        {
            return list.Any(item => item == target);
        }

        // find implemented via standard iteration
        public static bool FindViaIteration(IEnumerable<int> list, int target)
        {
            foreach (var i in list)
            {
                if (i == target)
                {
                    return true;
                }
            }

            return false;
        }

    Okay, looking at this from a maintainability point of view, the LINQ expression is definitely more concise (8 lines down to 1) and is very readable in intention. You don't have to actually analyze the behavior of the loop to determine what it's doing.

    So let's take a look at performance metrics from 100,000 iterations of these methods on a List<int> of varying sizes filled with random data. For this test, we fill a target array with 100,000 random integers and then run the exact same pseudo-random targets through both searches.

        List<T> On 100,000 Iterations
        Method      Size     Total (ms)  Per Iteration (ms)  % Slower
        Any         10       26          0.00046             30.00%
        Iteration   10       20          0.00023             -
        Any         100      116         0.00201             18.37%
        Iteration   100      98          0.00118             -
        Any         1000     1058        0.01853             16.78%
        Iteration   1000     906         0.01155             -
        Any         10,000   10,383      0.18189             17.41%
        Iteration   10,000   8843        0.11362             -
        Any         100,000  104,004     1.8297              18.27%
        Iteration   100,000  87,941      1.13163             -

    The LINQ expression is running about 17% slower for average size collections and worse for smaller collections. Presumably, this is due to the overhead of the state machine used to track the iterators for the yield returns in the LINQ expressions, which seems about right in a tight loop such as this.

    So what about other LINQ expressions? After all, Any() is one of the more trivial ones. I decided to try the TakeWhile() algorithm using a Count() to get the position stopped, like the sample Pete was using in his blog that Resharper refactored for him into LINQ:

        // LINQ form
        public static int GetTargetPosition1(IEnumerable<int> list, int target)
        {
            return list.TakeWhile(item => item != target).Count();
        }

        // traditionally iterative form
        public static int GetTargetPosition2(IEnumerable<int> list, int target)
        {
            int count = 0;

            foreach (var i in list)
            {
                if (i == target)
                {
                    break;
                }

                ++count;
            }

            return count;
        }

    Once again, the LINQ expression is much shorter, easier to read, and should be easier to maintain over time, reducing the cost of technical debt. So I ran these through the same test data:

        List<T> On 100,000 Iterations
        Method      Size     Total (ms)  Per Iteration (ms)  % Slower
        TakeWhile   10       41          0.00041             128%
        Iteration   10       18          0.00018             -
        TakeWhile   100      171         0.00171             88%
        Iteration   100      91          0.00091             -
        TakeWhile   1000     1604        0.01604             94%
        Iteration   1000     825         0.00825             -
        TakeWhile   10,000   15765       0.15765             92%
        Iteration   10,000   8204        0.08204             -
        TakeWhile   100,000  156950      1.5695              92%
        Iteration   100,000  81635       0.81635             -

    Wow! I expected some overhead due to the state machines iterators produce, but 90% slower? That seems a little heavy to me. So then I thought, well, what if TakeWhile() is not the right tool for the job? The problem is TakeWhile() returns each item for processing using yield return, whereas our for-loop really doesn't care about the item beyond using it as a stop condition to evaluate. So what if that back and forth with the iterator state machine is the problem? Well, we can quickly create an (albeit ugly) lambda that uses Any() along with a count in a closure (if a LINQ guru knows a better way PLEASE let me know!). After all, this is more consistent with what we're trying to do: we're trying to find the first occurrence of an item and halt once we find it, we just happen to be counting on the way. This mostly matches Any().

        // a new method that uses LINQ but evaluates the count in a closure.
        public static int TakeWhileViaLinq2(IEnumerable<int> list, int target)
        {
            int count = 0;
            list.Any(item =>
                {
                    if (item == target)
                    {
                        return true;
                    }

                    ++count;
                    return false;
                });
            return count;
        }

    Now how does this one compare?

        List<T> On 100,000 Iterations
        Method         Size     Total (ms)  Per Iteration (ms)  % Slower
        TakeWhile      10       41          0.00041             128%
        Any w/Closure  10       23          0.00023             28%
        Iteration      10       18          0.00018             -
        TakeWhile      100      171         0.00171             88%
        Any w/Closure  100      116         0.00116             27%
        Iteration      100      91          0.00091             -
        TakeWhile      1000     1604        0.01604             94%
        Any w/Closure  1000     1101        0.01101             33%
        Iteration      1000     825         0.00825             -
        TakeWhile      10,000   15765       0.15765             92%
        Any w/Closure  10,000   10802       0.10802             32%
        Iteration      10,000   8204        0.08204             -
        TakeWhile      100,000  156950      1.5695              92%
        Any w/Closure  100,000  108378      1.08378             33%
        Iteration      100,000  81635       0.81635             -

    Much better! It seems that the overhead of TakeWhile() returning each item and updating the state in the state machine is drastically reduced by using Any(), since Any() simply iterates forward until it finds the value we're looking for -- which is all we need for the task we're attempting to do.

    So the lesson there is: make sure when you use a LINQ expression you're choosing the best expression for the job, because if you're doing more work than you really need, you'll have a slower algorithm. But this is true of any choice of algorithm or collection in general.

    Even with the Any() with the count in the closure it is still about 30% slower, but let's consider that angle carefully. For a list of 100,000 items, it was roughly the difference between 1.08 ms and 0.82 ms in a List<T>. That's really not that bad at all in the grand scheme of things. Even running at 90% slower with TakeWhile(), for the vast majority of my projects, an extra millisecond to save potential errors in the long term and improve maintainability is a small price to pay. And if your typical list is 1000 items or less we're talking only microseconds worth of difference.

    It's like they say: 90% of your performance bottlenecks are in 2% of your code, so over-optimizing almost never pays off. So personally, I'll take the LINQ expression wherever I can, because it will be easier to read and maintain (thus reducing technical debt) and I can rely on Microsoft's developers to have coded and unit tested those algorithms fully for me instead of relying on a developer to code the loop logic correctly.

    If something's 90% slower, yes, it's worth keeping in mind, but it's really not until you start getting orders of magnitude slower (10x, 100x, 1000x) that alarm bells should really go off. And if I ever do need that last millisecond of performance? Well, then I'll optimize JUST THAT problem spot. To me it's worth it for the readability, speed-to-market, and maintainability.
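    The timing harness itself isn't shown above, so here is a minimal sketch of how such numbers might be reproduced. It is not the author's actual test code: it reuses the FindViaLinq/FindViaIteration methods from the article, and the list size, iteration count handling, and random seed are arbitrary assumptions.

        // A minimal reproduction harness -- a sketch, not the author's benchmark code.
        using System;
        using System.Collections.Generic;
        using System.Diagnostics;
        using System.Linq;

        public static class FindBenchmark
        {
            public static bool FindViaLinq(IEnumerable<int> list, int target)
            {
                return list.Any(item => item == target);
            }

            public static bool FindViaIteration(IEnumerable<int> list, int target)
            {
                foreach (var i in list)
                {
                    if (i == target) return true;
                }
                return false;
            }

            public static void Main()
            {
                const int iterations = 100000;
                var rng = new Random(42);                                  // arbitrary seed
                List<int> list = Enumerable.Range(0, 1000).Select(_ => rng.Next()).ToList();
                int[] targets = Enumerable.Range(0, iterations).Select(_ => rng.Next()).ToArray();

                // Run the same pseudo-random targets through both implementations.
                var sw = Stopwatch.StartNew();
                foreach (int t in targets) FindViaLinq(list, t);
                Console.WriteLine("Any():     {0} ms total", sw.ElapsedMilliseconds);

                sw.Restart();
                foreach (int t in targets) FindViaIteration(list, t);
                Console.WriteLine("Iteration: {0} ms total", sw.ElapsedMilliseconds);
            }
        }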

    Read the article

  • xaml nested class path designer issue

    - by Vlad Bezden
    Hi, I have nested class public class Enums { public enum WindowModeEnum { Edit, New } } In my xaml I reference code: <Style.Triggers> <DataTrigger Binding="{Binding WindowMode}" Value="{x:Static Types1:Enums+WindowModeEnum.Edit}"> <Setter Property="Visibility" Value="Collapsed" /> </DataTrigger> </Style.Triggers> Code compiles and runs properly, however I can't open xaml code in design window. I am getting following error: Type 'Types1:Enums+WindowModeEnum' was not found. at MS.Internal.Metadata.ExposedTypes.ValueSerializers.StaticMemberDocumentValueSerializer.ConvertToDocumentValue(ITypeMetadata type, String value, IServiceProvider documentServices) at MS.Internal.Design.DocumentModel.DocumentTrees.Markup.XamlMarkupExtensionPropertyBase.get_Value() at MS.Internal.Design.DocumentModel.DocumentTrees.DocumentPropertyWrapper.get_Value() at MS.Internal.Design.DocumentModel.DocumentTrees.InMemory.InMemoryDocumentProperty..ctor(DocumentProperty property, InMemoryDocumentItem item) at MS.Internal.Design.DocumentModel.DocumentTrees.InMemory.InMemoryDocumentItem.SetUpItem(DocumentItem item) Same error exist in VS2008, VS2010. Does anybody has any idea, how to deal with it so I can open window in design mode. Thanks a lot. Sincerely, Vlad.

    Read the article

  • Microsoft Sql Server 2008 R2 System Databases

    For a majority of software developers, little time is spent understanding the inner workings of the database management systems (DBMS) they use to store data for their applications. I personally place myself in this grouping. In my case, I have used various versions of Microsoft's SQL Server (2000, 2005, and 2008 R2) and just recently learned how valuable the system databases really are when I was preparing to deliver a lecture on "SQL Server 2008 R2, System Databases".

    So what are system databases in MS SQL Server, and why should I know them? Microsoft uses system databases to support the SQL Server DBMS, much like a developer uses config files or database tables to support an application. These system databases individually provide specific functionality that allows MS SQL Server to function.

        Name          Database File             Log File
        Master        master.mdf                mastlog.ldf
        Resource      mssqlsystemresource.mdf   mssqlsystemresource.ldf
        Model         model.mdf                 modellog.ldf
        MSDB          msdbdata.mdf              msdblog.ldf
        Distribution  distmdl.mdf               distmdl.ldf
        TempDB        tempdb.mdf                templog.ldf

    Master Database. If you have used MS SQL Server then you should recognize the Master database, especially if you have used SQL Server Management Studio (SSMS) to connect to a user-created database. MS SQL Server requires the Master database in order for the DBMS to start, due to the information that it stores. Examples of data stored in the Master database: user logins, linked servers, configuration information, and information on user databases.

    Resource Database. Honestly, until recently I never knew this database even existed, until I started to research SQL Server system databases. The reason for this is largely that the Resource database is hidden from users; in fact, its database files are stored in the Binn folder instead of the standard MS SQL Server database folder path. This database contains all system objects that can be accessed by all other databases. In short, it contains all the system views and stored procedures regarding system information that appear in every user database. One of the many benefits of storing the system views and stored procedures in a single hidden database is that it makes upgrading a SQL Server instance easier; not to mention that maintenance is decreased, since only one code base has to be maintained for all of the system views and procedures.

    Model Database. The Model database, as the name implies, is the model for all new databases created by users. This allows for predefining default database objects for all new databases within an MS SQL Server instance. For example, if every database created by a user needs to have an "Audit" table, then defining the "Audit" table in the Model database guarantees that the table will exist in every new database created after the model is altered.

    MSDB Database. The MSDB database is used by SQL Server Agent, SQL Server Database Mail, and SQL Server Service Broker, along with SQL Server itself. The SQL Server Agent uses this database to store job configurations and SQL job schedules, along with SQL alerts and operators. In addition, this database also stores all SQL job parameters along with each job's execution history. Finally, this database is also used to store database backup and maintenance plans, as well as details pertaining to SQL log shipping if it is being used.

    Distribution Database. The Distribution database is only used during replication and stores metadata and history information pertaining to the act of replicating data. Furthermore, when transactional replication is used, this database also stores information regarding each transaction. It is important to note that replication is not turned on by default in MS SQL Server and that the Distribution database is hidden from SSMS.

    Tempdb Database. The Tempdb, as the name implies, is used to store temporary data and data objects; examples include temp tables and temp stored procedures. It is important to note that all data and data objects in this database are cleared when SQL Server restarts. This database is also used by SQL Server when it is performing some internal operations; typically, SQL Server uses it for large sort and index operations. Finally, this database is used to store row versions if row versioning or snapshot isolation transactions are being used by SQL Server.

    Additionally, I would love to hear from others about their experiences using system databases, tables, and objects in real-world environments.
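    For readers who want to poke at these on their own instance, the sys.databases catalog view is a quick way in. Below is a small C# sketch that is not part of the article: the connection string is an assumption (local default instance, integrated security), and the hidden Resource database will not appear in the results.

        // List the core system databases from C# -- a sketch, not part of the article.
        using System;
        using System.Data.SqlClient;

        class ListSystemDatabases
        {
            static void Main()
            {
                using (var conn = new SqlConnection("Server=localhost;Integrated Security=true"))
                using (var cmd = new SqlCommand(
                    "SELECT name, database_id FROM sys.databases WHERE database_id <= 4", conn))
                {
                    conn.Open();
                    using (var reader = cmd.ExecuteReader())
                    {
                        // master = 1, tempdb = 2, model = 3, msdb = 4
                        while (reader.Read())
                        {
                            Console.WriteLine("{0} (id {1})", reader.GetString(0), reader.GetInt32(1));
                        }
                    }
                }
            }
        }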

    Read the article

  • Parameter is not valid when getting image from stream

    - by duka1
    Hi guys, I have this code: MemoryStream ms = new MemoryStream(newbytes, 0, newbytes.Length); ms.Position = 0; ms.Write(newbytes, 0, newbytes.Length); Image img = Image.FromStream(ms); img.Save(@"C:\Users\gsira\Pictures\Blue hills5.jpg"); I get this error at the Image.FromStream(ms) call: System.ArgumentException: Parameter is not valid. at System.Drawing.Image.FromStream(Stream stream, Boolean useEmbeddedColorManagement, Boolean validateIma How can I resolve this? A couple of links which solve this problem (one on an MSDN thread) are broken so I am lost.
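    For comparison, here is a minimal sketch of the pattern that usually works (an illustration, not a confirmed fix for the code above): wrap the bytes in the stream once, skip the second Write, and let Image.FromStream read from the start of the stream. It assumes newbytes really holds a complete encoded image; if it does not, FromStream will throw the same exception.

        // A sketch of the usual load-from-bytes pattern (assumes the byte array contains a
        // complete, valid encoded image such as BMP/JPEG/PNG).
        using System.Drawing;
        using System.Drawing.Imaging;
        using System.IO;

        static class ImageFromBytes
        {
            public static void SaveCopy(byte[] newbytes, string path)
            {
                // Wrap the existing bytes; a fresh MemoryStream over a buffer already starts
                // at Position = 0, and no second Write is needed (writing moves the position
                // to the end, which is one way FromStream ends up seeing no data).
                using (var ms = new MemoryStream(newbytes))
                using (Image img = Image.FromStream(ms))
                {
                    img.Save(path, ImageFormat.Jpeg);   // re-encode explicitly as JPEG
                }
            }
        }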

    Read the article

  • Converting Openfire IM datetime values in SQL Server to / from VARCHAR(15) and DATETIME data types

    - by Brian Biales
    A client is using Openfire IM for their users and would like some custom queries to audit user conversations (which Openfire stores in tables in the SQL Server database). Because Openfire supports multiple database servers and multiple platforms, its designers chose to store all date/time stamps in the database as 15-character strings, which get converted to Java Date objects in their code (Openfire is written in Java). I did some digging around and, so I don't forget and in case someone else finds this useful, I will put the simple algorithms for converting back and forth between SQL DATETIME and the Java string representation here.

    The Java string representation is the number of milliseconds since 1/1/1970. SQL Server's DATETIME is actually represented as a float, the value being the number of days since 1/1/1900, with the portion after the decimal point representing the hours/minutes/seconds/milliseconds as a fractional part of a day. Try this and you will see it is true:

        SELECT CAST(0 AS DATETIME)

    It returns the date 1/1/1900. The difference in days between SQL Server's zero date of 1/1/1900 and the Java representation's zero date of 1/1/1970 is found easily using the following SQL:

        SELECT DATEDIFF(D, '1900-01-01', '1970-01-01')

    which returns 25567; there are 25567 days between these dates.

    So to convert from the Java string to SQL Server's DATETIME, we need to convert the number of milliseconds to a floating point number of days since 1/1/1970, then add 25567 to change this to the number of days since 1/1/1900. To convert to days, divide the number by 1000 ms/s, then by 60 seconds/minute, then by 60 minutes/hour, then by 24 hours/day - or simply divide by 1000*60*60*24, which is 86400000. So, to summarize, we cast the string as a float, divide by 86400000 milliseconds/day, add 25567 days, and cast the resulting value to a DATETIME. Here is an example:

        DECLARE @tmp as VARCHAR(15)
        SET @tmp = '1268231722123'
        SELECT @tmp as JavaTime,
               CAST((CAST(@tmp AS FLOAT) / 86400000) + 25567 AS DATETIME) as SQLTime

    Converting from SQL DATETIME back to the Java time format is not quite as simple, I found, because floats of that size do not convert nicely to strings; they end up in scientific notation with the CONVERT or CAST functions. But I found a couple of ways around that problem. You can convert a date to the number of seconds since 1/1/1970 very easily using the DATEDIFF function, as this value fits in an Int. If you don't need to worry about the milliseconds, simply cast this integer to a string and concatenate '000' at the end, essentially multiplying the number by 1000 and making it milliseconds since 1/1/1970. If, however, you do care about the milliseconds, you will need to use DATEPART to get the milliseconds part of the date, cast this integer to a string, pad zeros on the left to make sure it is three digits, and concatenate these three digits to the number-of-seconds string above. And finally, I discovered that by casting to DECIMAL(15,0) and then to VARCHAR(15), I avoid the scientific notation issue. So here are all my examples - pick the one you like best.

    First, here is the simple approach if you don't care about the milliseconds:

        DECLARE @tmp as VARCHAR(15)
        DECLARE @dt as DATETIME
        SET @dt = '2010-03-10 14:35:22.123'
        SET @tmp = CAST(DATEDIFF(s, '1970-01-01 00:00:00', @dt) AS VARCHAR(15)) + '000'
        SELECT @tmp as JavaTime, @dt as SQLTime

    If you want to keep the milliseconds:

        DECLARE @tmp as VARCHAR(15)
        DECLARE @dt as DATETIME
        DECLARE @ms as int
        SET @dt = '2010-03-10 14:35:22.123'
        SET @ms = DATEPART(ms, @dt)
        SET @tmp = CAST(DATEDIFF(s, '1970-01-01 00:00:00', @dt) AS VARCHAR(15))
                 + RIGHT('000' + CAST(@ms AS VARCHAR(3)), 3)
        SELECT @tmp as JavaTime, @dt as SQLTime

    Or, in one fell swoop:

        DECLARE @dt as DATETIME
        SET @dt = '2010-03-10 14:35:22.123'
        SELECT @dt as SQLTime,
               CAST(DATEDIFF(s, '1970-01-01 00:00:00', @dt) AS VARCHAR(15))
               + RIGHT('000' + CAST(DATEPART(ms, @dt) AS VARCHAR(3)), 3) as JavaTime

    And finally, a way to simply reverse the math used when converting from Java time to a SQL date. Note the parentheses - watch out for operator precedence; you want to subtract, then multiply:

        DECLARE @dt as DATETIME
        SET @dt = '2010-03-10 14:35:22.123'
        SELECT @dt as SQLTime,
               CAST(CAST((CAST(@dt as Float) - 25567.0) * 86400000.0 as DECIMAL(15,0)) as VARCHAR(15)) as JavaTime

    Interestingly, I found that converting to SQL DATETIME can lose some accuracy: when I converted the time above to Java time and then back to DATETIME, the number of milliseconds was 120, not 123. As I am not interested in the milliseconds, this is OK for me, but you may want to look into using DATETIME2 in SQL Server 2008 for more accuracy.
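    For completeness, the same conversion done on the application side - a C# sketch that is not part of the original post. It just applies the same 1970-01-01 epoch arithmetic; no Openfire API is involved.

        // C# equivalents of the conversions above -- a sketch, not from the original post.
        // Openfire stores the timestamp as milliseconds since 1970-01-01 (UTC) in a
        // 15-character string.
        using System;
        using System.Globalization;

        static class OpenfireTime
        {
            static readonly DateTime Epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

            // "1268231722123" -> 2010-03-10 14:35:22.123 (UTC), matching the example above
            public static DateTime ToDateTime(string javaTime)
            {
                long ms = long.Parse(javaTime, CultureInfo.InvariantCulture);
                return Epoch.AddMilliseconds(ms);
            }

            // 2010-03-10 14:35:22.123 (UTC) -> "1268231722123"
            public static string ToJavaTime(DateTime utc)
            {
                return ((long)(utc - Epoch).TotalMilliseconds).ToString(CultureInfo.InvariantCulture);
            }
        }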

    Read the article

  • SharePoint 2010 Sandboxed solution SPGridView

    - by Steve Clements
    If you didn’t know, you probably will soon, the SPGridView is not available in Sandboxed solutions. To be honest there doesn’t seem to be a great deal of information out there about the whys and what nots, basically its not part of the Sandbox SharePoint API. Of course the error message from SharePoint is about as useful as punch in the face… An unexpected error has been encountered in this Web Part.  Error: A Web Part or Web Form Control on this Page cannot be displayed or imported. You don't have Add and Customize Pages permissions required to perform this action …that’s if you have debug=true set, if not the classic “This webpart cannot be added” !! Love that one! but will a little digging you should find something like this… [TypeLoadException: Could not load  type Microsoft.SharePoint.WebControls.SPGridView from assembly 'Microsoft.SharePoint, Version=14.900.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c'.]   Depending on what you want to do with the SPGridView, this may not help at all.  But I’m looking to inherit the theme of the site and style it accordingly. After spending a bit of time with Chrome’s FireBug I was able to get the required CSS classes.  I created my own class inheriting from GridView (note the lack of a preceding SP!) and simply set the styles in there. Inherit from the standard GridView public class PSGridView : GridView   Set the styles in the contructor… public PSGridView() {     this.CellPadding = 2;     this.CellSpacing = 0;     this.GridLines = GridLines.None;     this.CssClass = "ms-listviewtable";     this.Attributes.Add("style", "border-bottom-style: none; border-right-style: none; width: 100%; border-collapse: collapse; border-top-style: none; border-left-style: none;");       this.HeaderStyle.CssClass = "ms-viewheadertr";          this.RowStyle.CssClass = "ms-itmhover";     this.SelectedRowStyle.CssClass = "s4-itm-selected";     this.RowStyle.Height = new Unit(25); }   Then as you cant override the Columns property setter, a custom method to add the column and set the style… public PSGridView() {     this.CellPadding = 2;     this.CellSpacing = 0;     this.GridLines = GridLines.None;     this.CssClass = "ms-listviewtable";     this.Attributes.Add("style", "border-bottom-style: none; border-right-style: none; width: 100%; border-collapse: collapse; border-top-style: none; border-left-style: none;");       this.HeaderStyle.CssClass = "ms-viewheadertr";          this.RowStyle.CssClass = "ms-itmhover";     this.SelectedRowStyle.CssClass = "s4-itm-selected";     this.RowStyle.Height = new Unit(25); }   And that should be enough to get the nicely styled SPGridView without the need for the SPGridView, but seriously….get the SPGridView in the SandBox!!!   Technorati Tags: Sharepoint 2010,SPGridView,Sandbox Solutions,Sandbox Technorati Tags: Sharepoint 2010,SPGridView,Sandbox Solutions,Sandbox
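    To give an idea of how the class above might be consumed, here is a short usage sketch (mine, not from the post) inside a web part's CreateChildControls. The list name "Announcements" and the DataTable binding are illustrative assumptions; adjust to whatever data you actually need to show.

        // A usage sketch (not from the post): bind the styled grid from a web part.
        using System.Web.UI.WebControls.WebParts;
        using Microsoft.SharePoint;

        public class AnnouncementsPart : WebPart
        {
            protected override void CreateChildControls()
            {
                var grid = new PSGridView { AutoGenerateColumns = true };

                // "Announcements" is a hypothetical list name.
                SPList list = SPContext.Current.Web.Lists["Announcements"];
                grid.DataSource = list.Items.GetDataTable();   // snapshot of the list items

                Controls.Add(grid);
                grid.DataBind();
            }
        }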

    Read the article

  • Introducing MySQL for Excel

    - by Javier Treviño
    As part of the new product initiatives of the MySQL on Windows group we released a tool that makes the task of getting data in and out of a MySQL Database very friendly and intuitive, and we paired it with one of the preferred applications for data analysis and manipulation in Windows platforms, MS Excel. Welcome to MySQL for Excel, an add-in that is installed and accessed from within the MS Excel’s Data tab offering a wizard-like interface arranged in an elegant yet simple way to help users browse MySQL Schemas, Tables, Views and Procedures and perform data operations against them using MS Excel as the vehicle to drive the data in and out MySQL Databases. One of the coolest features we had in mind designing MySQL for Excel is simplicity. MS Excel is simple and easy to work with, thus liked by many Windows users because they don’t have to be software gurus to use it.  We applied the same principle by targeting MySQL for Excel to any kind of user, so if you are already familiarized with Excel’s interface you will find yourself working with MySQL data in no time. MySQL for Excel is shipped within the MySQL Installer as one of the tools in the suite; if prerequisites are already installed (.NET Framework 4.0, Visual Studio Tools for Office 4.0 and of course MS Office), installing the add-in involves a very few clicks and no further setup to use it. Being an Excel Add-In there is no executable file involved after the installation, running MS Excel and opening the add-in from its Data tab is all that is required. MySQL for Excel automatically integrates with MySQL Workbench (if installed) to share the same connections to MySQL Server installations, that way connections are defined just once in either product saving time.  Opening the Add-In brings the Welcome Panel at the right side of the Excel main window from which connections to MySQL Servers are shown grouped by Local VS Remote connections; then users can open any of those connections by double-clicking it and entering the password of the used account.  Additionally a user can create a connection by clicking on the New Connection action label or edit connections through MySQL Workbench (if installed) by clicking on the Manage Connections action label. Once a connection is opened, the Schema Selection panel is shown, at the top of it the selected connection (connection name, hostname/IP and username). Just below, a list of schemas is displayed where User Schemas are grouped first followed by System Schemas; users can double-click any selected schema to go to the next panel or select a schema and clicking the Next > button. Users can alternatively click on the < Back button to go back to the Welcome Panel to close the current connection and open a new one; also by clicking the Create New Schema action label they can create an empty new schema. Once a schema is opened the DB Object Selection panel is shown, this is actually the place where the fun stuff happens; from here users are able to perform actions against MySQL Tables, Views and Procedures. ">The actions available here are about importing data from a MySQL Table, View or Procedure to Excel, exporting Excel data to a new MySQL Table, appending Excel data to an existing MySQL Table or editing a MySQL Table’s data by using an Excel Worksheet as a user interface to update data in any row/column, insert new rows or delete existing rows in a very easy and friendly way. More blog posts will follow describing all of these actions, so stay tuned! 
Remember that your feedback is very important for us, so drop us a message: · MySQL on Windows (this) Blog - https://blogs.oracle.com/MySqlOnWindows/ · Forum - http://forums.mysql.com/list.php?172 · Facebook - http://www.facebook.com/mysql Cheers!

    Read the article

  • Wireless connection works but the internet is too slow to use in Ubuntu 11.04

    - by Garrin
    The internet is so slow as to be unusable. And I'm not being picky. Even after minutes I can't get my Google home page to load. I tried installing a package through apt-get and was getting rates between 0 and a few hundred bytes/s. That's bytes, not kilobytes! Mostly 0 however (no exaggeration, it spends large amounts of time stalled). And I would go to a speed test web site of some kind but I can't since nothing will load. Briefly put, the laptop I am using was connected to two wireless networks while using Ubuntu 11.04 without any issues before this. It was also connected to a wired network without any issues. It dual boats Windows 7 which has never had any issues, not even with the current wireless network. Just to be clear, on the current wi-fi network, Windows 7 encounters no issues (speedtest.net puts the network speed at 1mb/s) but my network connection in Ubuntu 11.04 is so slow as to literally be unusable. I am unfamiliar with the router except for the fact that it boasts a Rogers logo (that's a large ISP/cable provider in Canada for those not familiar with the land of igloos and polar bears). I am far from the router and some desktop widget I use tells me the signal strength is at 58% (it seems fairly reliable and this would appear to match up with the filled bars in the network icon). I should also mention I'm just renting a room in this house so I'm not the network administrator and while I can access the 192.168.0.1 router page, the password wasn't set to 'password' so it's not much use to me. Here are a bunch of commands I ran which don't tell me a whole lot but I thought might be more instructive to the wise around here: lspci (just showing my network card): 05:00.0 Network controller: Atheros Communications Inc. AR928X Wireless Network Adapter (PCI-Express) (rev 01) This one is self explanatory. PING www.googele.com (216.65.41.185) 56(84) bytes of data. 64 bytes from nnw.net (216.65.41.185): icmp_req=1 ttl=51 time=267 ms 64 bytes from nnw.net (216.65.41.185): icmp_req=2 ttl=51 time=190 ms 64 bytes from nnw.net (216.65.41.185): icmp_req=3 ttl=51 time=212 ms 64 bytes from nnw.net (216.65.41.185): icmp_req=4 ttl=51 time=207 ms 64 bytes from nnw.net (216.65.41.185): icmp_req=5 ttl=51 time=220 ms --- www.googele.com ping statistics --- 5 packets transmitted, 5 received, 0% packet loss, time 4003ms rtt min/avg/max/mdev = 190.079/219.699/267.963/26.121 ms ifconfig eth0 Link encap:Ethernet HWaddr 20:6a:8a:02:20:da UP BROADCAST MULTICAST MTU:1500 Metric:1 RX packets:0 errors:0 dropped:0 overruns:0 frame:0 TX packets:0 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:0 (0.0 B) TX bytes:0 (0.0 B) Interrupt:42 lo Link encap:Local Loopback inet addr:127.0.0.1 Mask:255.0.0.0 inet6 addr: ::1/128 Scope:Host UP LOOPBACK RUNNING MTU:16436 Metric:1 RX packets:16 errors:0 dropped:0 overruns:0 frame:0 TX packets:16 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:0 RX bytes:960 (960.0 B) TX bytes:960 (960.0 B) wlan0 Link encap:Ethernet HWaddr 20:7c:8f:05:c6:bf inet addr:192.168.0.16 Bcast:192.168.0.255 Mask:255.255.255.0 inet6 addr: fe80::227c:8fff:fe05:c6bf/64 Scope:Link UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:982 errors:0 dropped:0 overruns:0 frame:0 TX packets:658 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:497250 (497.2 KB) TX bytes:95076 (95.0 KB) Thank you

    Read the article

  • Understanding WordProcessingML tags and avoid unnecessary tags

    - by rithanyalaxmi
    Hi, I am using the MS Word API to generate .docx files containing data fetched from a DB, to which I apply the appropriate styles, fonts, symbols, etc. If the data fetched from the DB is quite large, there is a problem displaying it in the .docx file. I found that MS Word 2007 internally writes some content through tags that may not be needed to display the data, so I am trying to figure out which WordProcessingML tags are actually necessary, avoid the unnecessary ones, and emit only the tags needed to display the data. In other words, I am planning to write my own document.xml containing just the MS Word tags that are needed, rather than generating the XML from a .docx file. My questions are: 1) Is it right that MS Word generates some tags that are not needed when a .docx is converted to document.xml, and that this makes the file heavy? If so, which tags are they, so I can avoid them when writing my own .xml? 2) Please send links that explain the MS Word tags and their purposes - which tags are needed and which are not. 3) Is my approach of writing a new .xml similar to document.xml a worthwhile way forward, so that I build the .xml with only the tags I need and improve the performance of the data display? Please shed some light on this. Thanks in advance, Rithu
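    For what it's worth, a minimal document.xml really only needs the w:document / w:body / w:p / w:r / w:t chain. The sketch below uses the Open XML SDK to produce such a bare-bones part; the SDK is an assumption about tooling (the poster may be writing raw XML instead), and the text content is a placeholder.

        // A sketch of the minimal WordprocessingML a .docx needs: one document, one body,
        // one paragraph, one run, one text node. Uses the Open XML SDK (an assumption);
        // the same structure could be written by hand as raw XML.
        using DocumentFormat.OpenXml;
        using DocumentFormat.OpenXml.Packaging;
        using DocumentFormat.OpenXml.Wordprocessing;

        class MinimalDocx
        {
            static void Main()
            {
                using (var doc = WordprocessingDocument.Create(
                           "minimal.docx", WordprocessingDocumentType.Document))
                {
                    MainDocumentPart main = doc.AddMainDocumentPart();

                    // Much of what Word itself writes (rsid attributes, themes, etc.)
                    // is optional for simply displaying text.
                    main.Document = new Document(
                        new Body(
                            new Paragraph(
                                new Run(
                                    new Text("Data fetched from the DB goes here")))));

                    main.Document.Save();
                }
            }
        }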

    Read the article

  • Keeping a reference to a System.Threading.Timer

    - by Daniel Bryars
    According to http://msdn.microsoft.com/en-us/library/system.threading.timer.aspx you need to keep a reference to a System.Threading.Timer to prevent it from being disposed. I've got a method like this:

        private void Delay(Action action, Int32 ms)
        {
            if (ms <= 0) { action(); }
            System.Threading.Timer timer = new System.Threading.Timer(
                (o) => action(), null, ms, System.Threading.Timeout.Infinite);
        }

    which I don't think keeps a reference to the timer. I've not seen any problems so far, but that's probably because the delay periods used have been pretty small. Is the code above wrong? And if it is, how do I keep a reference to the Timer? I'm thinking something like this might work:

        class timerstate
        {
            internal volatile System.Threading.Timer Timer;
        };

        private void Delay2(Action action, Int32 ms)
        {
            if (ms <= 0) { action(); }
            timerstate state = new timerstate();
            lock (state)
            {
                state.Timer = new System.Threading.Timer(
                    (o) =>
                    {
                        lock (o)
                        {
                            action();
                            ((timerstate)o).Timer.Dispose();
                        }
                    },
                    state, ms, System.Threading.Timeout.Infinite);
            }
        }

    The locking business is so I can get the timer into the timerstate class before the delegate gets invoked. It all looks a little clunky to me. Perhaps I should regard the chance of the timer firing before it has finished constructing and been assigned to the property in the timerstate instance as negligible and leave the locking out.
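    A sketch of one way to keep the timer rooted until it fires, using a static registry of pending timers (this assumes .NET 4's ConcurrentDictionary is available; any synchronized collection would do). It also returns early in the zero-delay branch, which the original Delay does not, so that case neither runs the action twice nor constructs a timer with a non-positive due time:

        using System;
        using System.Collections.Concurrent;
        using System.Threading;

        static class DelayedActions
        {
            // Strong references keep pending timers alive until their callbacks run.
            private static readonly ConcurrentDictionary<Timer, byte> pending =
                new ConcurrentDictionary<Timer, byte>();

            public static void Delay(Action action, int ms)
            {
                if (ms <= 0)
                {
                    action();
                    return;
                }

                Timer timer = null;
                timer = new Timer(_ =>
                {
                    try { action(); }
                    finally
                    {
                        byte ignored;
                        pending.TryRemove(timer, out ignored);
                        timer.Dispose();
                    }
                }, null, Timeout.Infinite, Timeout.Infinite);

                // Root the timer first, then arm it. The callback cannot fire before
                // Change is called, so 'timer' is never null inside the lambda and no
                // locking is needed.
                pending.TryAdd(timer, 0);
                timer.Change(ms, Timeout.Infinite);
            }
        }

    Disposing the timer inside its own callback also releases the underlying handle promptly instead of waiting for finalization.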

    Read the article

  • Problem saving and retrieving an image in a SQL database from C#

    - by Mobin
    I used this code to insert records into a Person table in my DB:

        System.IO.MemoryStream ms = new System.IO.MemoryStream();
        Image img = Image_Box.Image;
        img.Save(ms, System.Drawing.Imaging.ImageFormat.Bmp);
        this.personTableAdapter1.Insert(NIC_Box.Text.Trim(), Application_Box.Text.Trim(), Name_Box.Text.Trim(),
            Father_Name_Box.Text.Trim(), DOB_TimePicker.Value.Date, Address_Box.Text.Trim(),
            City_Box.Text.Trim(), Country_Box.Text.Trim(), ms.GetBuffer());

    When I retrieve the image with this code, it works perfectly fine:

        byte[] image = (byte[])Person_On_Application.Rows[0][8];
        MemoryStream Stream = new MemoryStream();
        Stream.Write(image, 0, image.Length);
        Bitmap Display_Picture = new Bitmap(Stream);
        Image_Box.Image = Display_Picture;

    But if I update the row with my own hand-built query, like this:

        System.IO.MemoryStream ms = new System.IO.MemoryStream();
        Image img = Image_Box.Image;
        img.Save(ms, System.Drawing.Imaging.ImageFormat.Bmp);
        Query = "UPDATE Person SET Person_Image ='" + ms.GetBuffer() + "' WHERE (Person_NIC = '" + NIC_Box.Text.Trim() + "')";

    then the next time I use the same retrieval code to display the image, the program throws an exception.
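    The hand-built UPDATE stores the literal text "System.Byte[]" rather than the image bytes, because concatenating a byte array into a string only calls ToString on it. A sketch of the same update done with a parameterized command instead; the connection string, the column types and the NVarChar length are assumptions based on the question, not a known schema:

        using System.Data;
        using System.Data.SqlClient;
        using System.Drawing;
        using System.Drawing.Imaging;
        using System.IO;

        static class PersonRepository
        {
            public static void UpdatePersonImage(string connectionString, string nic, Image img)
            {
                byte[] imageBytes;
                using (MemoryStream ms = new MemoryStream())
                {
                    img.Save(ms, ImageFormat.Bmp);
                    imageBytes = ms.ToArray();   // ToArray() trims the slack bytes GetBuffer() can include
                }

                using (SqlConnection conn = new SqlConnection(connectionString))
                using (SqlCommand cmd = new SqlCommand(
                    "UPDATE Person SET Person_Image = @img WHERE Person_NIC = @nic", conn))
                {
                    // Use SqlDbType.Image instead of VarBinary(-1) if the column is the legacy image type.
                    cmd.Parameters.Add("@img", SqlDbType.VarBinary, -1).Value = imageBytes;
                    cmd.Parameters.Add("@nic", SqlDbType.NVarChar, 50).Value = nic;
                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
            }
        }

    Parameterising the command also closes the SQL injection hole that the concatenated Person_NIC value opens.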

    Read the article

  • General String Encryption in .NET

    - by cryptospin
    I am looking for a general string encryption class in .NET. (Not to be confused with the 'SecureString' class.) I have started to come up with my own class, but thought there must be a .NET class that already allows you to encrypt/decrypt strings of any encoding with any Cryptographic Service Provider.

        Public Class SecureString
            Private key() As Byte
            Private iv() As Byte
            Private m_SecureString As String

            Public ReadOnly Property Encrypted() As String
                Get
                    Return m_SecureString
                End Get
            End Property

            Public ReadOnly Property Decrypted() As String
                Get
                    Return Decrypt(m_SecureString)
                End Get
            End Property

            Public Sub New(ByVal StringToSecure As String)
                If StringToSecure Is Nothing Then StringToSecure = ""
                m_SecureString = Encrypt(StringToSecure)
            End Sub

            Private Function Encrypt(ByVal StringToEncrypt As String) As String
                Dim result As String = ""
                Dim bytes() As Byte = Text.Encoding.UTF8.GetBytes(StringToEncrypt)
                Using provider As New AesCryptoServiceProvider()
                    With provider
                        .Mode = CipherMode.CBC
                        .GenerateKey()
                        .GenerateIV()
                        key = .Key
                        iv = .IV
                    End With
                    Using ms As New IO.MemoryStream
                        Using cs As New CryptoStream(ms, provider.CreateEncryptor(), CryptoStreamMode.Write)
                            cs.Write(bytes, 0, bytes.Length)
                            cs.FlushFinalBlock()
                        End Using
                        result = Convert.ToBase64String(ms.ToArray())
                    End Using
                End Using
                Return result
            End Function

            Private Function Decrypt(ByVal StringToDecrypt As String) As String
                Dim result As String = ""
                Dim bytes() As Byte = Convert.FromBase64String(StringToDecrypt)
                Using provider As New AesCryptoServiceProvider()
                    Using ms As New IO.MemoryStream
                        Using cs As New CryptoStream(ms, provider.CreateDecryptor(key, iv), CryptoStreamMode.Write)
                            cs.Write(bytes, 0, bytes.Length)
                            cs.FlushFinalBlock()
                        End Using
                        result = Text.Encoding.UTF8.GetString(ms.ToArray())
                    End Using
                End Using
                Return result
            End Function
        End Class
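    Note that the class above generates a fresh key and IV per instance and keeps them only in memory, so the ciphertext can only be decrypted by that same instance. If the real goal is simply to encrypt a string without managing keys at all, one built-in option is DPAPI via ProtectedData (in System.Security.dll); a minimal C# sketch, with illustrative type and method names:

        using System;
        using System.Security.Cryptography;
        using System.Text;

        static class DpapiStrings
        {
            public static string Protect(string plainText)
            {
                byte[] cipher = ProtectedData.Protect(
                    Encoding.UTF8.GetBytes(plainText),
                    null,                               // optional extra entropy
                    DataProtectionScope.CurrentUser);
                return Convert.ToBase64String(cipher);
            }

            public static string Unprotect(string base64Cipher)
            {
                byte[] plain = ProtectedData.Unprotect(
                    Convert.FromBase64String(base64Cipher),
                    null,
                    DataProtectionScope.CurrentUser);
                return Encoding.UTF8.GetString(plain);
            }
        }

    The CurrentUser scope ties decryption to the Windows account doing the encrypting; DataProtectionScope.LocalMachine allows any process on the same machine to decrypt.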

    Read the article

  • Silverlight WinDbg Memory Release Issue

    - by Chris Newton
    Hi, I have used WinDbg successfully on a number of occasions to track down and fix memory leaks (or, more accurately, the CLR's inability to garbage collect a released object), but am stuck with one particular control. The control is displayed within a child window, and when the window is closed a reference to the control remains and cannot be garbage collected. I have resolved what I believe to be the majority of the issues that could have caused the leak, but the !gcroot of the affected object is not clear (to me at least) as to what is still holding on to this object. The output is always the same regardless of the content being presented in the child window:

        DOMAIN(03FB7238):HANDLE(Pinned):79b12f8:Root: 06704260(System.Object[])->
        05719f00(System.Collections.Generic.Dictionary`2[[System.IntPtr, mscorlib],[System.Object, mscorlib]])->
        067c1310(System.Collections.Generic.Dictionary`2+Entry[[System.IntPtr, mscorlib],[System.Object, mscorlib]][])->
        064d42b0(System.Windows.Controls.Grid)->
        064d4314(System.Collections.Generic.Dictionary`2[[MS.Internal.IManagedPeerBase, System.Windows],[System.Object, mscorlib]])->
        064d4360(System.Collections.Generic.Dictionary`2+Entry[[MS.Internal.IManagedPeerBase, System.Windows],[System.Object, mscorlib]][])->
        064d3860(System.Windows.Controls.Border)->
        064d4218(System.Collections.Generic.Dictionary`2[[MS.Internal.IManagedPeerBase, System.Windows],[System.Object, mscorlib]])->
        064d4264(System.Collections.Generic.Dictionary`2+Entry[[MS.Internal.IManagedPeerBase, System.Windows],[System.Object, mscorlib]][])->
        064d3bfc(System.Windows.Controls.ContentPresenter)->
        064d3d64(System.Collections.Generic.Dictionary`2[[MS.Internal.IManagedPeerBase, System.Windows],[System.Object, mscorlib]])->
        064d3db0(System.Collections.Generic.Dictionary`2+Entry[[MS.Internal.IManagedPeerBase, System.Windows],[System.Object, mscorlib]][])->
        064d3dec(System.Collections.Generic.Dictionary`2[[System.UInt32, mscorlib],[System.Windows.DependencyObject, System.Windows]])->
        064d3e38(System.Collections.Generic.Dictionary`2+Entry[[System.UInt32, mscorlib],[System.Windows.DependencyObject, System.Windows]][])->
        06490b04(Insurer.Analytics.SharedResources.Controls.HistoricalKPIViewerControl)

    If anyone has any ideas about what could potentially be the problem, or if you require more information, please let me know. Kind Regards, Chris
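    A common culprit when a child window's content stays rooted like this is a subscription from something long-lived (a static event, a DispatcherTimer, an application-level event or service callback) into the control, which then keeps the whole visual subtree reachable. A generic sketch of the usual cleanup-on-close pattern; all names other than the control type from the gcroot output are hypothetical, and the Cleanup method is assumed, not part of the original post:

        using System;
        using System.Windows.Controls;
        using Insurer.Analytics.SharedResources.Controls;

        // Hypothetical child window hosting the leaking control. The point is the
        // pattern: everything hooked up while the window is open is unhooked in Closed.
        public partial class KpiChildWindow : ChildWindow
        {
            private HistoricalKPIViewerControl viewer;

            public KpiChildWindow()
            {
                InitializeComponent();
                viewer = new HistoricalKPIViewerControl();
                LayoutRoot.Children.Add(viewer);      // LayoutRoot: the window's root Grid in XAML
                this.Closed += OnClosed;
            }

            private void OnClosed(object sender, EventArgs e)
            {
                this.Closed -= OnClosed;

                // Unhook anything that outlives the window: static events, timer Tick
                // handlers, service completed callbacks, Application.Current events, etc.
                viewer.Cleanup();                     // hypothetical method on the control

                LayoutRoot.Children.Remove(viewer);   // detach from the visual tree
                viewer = null;                        // drop the managed reference
            }
        }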

    Read the article

  • How can I create an Image in GDI+ from a Base64-Encoded string in C++?

    - by Schnapple
    I have an application, currently written in C#, which can take a Base64-encoded string and turn it into an Image (a TIFF image in this case), and vice versa. In C# this is actually pretty simple.

        private byte[] ImageToByteArray(Image img)
        {
            MemoryStream ms = new MemoryStream();
            img.Save(ms, System.Drawing.Imaging.ImageFormat.Tiff);
            return ms.ToArray();
        }

        private Image byteArrayToImage(byte[] byteArrayIn)
        {
            MemoryStream ms = new MemoryStream(byteArrayIn);
            BinaryWriter bw = new BinaryWriter(ms);
            bw.Write(byteArrayIn);
            Image returnImage = Image.FromStream(ms, true, false);
            return returnImage;
        }

        // Convert Image into string
        byte[] imagebytes = ImageToByteArray(anImage);
        string Base64EncodedStringImage = Convert.ToBase64String(imagebytes);

        // Convert string into Image
        byte[] imagebytes = Convert.FromBase64String(Base64EncodedStringImage);
        Image anImage = byteArrayToImage(imagebytes);

    (and, now that I'm looking at it, could be simplified even further) I now have a business need to do this in C++. I'm using GDI+ to draw the graphics (Windows only so far) and I already have code to decode the string in C++ (to another string). What I'm stumbling on, however, is getting the information into an Image object in GDI+. At this point I figure I need either a) A way of converting that Base64-decoded string into an IStream to feed to the Image object's FromStream function, b) A way to convert the Base64-encoded string into an IStream to feed to the Image object's FromStream function (so, different code than I'm currently using), or c) Some completely different way I'm not thinking of here. My C++ skills are very rusty and I'm also spoiled by the managed .NET platform, so if I'm attacking this all wrong I'm open to suggestions.

    Read the article

  • Mono & DeflateStream

    - by ILya
    I have some simple code:

        byte[] buffer = Encoding.UTF8.GetBytes("abracadabra");
        MemoryStream ms = new MemoryStream();
        DeflateStream ds = new DeflateStream(ms, CompressionMode.Compress, false);
        ms.Write(buffer, 0, buffer.Length);
        DeflateStream ds2 = new DeflateStream(ms, CompressionMode.Decompress, false);
        byte[] buffer2 = new byte[ms.Length];
        ds2.Read(buffer2, 0, (int)ms.Length);
        Console.WriteLine(Encoding.UTF8.GetString(buffer2));

    When reading from ds2, I get the following:

        Stacktrace:
        at (wrapper managed-to-native) System.IO.Compression.DeflateStream.ReadZStream (intptr,intptr,int) <0x00004
        at (wrapper managed-to-native) System.IO.Compression.DeflateStream.ReadZStream (intptr,intptr,int) <0x00004
        at System.IO.Compression.DeflateStream.ReadInternal (byte[],int,int) [0x00031] in C:\cygwin\tmp\monobuild\build\BUILD\mono-2.6.3\mcs\class\System\System.IO.Compression\DeflateStream.cs:192
        at System.IO.Compression.DeflateStream.Read (byte[],int,int) [0x00086] in C:\cygwin\tmp\monobuild\build\BUILD\mono-2.6.3\mcs\class\System\System.IO.Compression\DeflateStream.cs:214
        at testtesttest.MainClass.Main (string[]) [0x00041] in C:\Users\ilukyanov\Desktop\Cassini\GZipDemo\Main.cs:27
        at (wrapper runtime-invoke) .runtime_invoke_void_object (object,intptr,intptr,intptr)

        This application has requested the Runtime to terminate it in an unusual way. Please contact the application's support team for more information.

    This problem appears in Mono 2.6.1 and 2.6.3... Is there any known way to successfully read from DeflateStream in Mono? Or maybe there are some third-party open-source assemblies with the same functionality?
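    Note that the snippet writes the plain bytes straight to the MemoryStream, never writes through the compressing DeflateStream, and never rewinds the stream before decompressing, so ds2 is handed uncompressed data. A minimal sketch of the intended compress/decompress round trip (variable names are illustrative):

        using System;
        using System.IO;
        using System.IO.Compression;
        using System.Text;

        class DeflateRoundTrip
        {
            static void Main()
            {
                byte[] buffer = Encoding.UTF8.GetBytes("abracadabra");

                // Compress: the plain bytes must be written *through* the DeflateStream.
                byte[] compressed;
                using (MemoryStream ms = new MemoryStream())
                {
                    using (DeflateStream ds = new DeflateStream(ms, CompressionMode.Compress, true))
                    {
                        ds.Write(buffer, 0, buffer.Length);
                    } // disposing the DeflateStream flushes the final deflate block
                    compressed = ms.ToArray();
                }

                // Decompress: read *through* a DeflateStream wrapped around the compressed bytes.
                using (MemoryStream ms = new MemoryStream(compressed))
                using (DeflateStream ds = new DeflateStream(ms, CompressionMode.Decompress))
                using (MemoryStream plain = new MemoryStream())
                {
                    byte[] chunk = new byte[4096];
                    int read;
                    while ((read = ds.Read(chunk, 0, chunk.Length)) > 0)
                    {
                        plain.Write(chunk, 0, read);
                    }
                    Console.WriteLine(Encoding.UTF8.GetString(plain.ToArray())); // "abracadabra"
                }
            }
        }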

    Read the article

< Previous Page | 85 86 87 88 89 90 91 92 93 94 95 96  | Next Page >