Search Results

Search found 3465 results on 139 pages for 'msdn magazine'.


  • Translate jQuery UI Datepicker format to .Net Date format

    - by Michael Freidgeim
    I needed to use the same date format in the client-side jQuery UI Datepicker and in server-side ASP.NET code. The actual format can differ between localization cultures. I decided to translate the Datepicker format to the .NET date format, similar to the opposite operation that was asked about in http://stackoverflow.com/questions/8531247/jquery-datepickers-dateformat-how-to-integrate-with-net-current-culture-date. Note that the replace operation needs to match whole words, and the order of the calls is important. A function that does the opposite operation (translating a .NET date format to a Datepicker format) is described in http://www.codeproject.com/Articles/62031/JQueryUI-Datepicker-in-ASP-NET-MVC

        // requires: using System; using System.Text.RegularExpressions;

        /// <summary>
        /// Uses regex '\b' as suggested in
        /// http://stackoverflow.com/questions/6143642/way-to-have-string-replace-only-hit-whole-words
        /// </summary>
        public static string ReplaceWholeWord(this string original, string wordToFind,
            string replacement, RegexOptions regexOptions = RegexOptions.None)
        {
            string pattern = String.Format(@"\b{0}\b", wordToFind);
            string ret = Regex.Replace(original, pattern, replacement, regexOptions);
            return ret;
        }

        /// <summary>
        /// E.g. "DD, d MM, yy" to "dddd, d MMMM, yyyy"
        /// </summary>
        /// <remarks>
        /// Idea to replace from http://stackoverflow.com/questions/8531247/jquery-datepickers-dateformat-how-to-integrate-with-net-current-culture-date
        /// Mapping from http://docs.jquery.com/UI/Datepicker/$.datepicker.formatDate
        /// to http://msdn.microsoft.com/en-us/library/8kb3ddd4.aspx:
        /// d  - day of month (no leading zero)  --- .NET the same
        /// dd - day of month (two digit)        --- .NET the same
        /// D  - day name short                  --- .NET "ddd"
        /// DD - day name long                   --- .NET "dddd"
        /// m  - month of year (no leading zero) --- .NET "M"
        /// mm - month of year (two digit)       --- .NET "MM"
        /// M  - month name short                --- .NET "MMM"
        /// MM - month name long                 --- .NET "MMMM"
        /// y  - year (two digit)                --- .NET "yy"
        /// yy - year (four digit)               --- .NET "yyyy"
        /// </remarks>
        public static string JQueryDatePickerFormatToDotNetDateFormat(string datePickerFormat)
        {
            string sRet = datePickerFormat.ReplaceWholeWord("DD", "dddd").ReplaceWholeWord("D", "ddd");
            sRet = sRet.ReplaceWholeWord("M", "MMM").ReplaceWholeWord("MM", "MMMM")
                       .ReplaceWholeWord("m", "M").ReplaceWholeWord("mm", "MM"); // order is important
            sRet = sRet.ReplaceWholeWord("yy", "yyyy").ReplaceWholeWord("y", "yy"); // order is important
            return sRet;
        }
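    A quick usage illustration — my sketch, not part of the post; it assumes both methods live in a static extensions class that is in scope:

        // requires: using System.Globalization;
        // Convert a Datepicker pattern, then format a date with the .NET result.
        string dotNetFormat = JQueryDatePickerFormatToDotNetDateFormat("DD, d MM, yy");
        // dotNetFormat is now "dddd, d MMMM, yyyy"
        string text = DateTime.Today.ToString(dotNetFormat, CultureInfo.CurrentCulture);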

    Read the article

  • July, the 31 Days of SQL Server DMO’s – Day 21 (sys.dm_db_partition_stats)

    - by Tamarick Hill
    The sys.dm_db_partition_stats DMV returns page count and row count information for each table or index within your database. Let's have a quick look at this DMV so we can review some of the results. (NOTE: I am going to create an 'ObjectName' column in the result set so that we can more easily identify tables.)

        SELECT object_name(object_id) ObjectName, * FROM sys.dm_db_partition_stats

    As stated above, the first column in our result set is an object name based on the object_id column of this result set. The partition_id column refers to the partition_id of the index in question. Each index will have at least one unique partition_id, and more if the object has been partitioned. The index_id column relates back to the sys.indexes table and uniquely identifies an index on a given object. A value of 0 (zero) in this column indicates that the object is a heap, and a value of 1 (one) signifies the clustered index. Next is the partition_number, which is the number of the partition for a particular object_id. Since none of the tables in my result set have been partitioned, they all display 1 for the partition_number.

    Next we have in_row_data_page_count, which tells us the number of data pages used to store in-row data for a given index. The in_row_used_page_count is the number of pages used to store and manage the in-row data. If we look at the first row in the result set, we see 700 for this column and 680 for the previous one, which means that just managing the data (not storing it) requires 20 pages. The next column, in_row_reserved_page_count, is how many pages have been reserved, regardless of whether they are being used or not. The two lob columns that follow are used for storing LOB (Large Object) data, which could be text, image, varchar(max), or varbinary(max) columns. The two row_overflow columns represent pages used for data that exceeds the 8,060-byte row size limit of the in-row data pages. The used_page_count and reserved_page_count columns are the sums of the in_row, lob, and row_overflow columns discussed earlier. Lastly, the row_count column displays the number of rows in a particular index.

    This DMV is a very powerful resource for identifying page and row count information. By knowing the page counts for the indexes within your database, you can easily calculate the size of those indexes. For more information on this DMV, please see the Books Online link: http://msdn.microsoft.com/en-us/library/ms187737.aspx Follow me on Twitter @PrimeTimeDBA
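    Since each page is 8 KB, the index-size calculation the post alludes to is just used_page_count × 8 KB. Here is a minimal C# sketch of that arithmetic (my example, not from the post; the connection string is hypothetical):

        using System;
        using System.Data.SqlClient;

        class IndexSizes
        {
            static void Main()
            {
                // Hypothetical connection string; adjust for your environment.
                using (var conn = new SqlConnection("Server=.;Database=AdventureWorks2012;Integrated Security=true"))
                {
                    conn.Open();
                    var cmd = new SqlCommand(
                        "SELECT object_name(object_id), index_id, used_page_count " +
                        "FROM sys.dm_db_partition_stats", conn);

                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            string name = reader.IsDBNull(0) ? "(unknown)" : reader.GetString(0);
                            long pages = reader.GetInt64(2); // used_page_count is bigint

                            // Each page is 8 KB, so size in MB = pages * 8 / 1024.
                            Console.WriteLine("{0} (index {1}): {2:F2} MB",
                                name, reader.GetInt32(1), pages * 8.0 / 1024.0);
                        }
                    }
                }
            }
        }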

    Read the article

  • July, the 31 Days of SQL Server DMO’s – Day 24 (sys.dm_db_index_operational_stats)

    - by Tamarick Hill
    The sys.dm_db_index_operational_stats Dynamic Management Function returns information about the IO, locking, and access methods for the indexes you currently have on your SQL Server instance. This function takes four input parameters: (1) database_id, (2) object_id, (3) index_id, and (4) partition_number. Let's have a look at the results from this function against our AdventureWorks2012 database. This function returns a ton of columns, so not only will I not attempt to describe each of them, I won't even attempt to display all of them here. My query below gives you a subset of the columns returned from this function.

        SELECT database_id, object_id, index_id, partition_number,
               leaf_insert_count, leaf_delete_count, leaf_update_count, leaf_ghost_count,
               nonleaf_insert_count, nonleaf_delete_count, nonleaf_update_count,
               range_scan_count, forwarded_fetch_count,
               row_lock_count, row_lock_wait_count, page_lock_count, page_lock_wait_count,
               index_lock_promotion_attempt_count, index_lock_promotion_count,
               page_compression_attempt_count, page_compression_success_count
        FROM sys.dm_db_index_operational_stats(db_id('AdventureWorks2012'), NULL, NULL, NULL)

    The first four columns in the result set represent the values that we passed in as our input parameters. If you use NULLs as I did, you will see results for every index on your system. I specified a database_id, so my result set only shows records pertaining to my AdventureWorks2012 database. The next columns provide information on how many inserts, deletes, or updates have taken place at your leaf and nonleaf index levels; the nonleaf levels refer to the intermediate and root index levels. In the middle of these you see the leaf_ghost_count column, which represents the number of records that have been logically deleted, marked as "ghosted", and are waiting for the background ghost cleanup process to physically remove them. The range_scan_count column represents the number of range or table scans performed against an index. The forwarded_fetch_count column represents the number of rows returned via a forwarding row pointer. The row_lock_count and row_lock_wait_count columns hold the number of row locks requested for an index and the number of times SQL had to wait on a row lock, respectively. The page_lock_count and page_lock_wait_count columns hold the number of page locks requested for an index and the number of times SQL had to wait on a page lock, respectively. The index_lock_promotion_attempt_count column is the number of times the database engine has attempted to promote a lock to the index level, and index_lock_promotion_count shows how many times that promotion was successful. Lastly, page_compression_attempt_count and page_compression_success_count represent how many times page compression was attempted and how many times it succeeded.

    As you can see, there is a ton of information returned from this DMF. The DMV we reviewed yesterday (sys.dm_db_index_usage_stats) provided good information on when and how indexes have been used, but this DMF takes an even deeper dive into those statistics. If you are interested in performing a very detailed analysis of the operational stats of your indexes, this is not only a good place to start, but more than likely the best place. For more information on this Dynamic Management Function, please see the Books Online link: http://msdn.microsoft.com/en-us/library/ms174281.aspx Follow me on Twitter @PrimeTimeDBA

    Read the article

  • Roll your own free .NET technical conference

    - by Brian Schroer
    If you can’t get to a conference, let the conference come to you! There are a ton of free recorded conference presentations online…

    Microsoft TechEd
    Let’s start with the proverbial 800-pound gorilla. Recent TechEds have recorded the majority of presentations and made them available online the next day. Check out presentations from last month’s TechEd North America 2012 or last week’s TechEd Europe 2012. If you start at http://channel9.msdn.com/Events/TechEd, you can also drill down to presentations from prior years or from other regional TechEds (Australia, New Zealand, etc.). The top presentations from my “View Queue”:
    - Damian Edwards: Microsoft ASP.NET and the Realtime Web (SignalR)
    - Jennifer Smith: Design for Non-Designers
    - Scott Hunter: ASP.NET Roadmap: One ASP.NET – Web Forms, MVC, Web API, and more
    - Daniel Roth: Building HTTP Services with ASP.NET Web API
    - Benjamin Day: Scrum Under a Waterfall

    NDC
    The Norwegian Developer Conference site has the most interesting presentations, in my opinion. You can find the videos from the June 2012 conference at that link. The 2011 and 2010 pages have a lot of presentations that are still relevant as well. My View Queue top 5:
    - Shay Friedman: Roslyn... hmmmm... what?
    - Hadi Hariri: Just ‘cause it’s JavaScript, doesn’t give you a license to write rubbish
    - Paul Betts: Introduction to Rx
    - Greg Young: How to get productive in a project in 24 hours
    - Michael Feathers: Deep Design Lessons

    ØREDEV
    Travelling on from Norway to Sweden... I don’t know why, but the Scandinavians seem to have this conference thing figured out. ØREDEV happens each November, and you can find videos here and here. My View Queue top 5:
    - Marc Gravell: Web Performance Triage
    - Robby Ingebretsen: Fonts, Form and Function: A Primer on Digital Typography
    - Jon Skeet: Async 101
    - Chris Patterson: Hacking Developer Productivity
    - Gary Short: .NET Collections Deep Dive

    aspConf – The Virtual ASP.NET Conference
    Formerly known as “mvcConf”, this one’s a little different: it’s a conference that takes place completely on the web. The next one’s happening July 17-18, and it’s not too late to register (it’s free!). Check out the recordings from February 2011 and July 2010. The latter is two years old and talks about ASP.NET MVC 2, but most of it is still applicable, and Jimmy Bogard’s “Put Your Controllers On a Diet” presentation is the most useful technical talk I have ever seen.

    CodeStock
    Videos from the 2011 edition of this Tennessee conference are available. Presentations from last month’s 2012 conference should be available soon here. I’m looking forward to watching Matt Honeycutt’s “Build Your Own Application Framework with ASP.NET MVC 3”.

    UserGroup.tv
    UserGroup.tv was founded in January of 2011 by Shawn Weisfeld, with the mission of providing user group content online for free. You can search by date, group, speaker, and category tags. My View Queue top 5:
    - Sergey Rathon & Ian Henehan: UI Test Automation with Selenium
    - Rob Vettor: The Repository Pattern
    - Latish Seghal: The .NET Ninja’s Toolbelt
    - Amir Rajan: Get Things Done With Dynamic ASP.NET MVC
    - Jeffrey Richter: .NET Nuggets – Houston TechFest Keynote

    Read the article

  • Download Internet Explorer 9 RTM

    - by Harish Ranganathan
    The much-anticipated RTM release of Internet Explorer 9 (IE9) happened today. The IE9 preview release was first showcased at MIX 2010, and after that there were 7-8 Platform Preview releases. IE9 Beta came out in September 2010, with close to 10 million downloads within a month. More recently, the RC version came out with much improved performance. Today marks the launch of IE9 RTM. What this means is that, within a year, the IE team has shipped the stable product, much faster than the earlier cycles for IE8 and IE7. I wanted to clarify a few things (myths) that commonly arise:

    1. I am already using Chrome and it's faster for me; why would I need IE9?
    IE9 uses 100% hardware acceleration, which means you are going to get the best performance compared to any other browser that has shipped or will ship in the future. With native Windows support, IE9 will outperform all other browsers in terms of performance.

    2. What about standards and security?
    Agreed, IE6 hasn't been the best with standards, but why would someone compare against IE6, which was released almost 10 years back? Later we shipped IE7 and IE8, which had the best standards support of their timeframes, but one would agree that standards and specifications keep getting updated, and it's hard for older browsers to keep pace with them. For example, HTML5 support is not there in IE8, but it is very much there in IE9. IE9 supports most of the stable HTML5 standards and will provide preview releases for the work-in-progress standards.

    3. IE doesn't keep pace with other browsers.
    Agreed! We don't force or release major-version updates in very short time periods. What we do is provide Windows Update, which delivers security patches and other critical updates, not just for IE but for the whole Windows operating system.

    4. I am running Windows XP; what do I do?
    This is the trickiest part. Windows XP isn't a supported operating system for IE9, and there are various reasons for that. The recommended operating systems are Windows Vista and Windows 7. In the interest of technology and its pace, we had to discontinue Windows XP, both from a retail selling perspective and for IE9 support. But the past two years have seen PCs and laptops ship only with Windows Vista or Windows 7, so it shouldn't affect them.

    5. Where do I verify IE9's performance, standards support, and other information?
    http://samples.msdn.microsoft.com/ietestcenter/ (The original post includes a snapshot of one of the tests.) Clearly IE9 outperforms all other browsers and will continue to outperform them in the future. You can download IE9 from www.beautyoftheweb.com Cheers!!!

    Read the article

  • July, the 31 Days of SQL Server DMO’s – Day 29 (sys.dm_os_buffer_descriptors)

    - by Tamarick Hill
    The sys.dm_os_buffer_descriptors Dynamic Management View gives you a look into the data pages that are currently in your SQL Server buffer pool. In case you are not familiar with SQL Server internals: the engine only works with objects that are in memory (the buffer pool). When an object such as a table needs to be read and its pages do not exist in the buffer pool, SQL Server reads (copies) the necessary data page(s) from disk into the buffer pool and caches them. Caching takes place so that pages can be reused, avoiding expensive physical reads. To better illustrate this DMV, let's query it against our AdventureWorks2012 database and view the result set.

        SELECT * FROM sys.dm_os_buffer_descriptors
        WHERE database_id = db_id('AdventureWorks2012')

    The first column returned from this result set is the database_id column, which identifies the specific database for a given row. The file_id column represents the file that a particular buffer descriptor belongs to. The page_id column is the ID of the data page within the buffer. The page_level column is the index level of the data page. Next we have the allocation_unit_id column, which identifies a unique allocation unit; an allocation unit is basically a set of data pages. The page_type column tells us exactly what type of page is in the buffer pool. In my result set I have three distinct types of pages in the buffer pool: Index, Data, and IAM pages. Index pages are pages used to build the root and intermediate levels of a B-tree. A Data page represents the actual leaf pages of a clustered index, which contain the actual data for the table. Without getting into too much detail, an IAM page is an Index Allocation Map page, which tracks the extents used by an allocation unit. The row_count column details how many data rows are present on a given page. The free_space_in_bytes column tells you how much of a given data page is still available; remember, pages are 8 KB in size. The is_modified column signifies whether a page has been changed since it was read into memory, i.e., a dirty page. The numa_node column is the nonuniform memory access node for the buffer. Lastly, the read_microsec column tells you how many microseconds it took for a data page to be read (copied) into the buffer pool.

    This is a great DMV for use when you are tracking down a memory issue or if you just want to have a look at what types of pages are currently in your buffer pool. For more information about this DMV, please see the Books Online link: http://msdn.microsoft.com/en-us/library/ms173442.aspx Follow me on Twitter @PrimeTimeDBA

    Read the article

  • July, the 31 Days of SQL Server DMO’s – Day 25 (sys.dm_db_missing_index_details)

    - by Tamarick Hill
    The sys.dm_db_missing_index_details Dynamic Management View is used to return information about missing indexes on your SQL Server instance. These are indexes the optimizer has identified as ones it would like to use but did not have available. You may also see the same indexes surfaced in other tools, such as query execution plans or the Database Engine Tuning Advisor. Let's execute this DMV so we can review the information it provides. I do not have any missing index information for my AdventureWorks2012 database, so for the purposes of illustrating the result set of this DMV, I will present the results from my msdb database.

        SELECT * FROM sys.dm_db_missing_index_details

    The first column presented is index_handle, which uniquely identifies a particular missing index. The next two columns are the database_id and the object_id for the table in question. Next is the 'equality_columns' column, which gives you a comma-separated list of columns that would be beneficial to the optimizer for equality operations. By equality operation I mean any query that uses a filter or join condition such as WHERE A = B. The 'inequality_columns' column gives you a comma-separated list of columns that would be beneficial to the optimizer for inequality operations. An inequality operation is anything other than A = B; for example, "WHERE A != B", "WHERE A > B", "WHERE A < B", and "WHERE A <> B" all qualify as inequality. Next is the 'included_columns' column, which lists all columns that would be beneficial to the optimizer for the purpose of providing a covering index and preventing key/bookmark lookups. Lastly, the 'statement' column lists the name of the table where the index is missing.

    This DMV can help you identify potential indexes that could improve the performance of your system. However, I advise you not to just take the output of this DMV and create an index for everything you see. Everything listed here should be analyzed and then tested on a development or test system before being implemented in a production environment. For more information on this DMV, please see the Books Online link: http://msdn.microsoft.com/en-us/library/ms345434.aspx Follow me on Twitter @PrimeTimeDBA

    Read the article

  • Using TPL and PLINQ to raise performance of feed aggregator

    - by DigiMortal
    In this posting I will show you how to use Task Parallel Library (TPL) and PLINQ features to boost the performance of a simple RSS-feed aggregator. I will use only very basic .NET classes that almost every developer starts from when learning parallel programming. Of course, we will also measure how every optimization affects the performance of the feed aggregator.

    Feed aggregator
    Our feed aggregator works as follows:
    - Load list of blogs
    - Download RSS-feed
    - Parse feed XML
    - Add new posts to database
    Our feed aggregator is run by a task scheduler, for example every 15 minutes. We will start our journey with a serial implementation of the feed aggregator. The second step is to use task parallelism to parallelize feed downloading and parsing. Our last step is to use data parallelism to parallelize the database operations. We will use the Stopwatch class to measure how much time it takes for the aggregator to download and insert all posts from all registered blogs. After every run we empty the posts table in the database.

    Serial aggregation
    Before doing parallel stuff, let's take a look at the serial implementation of the feed aggregator. All tasks happen one after another.

        internal class FeedClient
        {
            private readonly INewsService _newsService;
            private const int FeedItemContentMaxLength = 255;

            public FeedClient()
            {
                ObjectFactory.Initialize(container =>
                {
                    container.PullConfigurationFromAppConfig = true;
                });

                _newsService = ObjectFactory.GetInstance<INewsService>();
            }

            public void Execute()
            {
                var blogs = _newsService.ListPublishedBlogs();

                for (var index = 0; index < blogs.Count; index++)
                {
                    ImportFeed(blogs[index]);
                }
            }

            private void ImportFeed(BlogDto blog)
            {
                if (blog == null)
                    return;
                if (string.IsNullOrEmpty(blog.RssUrl))
                    return;

                var uri = new Uri(blog.RssUrl);
                var feedFormat = SyndicationDiscoveryUtility.SyndicationContentFormatGet(uri);

                if (feedFormat == SyndicationContentFormat.Rss)
                    ImportRssFeed(blog);
                if (feedFormat == SyndicationContentFormat.Atom)
                    ImportAtomFeed(blog);
            }

            private void ImportRssFeed(BlogDto blog)
            {
                var uri = new Uri(blog.RssUrl);
                var feed = RssFeed.Create(uri);

                foreach (var item in feed.Channel.Items)
                {
                    SaveRssFeedItem(item, blog.Id, blog.CreatedById);
                }
            }

            private void ImportAtomFeed(BlogDto blog)
            {
                var uri = new Uri(blog.RssUrl);
                var feed = AtomFeed.Create(uri);

                foreach (var item in feed.Entries)
                {
                    SaveAtomFeedEntry(item, blog.Id, blog.CreatedById);
                }
            }
        }

    The serial implementation of the feed aggregator downloads and inserts all posts in 25.46 seconds.

    Task parallelism
    Task parallelism means that separate tasks are run in parallel. You can find out more about task parallelism on the MSDN page Task Parallelism (Task Parallel Library) and the Wikipedia page on task parallelism. Although finding the parts of code that can run safely in parallel without synchronization issues is not an easy task, we are lucky this time: feed import and parsing is a perfect candidate for parallel tasks. We can safely parallelize the feed imports because the importing tasks don't share any resources and therefore don't need any synchronization. After getting the list of blogs, we iterate through the collection and start a new TPL task for each blog feed aggregation.

        internal class FeedClient
        {
            private readonly INewsService _newsService;
            private const int FeedItemContentMaxLength = 255;

            public FeedClient()
            {
                ObjectFactory.Initialize(container =>
                {
                    container.PullConfigurationFromAppConfig = true;
                });

                _newsService = ObjectFactory.GetInstance<INewsService>();
            }

            public void Execute()
            {
                var blogs = _newsService.ListPublishedBlogs();

                var tasks = new Task[blogs.Count];

                for (var index = 0; index < blogs.Count; index++)
                {
                    tasks[index] = new Task(ImportFeed, blogs[index]);
                    tasks[index].Start();
                }

                Task.WaitAll(tasks);
            }

            private void ImportFeed(object blogObject)
            {
                if (blogObject == null)
                    return;
                var blog = (BlogDto)blogObject;
                if (string.IsNullOrEmpty(blog.RssUrl))
                    return;

                var uri = new Uri(blog.RssUrl);
                var feedFormat = SyndicationDiscoveryUtility.SyndicationContentFormatGet(uri);

                if (feedFormat == SyndicationContentFormat.Rss)
                    ImportRssFeed(blog);
                if (feedFormat == SyndicationContentFormat.Atom)
                    ImportAtomFeed(blog);
            }

            private void ImportRssFeed(BlogDto blog)
            {
                var uri = new Uri(blog.RssUrl);
                var feed = RssFeed.Create(uri);

                foreach (var item in feed.Channel.Items)
                {
                    SaveRssFeedItem(item, blog.Id, blog.CreatedById);
                }
            }

            private void ImportAtomFeed(BlogDto blog)
            {
                var uri = new Uri(blog.RssUrl);
                var feed = AtomFeed.Create(uri);

                foreach (var item in feed.Entries)
                {
                    SaveAtomFeedEntry(item, blog.Id, blog.CreatedById);
                }
            }
        }

    You should notice the first signs of the power of TPL: we made only minor changes to our code to parallelize the blog feed aggregation. On my machine this modification gives a performance boost – the time is now 17.57 seconds.

    Data parallelism
    There is one more way to parallelize activities. The previous section introduced task- or operation-based parallelism; this section introduces data-based parallelism. According to the MSDN page Data Parallelism (Task Parallel Library), data parallelism refers to a scenario in which the same operation is performed concurrently on elements in a source collection or array. In our code we have independent collections we can process in parallel – the imported feed entries. Since checking whether a feed entry exists and inserting it if it is missing from the database doesn't affect the other entries, the imported feed entries collection is an ideal candidate for parallelization.

        internal class FeedClient
        {
            private readonly INewsService _newsService;
            private const int FeedItemContentMaxLength = 255;

            public FeedClient()
            {
                ObjectFactory.Initialize(container =>
                {
                    container.PullConfigurationFromAppConfig = true;
                });

                _newsService = ObjectFactory.GetInstance<INewsService>();
            }

            public void Execute()
            {
                var blogs = _newsService.ListPublishedBlogs();

                var tasks = new Task[blogs.Count];

                for (var index = 0; index < blogs.Count; index++)
                {
                    tasks[index] = new Task(ImportFeed, blogs[index]);
                    tasks[index].Start();
                }

                Task.WaitAll(tasks);
            }

            private void ImportFeed(object blogObject)
            {
                if (blogObject == null)
                    return;
                var blog = (BlogDto)blogObject;
                if (string.IsNullOrEmpty(blog.RssUrl))
                    return;

                var uri = new Uri(blog.RssUrl);
                var feedFormat = SyndicationDiscoveryUtility.SyndicationContentFormatGet(uri);

                if (feedFormat == SyndicationContentFormat.Rss)
                    ImportRssFeed(blog);
                if (feedFormat == SyndicationContentFormat.Atom)
                    ImportAtomFeed(blog);
            }

            private void ImportRssFeed(BlogDto blog)
            {
                var uri = new Uri(blog.RssUrl);
                var feed = RssFeed.Create(uri);

                feed.Channel.Items.AsParallel().ForAll(a =>
                {
                    SaveRssFeedItem(a, blog.Id, blog.CreatedById);
                });
            }

            private void ImportAtomFeed(BlogDto blog)
            {
                var uri = new Uri(blog.RssUrl);
                var feed = AtomFeed.Create(uri);

                feed.Entries.AsParallel().ForAll(a =>
                {
                    SaveAtomFeedEntry(a, blog.Id, blog.CreatedById);
                });
            }
        }

    We made a small change again, and as a result we parallelized the checking and saving of feed items. This change was data-centric, as we applied the same operation to all elements in a collection. On my machine I got better performance again: the time is now 11.22 seconds.

    Results
    Let's recap the measurement results (the original post visualizes them in a chart; numbers are given in seconds): serial 25.46, task parallelism 17.57, task + data parallelism 11.22. As we can see, with task parallelism the feed aggregation takes about 25% less time than in the original case. When we add data parallelism to task parallelism, our aggregation takes about 2.3 times less time than in the original case.

    More about TPL and PLINQ
    Adding parallelism to your application can be a very challenging task. You have to carefully find the parts of your code where you can safely go parallel, and even then you have to measure the effects of parallel processing to find out whether the parallel code really performs better. If you are not careful, the troubles you face later will be worse than the ones you have seen before (imagine an error that occurs on average only once per 10,000 runs). Parallel programming is something that is hard to ignore: effective programs are able to use the multiple cores of modern processors. Using TPL you can also set the degree of parallelism, so your application doesn't use all computing cores and leaves one or more of them free for the host system and other processes (a short sketch of this follows below). And there are many more things in TPL that make it easier for you to start and go on with parallel programming. In the next major version, all .NET languages will have built-in support for parallel programming, and there will be new language constructs that support it.
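    The degree-of-parallelism knob mentioned above looks roughly like this — a sketch reusing the post's names (feed, blogs, ImportFeed), not code from the article:

        // PLINQ: cap the query's parallelism so the host system keeps some headroom.
        feed.Channel.Items
            .AsParallel()
            .WithDegreeOfParallelism(2)
            .ForAll(item => SaveRssFeedItem(item, blog.Id, blog.CreatedById));

        // The TPL equivalent for loops uses ParallelOptions.MaxDegreeOfParallelism.
        var options = new ParallelOptions
        {
            // Leave one core free; Math.Max guards the single-core case.
            MaxDegreeOfParallelism = Math.Max(1, Environment.ProcessorCount - 1)
        };
        Parallel.ForEach(blogs, options, b => ImportFeed(b));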
    Currently you can download Visual Studio Async to get some idea of what is coming.

    Conclusion
    Parallel programming is very challenging, but the good tools offered by Visual Studio and the .NET Framework make it much easier for us. In this posting we started with a feed aggregator that imports feed items serially. In two steps we parallelized feed importing and entry inserting, gaining a 2.3× improvement in performance. Although this number is specific to my test environment, it clearly shows that parallel programming may raise the performance of your application significantly.

    Read the article

  • SQL SERVER – Solution – 2 T-SQL Puzzles – Display Star and Shortest Code to Display 1

    - by pinaldave
    Earlier on this blog we asked two puzzles. The response from all of you was nothing short of amazing: I have received 350+ responses. Many are valid, and many were things I had not even thought about. I strongly suggest you read all the puzzles and their answers here - trust me, if you start reading the comments you will not stop till you read every single comment. Seriously, trust me on it. Personally, I have learned a lot from it. Let us recap the puzzles here quickly.

    Puzzle 1: Why does the following code, when executed in SSMS, display the result as a * (star)?

        SELECT CAST(634 AS VARCHAR(2))

    Puzzle 2: Write the shortest code that produces the result 1 without using any numbers in the SELECT statement.

    Bonus Q: How many different operating systems (OS) does NuoDB support?

    As I mentioned earlier, the participation was nothing short of amazing. I will write about the winners and the best answers in a short time. Meanwhile, I will give to-the-point answers to the puzzles above.

    Solution 1: When you convert character or binary expressions (char, nchar, nvarchar, varchar, binary, or varbinary) to an expression of a different data type, data can be truncated, only partially displayed, or an error can be returned because the result is too short to display. Conversions to char, varchar, nchar, nvarchar, binary, and varbinary are truncated, except for the conversions shown in the following table. (The text and table are referenced from MSDN; the original post reproduces the table.) Since 634 does not fit in VARCHAR(2), SQL Server displays a * instead.

    Solution 2: The shortest code to produce the answer 1:

        SELECT EXP($) or SELECT COS($) or SELECT DAY($)

    When you SELECT $, it gives the result 0.00, and EXP(0) is 1. I believe it is pretty neat. There were plenty of other answers, but this was the shortest. Another, shorter answer would be PRINT EXP($), but no one proposed that, as the original question explicitly said SELECT.

    Bonus Answer: 5 OS: Windows, MacOS, Linux, Solaris, Joyent SmartOS (reference)

    Please do read every single comment here. Do leave a comment saying which one you think is the best of all the comments. Meanwhile, if there is a better solution and I have missed it, do let me know, as we still have time to correct it. I will be selecting the winner before the weekend as I go through each and every one of the 350 comments. I will be selecting the best comments along with the winning comment. If our selection matches – one of you may still win something cool.

    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology Tagged: NuoDB

    Read the article

  • Silverlight Cream for May 20, 2010 -- #866

    - by Dave Campbell
    In this Issue: Mike Snow, Victor Gaudioso, Ola Karlsson, Josh Twist(-2-), Yavor Georgiev, Jeff Wilcox, and Jesse Liberty.

    Shoutouts:
    - Frank LaVigne has an interesting observation on his site: The Big Take-Away from MIX10
    - Rishi has updated all his work, including a release of nRoute to the latest bits: nRoute Samples Revisited
    - Looks like I posted one of Erik Mork's links two days in a row :) ... that's because I meant to post this one: Silverlight Week – How to Choose a Mobile Platform
    - Just in case you missed it (and for me to find it easily), Scott Guthrie has an excellent post up on Silverlight 4 Tools for VS 2010 and WCF RIA Services Released

    From SilverlightCream.com:

    Silverlight Tip of the Day #23 – Working with Strokes and Shapes
    Mike Snow's Silverlight Tip of the Day number 23 is up, about strokes and shapes -- as in dotted and dashed lines.

    New Silverlight Video Tutorial: How to Fire a Visual State based upon the value of a Boolean Variable
    Victor Gaudioso's latest video tutorial is up, on selecting and firing a visual state based on a boolean... project included.

    Simultaneously calling multiple methods on a WCF service from silverlight
    Ola Karlsson details a problem he had where he was calling multiple WCF services to pull all his data and had problems... turns out it was a blocking call, and he found the solution in the forums and details it all for us... actually, a search at SilverlightCream.com would have found one of the better posts listed once you knew the problem :)

    Securing Your Silverlight Applications
    Josh Twist has an article in MSDN on Silverlight security. He talks about Windows, Forms, and .NET authorization, then WCF, WCF Data Services, cross-domain access, and XAP files. He also has some good external links.

    Template/View selection with MEF in Silverlight
    Josh Twist points out that this next article is just a simple demonstration, but he's discussing, and provides code for, a MEF-driven ViewModel navigation scheme with animation on the navigation.

    Workaround for accessing some ASMX services from Silverlight 4
    Are you having problems hitting your ASMX web service with Silverlight 4? Yeah... others are too! Yavor Georgiev at the Silverlight Web Services Team blog has a post up about it... why it's a sometimes-problem, and a workaround for it.

    Using Silverlight 4 features to create a Zune-like context menu
    Jeff Wilcox used Silverlight 4 and the Toolkit to create some samples of menus, then demonstrates a duplication of the Zune menu.

    You Already Are A Windows Phone 7 Programmer
    Jesse Liberty is demonstrating the fact that Silverlight developers are WP7 developers by creating a Silverlight and a WP7 app side by side using the same code... this is a closer look at the Silverlight TV presentation he did.

    Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • SQLOS and Cloud Infrastructure sessions at PASS Summit 2012

    - by SQLOS Team
    The SQL PASS Summit 2012, the largest yet, is in full swing. Here's a summary of this week's sessions on cloud infrastructure and SQLOS topics. Some of these were today, and you can catch the recordings. One more session takes place on Friday, covering SQL Server solution patterns in Windows Azure VMs... Also, catch Thursday's keynote with Quentin Clark, which will feature a cool IaaS demo!

    SQL Server in Windows Azure VM Sessions

    CLD-309-A SQLCAT: Best Practices and Lessons Learned on SQL Server in an Azure VM
    Steve Howard, Arvind Ranasaria - Wednesday 11/6 10:15
    This session looked at best practices for optimizing networking, memory, disk IO, and high availability, based on lessons learned during SQLCAT work with customer deployments. Well worth catching the recording.

    SQL Server in Azure VM patterns: Hybrid Disaster Recovery, data movement and BI
    Guy Bowerman, Peter Saddow, Michael Washam, Ross LoForte - Friday 11/9 9:45, Rm 613
    [Note: In the guides this has an outdated title.] This session has a focus on SQL Server Azure VM solutions, starting with the basics and then going deeper into:
    - New features in the Microsoft Assessment and Planning Toolkit 8.0 to help plan and size SQL VM migrations.
    - A look at a Windows Azure VM SQL Server app making use of load balancing and SQL Server high availability features.
    - A BI case study running SQL BI components in Azure VMs and making use of Windows 8 tiles.
    - A training class in a VM case study.

    SQLOS Sessions

    DBA-500-HD Inside SQLOS 2012 (half-day session)
    Bob Ward - Wednesday 11/6 1:30pm
    Bob Ward from CSS applies his wealth of experience to look at the internals of SQLOS and what's changed in the various SQL 2012 components, including memory, Resource Governor, and the scheduler.

    DBA-403-M SQLCAT: Memory Manager Changes in SQL Server 2012
    Gus Apostol, Jerome Halmans - 1:30pm
    Covers the redesigned SQLOS memory manager in SQL Server 2012, including the new page allocator for any size pages (and all that implies), DMVs, and demos. Not sure why this was placed at the same time as the SQLOS half-day session, but since it's recorded it's available for catch-up.

    - Guy

    Originally posted at http://blogs.msdn.com/b/sqlosteam/

    Read the article

  • ASP.Net 4.5 Garbage Collection Improvement

    - by Aligned
    Originally posted on: http://geekswithblogs.net/Aligned/archive/2013/06/24/asp.net-4.5-garbage-collection-improvement.aspx

    I just read Five Great .NET Framework 4.5 Features on CodeProject by Shivprasad Koirala. Feature 5 in his article mentions background GC cleanup and has a good explanation of the work the GC has to do for ASP.NET on the server:

    "Garbage collector is one real heavy task in a .NET application. And it becomes heavier when it is an ASP.NET application. ASP.NET applications run on the server and a lot of clients send requests to the server thus creating loads of objects, making the GC really work hard for cleaning up unwanted objects."

    "To overcome the above problem, server GC was introduced. In server GC there is one more thread created which runs in the background. This thread works in the background and keeps cleaning…objects thus minimizing the load on the main GC thread. Due to double GC threads running, the main application threads are less suspended, thus increasing application throughput. To enable server GC, we need to use the gcServer XML tag and enable it to true."

        <configuration>
          <runtime>
            <gcServer enabled="true"/>
          </runtime>
        </configuration>

    This is not done by default. The MSDN information page says: "There are only two garbage collection options, workstation or server. For single-processor computers, the default workstation garbage collection should be the fastest option. Either workstation or server can be used for two-processor computers. Server garbage collection should be the fastest option for more than two processors. Use the GCSettings.IsServerGC property to determine if server garbage collection is enabled."

    "In the .NET Framework 4 and earlier versions, concurrent garbage collection is not available when server garbage collection is enabled. Starting with the .NET Framework 4.5, server garbage collection is concurrent. To use non-concurrent server garbage collection, set the <gcServer> element to true and the <gcConcurrent> element to false."

    So if you're using ASP.NET 4.5 and have a multi-core server, you should try turning on server garbage collection and do some profiling to see if it improves the performance of your site.
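    To verify at runtime that the switch took effect, you can read System.Runtime.GCSettings — a quick sketch of my own, not from the article:

        using System;
        using System.Runtime;

        class GcModeCheck
        {
            static void Main()
            {
                // True when <gcServer enabled="true"/> (or the host) turned server GC on.
                Console.WriteLine("Server GC: " + GCSettings.IsServerGC);

                // The latency mode hints at how the GC is running
                // (e.g. Interactive for concurrent workstation GC, Batch for non-concurrent).
                Console.WriteLine("Latency mode: " + GCSettings.LatencyMode);
            }
        }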

    Read the article

  • Using commands with ApplicationBarMenuItem and ApplicationBarButton in Windows Phone 7

    - by Laurent Bugnion
    Unfortunately, in the current version of the Windows Phone 7 Silverlight framework, it is not possible to attach a command to the ApplicationBarMenuItem and ApplicationBarButton controls. These two controls appear in the Application Bar, for example with the following markup:

        <phoneNavigation:PhoneApplicationPage.ApplicationBar>
          <shell:ApplicationBar x:Name="MainPageApplicationBar">
            <shell:ApplicationBar.MenuItems>
              <shell:ApplicationBarMenuItem Text="Add City" />
              <shell:ApplicationBarMenuItem Text="Add Country" />
            </shell:ApplicationBar.MenuItems>
            <shell:ApplicationBar.Buttons>
              <shell:ApplicationBarIconButton IconUri="/Resources/appbar.feature.video.rest.png" />
              <shell:ApplicationBarIconButton IconUri="/Resources/appbar.feature.settings.rest.png" />
              <shell:ApplicationBarIconButton IconUri="/Resources/appbar.refresh.rest.png" />
            </shell:ApplicationBar.Buttons>
          </shell:ApplicationBar>
        </phoneNavigation:PhoneApplicationPage.ApplicationBar>

    This code creates the application bar shown in the post's two screenshots (collapsed and expanded). ApplicationBar items are not, however, controls. A quick look at MSDN shows the class hierarchy for ApplicationBarMenuItem: it does not derive from FrameworkElement, or even from DependencyObject. Unfortunately, this prevents all the mechanisms that are normally used to attach a Command (for example a RelayCommand) to a control. For example, the attached behavior present in the class ButtonBaseExtension (from the Silverlight 3 version of the MVVM Light Toolkit) can only be attached to a DependencyObject. Similarly, Blend behaviors (such as EventToCommand from the toolkit's Extras library) need a FrameworkElement to work.

    Using code-behind
    The alternative is to use code-behind. As I said in my MIX10 talk, the MVVM police will not take your family away if you use code-behind (this quote was actually suggested to me by Glenn Block); the code-behind is there for a reason. In our case, invoking a command on the ViewModel requires the following code:

    In MainPage.xaml:

        <shell:ApplicationBarMenuItem Text="My Menu 1" Click="ApplicationBarMenuItemClick"/>

    In MainPage.xaml.cs:

        private void ApplicationBarMenuItemClick(object sender, System.EventArgs e)
        {
            var vm = DataContext as MainViewModel;

            if (vm != null)
            {
                vm.MyCommand.Execute(null);
            }
        }

    Conclusion
    Resorting to code-behind to bridge the gap between the View and the ViewModel is less elegant than using attached behaviors, either through an attached property or through a Blend behavior. It does, however, work fine. I don't have any information on whether future changes to the Windows Phone 7 Application Bar API will make this easier. In the meantime, I would recommend using code-behind as shown.

    Laurent Bugnion (GalaSoft) Subscribe | Twitter | Facebook | Flickr | LinkedIn
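    If several menu items and buttons need wiring, one possible refinement — my sketch, assuming the page's DataContext is always a MainViewModel, not code from the post — is to centralize the cast and the CanExecute check in a single helper:

        private void ExecuteViewModelCommand(Func<MainViewModel, System.Windows.Input.ICommand> selector)
        {
            var vm = DataContext as MainViewModel;
            if (vm == null)
            {
                return;
            }

            var command = selector(vm);
            if (command != null && command.CanExecute(null))
            {
                command.Execute(null);
            }
        }

        private void ApplicationBarMenuItemClick(object sender, System.EventArgs e)
        {
            // Hypothetical: forward the click to the ViewModel's command.
            ExecuteViewModelCommand(vm => vm.MyCommand);
        }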

    Read the article

  • Silverlight Cream for December 26, 2010 -- #1015

    - by Dave Campbell
    In this all-submittal Issue: Michael Washington(-2-), Ian T. Lackey(-2-, -3-), Sandrino Di Mattia, Colin Eberhardt(-2-), and Antoni Dol.

    Above the Fold:
    - Silverlight: "A Style for the Silverlight CoverFlow Control Slider" - Antoni Dol
    - WP7: "Getting the right behaviors in your Phone 7 App – Part 1 Phone Home" (and the other two parts) - Ian T. Lackey
    - Silverlight/WPF: "A Simplified Grid Markup for Silverlight and WPF" - Colin Eberhardt

    Shoutouts: Dennis Doomen has updated his Coding Guidelines and provided a new whitepaper, A4 cheat sheet, and VS2010 rule sets: December Update of the Coding Guidelines for C# 3.0 and C# 4.0

    From SilverlightCream.com:

    Windows Phone 7: Saving Data when Keyboard is visible
    Michael Washington takes a possible desktop approach to a data-saving issue on WP7... good solution, and one of the commenters brought up another.

    Windows Phone 7 View Model (MVVM) ApplicationBar
    Since I'm catching up, there's another post by Michael Washington... this one looks at the WP7 ApplicationBar and the issues you hit if you're trying to stay MVVM-proper. Michael gets around that by creating the AppBar with a behavior, and shares it with all of us!

    Getting the right behaviors in your Phone 7 App – Part 1 Phone Home
    Ian T. Lackey has begun a series where he packages common tasks into reusable behaviors. First up is a phone-dialer-launching action that can be dropped on any control in your app.

    Getting the right behaviors in your Phone 7 App – Part 2 Binding & Browsing
    In his next post, Ian T. Lackey digs into the WebBrowserTask and provides a behavior allowing you to launch a browser session straight to a URL from any WP7 control.

    Getting the right behaviors in your Phone 7 App – Part 3 Email 'em
    In his last post (all in one day), Ian T. Lackey looks at the EmailComposeTask, ending up with a behavior to pre-populate EmailAddress and Subject.

    Cracking a Microsoft contest or why Silverlight-WCF security is important
    Sandrino Di Mattia was working on an app while also having a page up for an MSDN/TechNet game, and noticed some interesting WCF traffic that he was easily able to get access to.

    A Simplified Grid Markup for Silverlight and WPF
    Colin Eberhardt built us all an attached property for the Grid control that bails us out from the ugly layout we always have to put into position... oh, also for WPF!

    #uksnow #silverlight The Movie! – Happy Christmas
    Colin Eberhardt also took some time to have fun with his Twitter/BingMaps mashup for the UKSnow hashtag... you can now play back the snowfall reports and mouse over the snowflakes to see the original tweet... very cool stuff, Colin!

    A Style for the Silverlight CoverFlow Control Slider
    Antoni Dol got tired of the Silverlight Slider in the CoverFlow control and crafted a very nice-looking style for it... check it out and grab the source.

    Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • XNA extending an existing Content type

    - by Maarten
    We are doing a game in XNA that reacts to music. We need to do some offline processing of the music data, and therefore we need a custom type containing the Song and some additional data:

        // Project AudioGameLibrary
        namespace AudioGameLibrary
        {
            public class GameTrack
            {
                public Song Song;
                public string Extra;
            }
        }

    We've added a Content Pipeline extension:

        // Project GameTrackProcessor
        namespace GameTrackProcessor
        {
            [ContentSerializerRuntimeType("AudioGameLibrary.GameTrack, AudioGameLibrary")]
            public class GameTrackContent
            {
                public SongContent SongContent;
                public string Extra;
            }

            [ContentProcessor(DisplayName = "GameTrack Processor")]
            public class GameTrackProcessor : ContentProcessor<AudioContent, GameTrackContent>
            {
                public GameTrackProcessor() { }

                public override GameTrackContent Process(AudioContent input, ContentProcessorContext context)
                {
                    return new GameTrackContent()
                    {
                        SongContent = new SongProcessor().Process(input, context),
                        Extra = "Some extra data" // Here we can do our processing on 'input'
                    };
                }
            }
        }

    Both the library and the pipeline extension are added to the game solution, and references are also added. When trying to use this extension to load "gametrack.mp3", we run into problems, however:

        // Project AudioGame
        protected override void LoadContent()
        {
            AudioGameLibrary.GameTrack gameTrack = Content.Load<AudioGameLibrary.GameTrack>("gametrack");
            MediaPlayer.Play(gameTrack.Song);
        }

    The error message:

        Error loading "gametrack". File contains Microsoft.Xna.Framework.Media.Song
        but trying to load as AudioGameLibrary.GameTrack.

    AudioGame contains references to both AudioGameLibrary and GameTrackProcessor. Are we maybe missing other references?

    EDIT: Selecting the correct content processor helped; it now loads the audio file correctly. However, when I try to process some data, e.g.:

        public override GameTrackContent Process(AudioContent input, ContentProcessorContext context)
        {
            int count = input.Data.Count; // With this commented out it works fine

            return new GameTrackContent()
            {
                SongContent = new SongProcessor().Process(input, context)
            };
        }

    it crashes with the following error:

        Managed Debugging Assistant 'PInvokeStackImbalance' has detected a problem in
        'C:\Users\Maarten\Documents\Visual Studio 2010\Projects\AudioGame\DebugPipeline\bin\Debug\DebugPipeline.exe'.
        Additional Information: A call to PInvoke function
        'Microsoft.Xna.Framework.Content.Pipeline!Microsoft.Xna.Framework.Content.Pipeline.UnsafeNativeMethods+AudioHelper::OpenAudioFile'
        has unbalanced the stack. This is likely because the managed PInvoke signature does not match
        the unmanaged target signature. Check that the calling convention and parameters of the PInvoke
        signature match the target unmanaged signature.

    Information from the logger right before the crash:

        Using "BuildContent" task from assembly "Microsoft.Xna.Framework.Content.Pipeline,
        Version=4.0.0.0, Culture=neutral, PublicKeyToken=842cf8be1de50553".
        Task "BuildContent"
        Building gametrack.mp3 -> bin\x86\Debug\Content\gametrack.xnb
        Rebuilding because asset is new
        Importing gametrack.mp3 with Microsoft.Xna.Framework.Content.Pipeline.Mp3Importer

    I'm experiencing exactly this: http://forums.create.msdn.com/forums/t/75996.aspx

    Read the article

  • Delegates in C#

    - by Jalpesh P. Vadgama
    I have used delegates in my programming since C# 2.0, but I have seen that there is a lot of confusion around them, so I have decided to blog about it. In this post I will explain delegate basics and the use of delegates in C#.

    What is a delegate?
    We can say a delegate is a type-safe function pointer that holds a method reference in an object. As per MSDN, it's a type that references a method. You can assign to a delegate any method with the same parameters and the same return type. Following is the syntax for a delegate:

        public delegate int Calculate(int a, int b);

    Here you can see that we have defined a delegate with two int parameters and an int return type. Now any method that matches this signature can be assigned to the delegate above. To understand the functionality of delegates, let's take the following simple example.

        using System;

        namespace Delegates
        {
            class Program
            {
                public delegate int CalculateNumber(int a, int b);

                static void Main(string[] args)
                {
                    int a = 5;
                    int b = 5;

                    CalculateNumber addNumber = new CalculateNumber(AddNumber);
                    Console.WriteLine(addNumber(a, b));
                    Console.ReadLine();
                }

                public static int AddNumber(int a, int b)
                {
                    return a + b;
                }
            }
        }

    Here in the above code you can see that I have created an object of the CalculateNumber delegate and assigned the static AddNumber method to it, where the AddNumber static method just returns the sum of two numbers. After that I call the method through the delegate and print the output to the console application. Now let's run the application; the output is as expected.

    That's it. You can see the output of the delegate after adding the numbers. Delegates can be used in a variety of scenarios. For example, in a web application you can use them to update one control's properties from another control's action, and you can also invoke a delegate when some UI interaction happens, such as a button click. Hope you liked it. Stay tuned for more. In the next post I am going to explain multicast delegates. Till then, happy programming.
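    As a small preview of the multicast behavior mentioned at the end — a minimal sketch of my own, not from the post — you can assign multiple methods to one delegate instance:

        using System;

        namespace Delegates
        {
            class MulticastDemo
            {
                public delegate void Notify(string message);

                static void Main()
                {
                    // += combines methods into one multicast delegate.
                    Notify notify = ToConsole;
                    notify += ToUpperConsole;

                    // A single invocation calls both targets, in the order they were added.
                    notify("delegates are type-safe function pointers");
                }

                static void ToConsole(string message)
                {
                    Console.WriteLine(message);
                }

                static void ToUpperConsole(string message)
                {
                    Console.WriteLine(message.ToUpper());
                }
            }
        }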

    Read the article

  • SharePoint 2010 Data Retrieval Techniques

    - by Jayant Sharma
    In SharePoint, we have two options for performing CRUD operations:
    1. using server-side code
    2. using client-side code

    Using server-side code, we have:
    1. CAML
    2. LINQ

    Using client-side code, we have:
    1. Client Object Model
       1.1 Managed Client Object Model
       1.2 Silverlight Client Object Model
       1.3 ECMA Client Object Model
    2. SharePoint Web Services
    3. ADO.NET Data Services (based on REST web services)
    4. RPC calls (owssvr.dll)

    Which of these options is used, and when, depends on your requirements. Every option has certain advantages and disadvantages, so before starting development on any new SharePoint project, it is important to understand the limitations of the different methods. The Server Object Model is used when our application is hosted on the same server on which SharePoint is installed, while client-side code is used to access SharePoint from a client system. In SharePoint 2010, the Client Object Model (COM) was introduced specifically to perform SharePoint operations from a client system. A concrete example follows after the comparison below.

    Advantages of CAML:
    - It is fast.
    - It can be used from all kinds of technology, like Silverlight or jQuery.
    - You can use the U2U CAML Query Builder to generate CAML queries.
    Disadvantages of CAML:
    - Error prone, as we can detect errors only at runtime.

    Advantages of LINQ:
    - Object-oriented technique (object-relational model).
    - The LINQ to SharePoint provider works with strongly typed list item objects, so IntelliSense is available.
    - No knowledge of CAML is needed.
    - Less error prone, as it uses C# syntax.
    - You can compare two fields of a SharePoint list.
    Disadvantages of LINQ:
    - List attachments are not supported by the SPMetal tool.
    - The Created By, Created, Modified, and Modified By fields are not created by the SPMetal tool.
    - Custom fields are not created by the SPMetal tool.
    - External lists are not supported.
    - Since LINQ generates a CAML query behind the scenes, it is slower than using CAML directly in code.

    Advantages of the Client Object Model:
    - Used to access SharePoint from a client system.
    - No web server is required at the client end.
    - Can use Silverlight and JavaScript to build a better, faster user experience.
    Disadvantages of the Client Object Model:
    - You cannot use RunWithElevatedPrivileges.
    - Cross-site-collection queries are not possible.
    - Fewer APIs are available.

    ADO.NET Data Services:
    - Only list-based operations are possible; other types of operations are not.

    SharePoint Web Services and RPC calls:
    - These were previously used in SharePoint 2007, but after the introduction of the Client Object Model, Microsoft recommends not using the web services to fetch data from SharePoint. In SharePoint 2010 they are available only for backward compatibility.

    Ref: http://msdn.microsoft.com/en-us/library/ee539764
    Jayant Sharma
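    To make the comparison concrete, here is a minimal Managed Client Object Model sketch (my example; the site URL, list title, and field names are hypothetical):

        using System;
        using Microsoft.SharePoint.Client;

        class ClientOmSample
        {
            static void Main()
            {
                using (var ctx = new ClientContext("http://server/sites/demo")) // hypothetical URL
                {
                    List tasks = ctx.Web.Lists.GetByTitle("Tasks"); // hypothetical list

                    // CAML filters on the server, so only matching items travel to the client.
                    var query = new CamlQuery
                    {
                        ViewXml = "<View><Query><Where><Eq>" +
                                  "<FieldRef Name='Status' /><Value Type='Text'>Active</Value>" +
                                  "</Eq></Where></Query></View>"
                    };

                    ListItemCollection items = tasks.GetItems(query);
                    ctx.Load(items);
                    ctx.ExecuteQuery(); // a single round trip to the server

                    foreach (ListItem item in items)
                        Console.WriteLine(item["Title"]);
                }
            }
        }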

    Read the article

  • K2 4.5 Quick Thoughts

    I just finished attending a webcast on K2 4.5, and I thought I'd share a few quick thoughts.

    Power User Story Improved
    Given it was just a presentation and I haven't actually played with it, the story seemed improved, and it is more believable that real-world power users would be able to define workflows in SharePoint. Power users who are comfortable with Excel functions may be able to build more worthwhile workflows, since there is new support for inline functions and conditions. The new Silverlight K2 designer seems pretty user friendly, though the dialog windows can really stack up, which may get confusing. I thought the neatest part was that a workflow can be defined starting from a SharePoint list's settings, which may be okay for some organizations and for simpler workflows that don't need to be defined and pushed through lots of testing in different environments. The standalone K2 Studio is back; in K2 2003 it was required because Visual Studio integration didn't exist. It's back now for use by power users who need functionality up to the point of code. Not sure if this…

    Administration/Installation
    Installation is supposed to be simplified, with unattended install and other details I didn't catch. Install and configuration have always seemed daunting to me, so anything that improves that is good. Related to that, there is a new tool that is meant to help diagnose issues in your installation; that may include figuring out missing permissions or services that aren't running. Also, all K2 SharePoint features are now deployed as solutions.

    Dynamic SQL Service Broker
    Create a SmartObject that goes against a table that you created, NOT the SmartBox. This seems promising and something that maybe should have been there all along.

    Reference Event
    Allows you to call functionality that you've referenced; in the sample shown it was calling a referenced web service. It seemed a little odd to me, because it was really like writing code using dialogs (call constructor, set timeout, call web service method).

    Help
    We were reminded that help.k2.com is a newish site that is supposed to be the MSDN of K2 for partners and customers.

    VS 2010 Support
    Still no hard date on this, but what we were told is approximately 90 days after VS 2010 is officially released.

    Read the article

  • Workaround: XNA 4 importing only part of 3d model from FBX

    - by Vitus
    Recently I ran into a problem with importing 3D models from FBX files: models were sometimes only partially imported. That is, when you draw a 3D model loaded from an FBX file processed by the content pipeline, you get only some of its meshes. "Sometimes" means that you get this error only for some files. The results of my research are below.
    For example, I have a 10 MB binary FBX file with a model; when I load it, the resulting Model instance contains only part of the meshes (the before and after screenshots are in the original post). Because models from other files import normally, I think it's a "bad format" file.
    When you add an FBX file to your XNA Content project and build it, the imported file is processed by the XNA FBX importer and processor. On MSDN I found that FbxImporter is designed to work with the 2006.11 version of the FBX format. My file is in FBX 2012 format. OK, so I need to convert it to the 2006 format. That can be done with Autodesk FBX Converter 2012.1. I tried converting it to other versions of the FBX format, but without success. I also tried importing my FBX file into 3ds Max, and it imported correctly. I then exported the model from 3ds Max, which generated another FBX that I added to my XNA project. After that I got the full model, rendered correctly! So the internal data structure of an FBX file matters more for correct XNA import than its version!
    Unfortunately, Autodesk FBX is not an open file format. If you want to work with FBX, you should use the Autodesk FBX SDK. That way you can manually read the content of an FBX file and use it any way you like. I then tried converting my source FBX file to DAE (Collada), and the resulting DAE file back to FBX, using FBX Converter (FBX -> DAE -> FBX). The resulting FBX file imports normally.
    Conclusion: correct XNA FbxImporter operation doesn't depend on the version (2006, 2011, etc.) or form (binary, ASCII) of the FBX file; the internal FBX data structure is much more important. To make an FBX "readable" for the XNA importer you can use a double conversion like FBX -> Collada -> FBX. You can also use the FBX SDK to manually load data from FBX.
    P.S. Autodesk FBX Converter 2012 is more than a simple converter. It provides tools like: FBX Explorer, which shows you the structure of an FBX file; FBX Viewer, which renders the content of an FBX and provides basic interaction like model movement and zoom; and FBX Take Manager, which allows you to work with embedded animations.
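    For reference, a minimal sketch of the loading and drawing path where the partial-import symptom shows up: with a "bad format" FBX, model.Meshes simply contains fewer meshes than expected. The asset name "mymodel" and the camera matrices are assumptions for illustration.

    using Microsoft.Xna.Framework;
    using Microsoft.Xna.Framework.Graphics;

    // In LoadContent(): the FBX has already been built by the content pipeline.
    Model model = Content.Load<Model>("mymodel"); // asset name is hypothetical

    // In Draw(): iterate the meshes that actually made it through the import.
    foreach (ModelMesh mesh in model.Meshes)
    {
        foreach (BasicEffect effect in mesh.Effects)
        {
            effect.World = Matrix.Identity;
            effect.View = viewMatrix;             // assumed camera view matrix
            effect.Projection = projectionMatrix; // assumed projection matrix
            effect.EnableDefaultLighting();
        }
        mesh.Draw();
    }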

    Read the article

  • InvokeRequired not reliable?

    - by marocanu2001
    InvalidOperationException: Cross-thread operation not valid: Control 'progressBar' accessed from a thread other than the thread it was created... Now this is not a nice way to start a day! Not even if it's a Monday! So you have seen this and already thought: come on, this is an old one, just check InvokeRequired before calling the method on the control, something like this:

        if (progressBar.InvokeRequired)
        {
            progressBar.Invoke(new MethodInvoker(delegate { progressBar.Value = 50; }));
        }
        else
        {
            progressBar.Value = 50;
        }

    Unfortunately this was not working the way I would have expected. I got this error, debugged, and though during debugging InvokeRequired had become true, the error was thrown on the branch that did not require Invoke. So there was a scenario where InvokeRequired returned false and accessing the control was still not on the right thread...
    The problem was that I kept showing and hiding the little form containing the progress bar. The progress bar was updated on an event, ProgressChanged, and I could not guarantee the little form was loaded by the time the event was raised. So, problem spotted: if none of the parents of the control you try to modify have been created at the time of the method invocation, InvokeRequired returns false! That causes your code to execute on the wrong thread. Of course, updating the UI before the window is created is not a legitimate action either; still, I would have expected a different error.
    MSDN: "If the control's handle does not yet exist, InvokeRequired searches up the control's parent chain until it finds a control or form that does have a window handle. If no appropriate handle can be found, the InvokeRequired method returns false. This means that InvokeRequired can return false if Invoke is not required (the call occurs on the same thread), or if the control was created on a different thread but the control's handle has not yet been created."
    Have a look at InvokeRequired's implementation:

        public bool InvokeRequired
        {
            get
            {
                HandleRef hWnd;
                int lpdwProcessId;
                if (this.IsHandleCreated)
                {
                    hWnd = new HandleRef(this, this.Handle);
                }
                else
                {
                    Control wrapper = this.FindMarshallingControl();
                    if (!wrapper.IsHandleCreated)
                    {
                        return false; // <==========
                    }
                    hWnd = new HandleRef(wrapper, wrapper.Handle);
                }
                int windowThreadProcessId = SafeNativeMethods.GetWindowThreadProcessId(hWnd, out lpdwProcessId);
                int currentThreadId = SafeNativeMethods.GetCurrentThreadId();
                return (windowThreadProcessId != currentThreadId);
            }
        }

    Here's a good article about this and a workaround: http://www.ikriv.com/en/prog/info/dotnet/MysteriousHang.html
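    A minimal defensive sketch that follows from the MSDN quote above: check IsHandleCreated yourself before trusting InvokeRequired. The method name SafeSetProgress and the decision to simply drop early updates are assumptions; you might instead queue them until the form loads.

    using System.Windows.Forms;

    void SafeSetProgress(ProgressBar bar, int value)
    {
        // If the handle doesn't exist yet, InvokeRequired can lie (return false),
        // so bail out instead of touching the control from the wrong thread.
        if (!bar.IsHandleCreated)
            return; // hypothetical choice: drop updates raised before the form is created

        if (bar.InvokeRequired)
            bar.Invoke(new MethodInvoker(delegate { bar.Value = value; }));
        else
            bar.Value = value;
    }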

    Read the article

  • Running Teamsite User Admin tool IWUSERADM.exe from ASP.NET

    - by Narendra Tiwari
    It has really been a head-scratching task for me. I've tried many options but nothing worked. Finally I found a workaround on Google to achieve this via the Task Scheduler.
    PROBLEM: When we run the Teamsite user administration command-line tool IWUSERADM.exe through ASP.NET, it gives the following error:

        Application popup: cmd.exe - Application Error : The application failed to initialize properly (0xc0000142). Click on OK to terminate the application.

    CAUSE: No specific cause; it seems to be a bug, supposedly resolved by this Microsoft patch: http://support.microsoft.com/kb/960266. There is nothing related to permissions here; my web application is impersonated with an administrator account. Of course, running a bat file under an admin account is a potential security threat, but for this scenario let's confine our discussion to running the command-line tool.
    RESOLUTION: I have not tried the patch, as I am not permitted to run it on the server. Below are the steps to achieve the requirement.
    1. Create a batch file which runs IWUSERADM.exe:

        echo - Add Teamsite User
        CD E:\Appli\GN00\iw-home\bin
        iwuseradm add-user %1

    2. Temporarily create a scheduled task and run the .bat file from ASP.NET code using the TaskScheduler library (http://www.codeproject.com/KB/cs/tsnewlib.aspx).
    3. Here is the function:

        private int AddTeamsiteUser(string strBatchFilePath, string strUser)
        {
            // Get a ScheduledTasks object for the local computer.
            ScheduledTasks st = new ScheduledTasks();

            // Create a task
            Task t;
            try
            {
                t = st.CreateTask("~AddTeamsiteUser");
            }
            catch
            {
                throw new Exception("Scheduled task ~AddTeamsiteUser already exists.");
            }

            t.ApplicationName = strBatchFilePath;
            t.Parameters = strUser;
            t.Comment = "Adding user to Teamsite Application";

            // Set the account under which the task should run.
            t.SetAccountInformation(yourLogin, yourPassword);
            t.Save();
            t.Run();
            Thread.Sleep(2000); // for sync issue

            // Remove the scheduled task
            st.DeleteTask("~AddTeamsiteUser");
            return t.ExitCode;
        }

    Below are a few resources related to the above scenario:
    - Task Scheduler Class Library for .NET: http://www.codeproject.com/KB/cs/tsnewlib.aspx
    - Run a .BAT file from ASP.NET: http://codebetter.com/blogs/brendan.tompkins/archive/2004/05/13/13484.aspx
    - TaskScheduler Class: http://msdn.microsoft.com/en-us/library/system.threading.tasks.taskscheduler.aspx
    - Application hangs while running iwuseradm.exe through ASP.Net: http://bytes.com/topic/asp-net/answers/733098-system-diagnostics-process-hangs
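    A hedged usage sketch for the function above: the batch file path, user name, and status label are placeholders, and the call assumes the impersonated identity (the yourLogin/yourPassword credentials in the function) has rights to create scheduled tasks on the server.

    // Hypothetical call site, e.g. in a button click handler:
    int exitCode = AddTeamsiteUser(@"E:\Appli\GN00\scripts\addTeamsiteUser.bat", "jdoe");
    if (exitCode != 0)
    {
        // iwuseradm returned a non-zero code; surface it to the admin UI
        lblStatus.Text = "iwuseradm failed with exit code " + exitCode;
    }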

    Read the article

  • Clone an Azure VM using Powershell

    - by jamiet
    In a few months' time I will, in association with Technitrain, be running a training course entitled Introduction to SQL Server Data Tools. I am currently working on putting together some hands-on lab material for the course delegates and have decided that, in order to save time in asking people to install software during the course, I am simply going to prepare a virtual machine (VM) containing all the software and lab material for each delegate to use. Given that I am an MSDN subscriber it makes sense to use Windows Azure to host those VMs, given that it will be close to, if not completely, free to do so. What I don't want to do, however, is separately build a VM for each delegate; I would much rather build one VM and clone it for each delegate. I've spent a bit of time figuring out how to do this using PowerShell and in this blog post I am sharing a script that will:
    - Prompt for some information (Azure credentials, Azure subscription name, VM name, username & password, etc.)
    - Create a VM on Azure using that information
    - Prompt you to sysprep the VM and image it (this part can't be done with PowerShell so has to be done manually; a link to instructions is provided in the script output)
    - Create three new VMs based on the image
    - Remove those three VMs
    Simply download the script and execute it within PowerShell; assuming you have an Azure account it should take about 20 minutes to execute (spinning up VMs and shutting them down isn't instantaneous). If you experience any issues please do let me know. There are additional notes below. Hope this is useful! @Jamiet
    Notes:
    - Obviously there isn't a lot of point in creating some new VMs and then instantly deleting them. However, this demo script does provide everything you need should you want to do any of these operations in isolation.
    - The names of the three VMs that get created will be suffixed with 001, 002, 003, but you can edit the script to call them whatever you like.
    - The script doesn't totally clean up after itself. If you specify a service name & storage account name that don't already exist then it will create them; however, it won't remove them when everything is complete. The created image file will also not be deleted. Removing these items can be done by visiting http://manage.windowsazure.com.
    - When creating the image, ensure you use the correct name (the script output tells you what name to use).
    The original post includes screenshots taken from running the script; when the third and final VM gets removed you are asked to confirm via a dialog, so select 'Yes'.

    Read the article

  • BizTalk 2009 - Custom Functoid Categories

    - by StuartBrierley
    I recently had cause to code a number of custom functoids to aid with some maps that I was writing. Once these were developed and deployed to C:\Program Files\Microsoft BizTalk Server 2009\Developer Tools\Mapper Extensions, a quick refresh allowed them to appear in the toolbox. After dropping them on a map and configuring the appropriate inputs, I tested the map to check that they worked as expected. All but one of the functoids worked as expected, but the final functoid appeared not to be firing at all.
    I had already tested the code in a simple test-harness application, so I was confident in it, but I still needed to figure out what the problem might be. Debugging the map helped me on the way; for some reason the functoid in question was not shown correctly: the functoid definition was wrong.
    After some investigation I found that the functoid category you assign when coding a custom functoid affects more than just the toolbox group it appears in; different functoid categories have different capabilities, including what they can link to. For example, a Logical functoid cannot provide content for an output element; it can only say whether the element exists. Map it via a Value Mapping functoid and the value of true or false can be seen in the output element.
    The functoid I was having problems with was one where I had used the XPath functoid category. That had seemed a good fit, as I was looking up content in a config file using XPath and I wanted the functoid to appear in the Advanced area. From the table below you can see that this category is marked as "Internal Use Only", preventing it from being used for custom functoids. Changing my category to String allowed the functoid to function as expected.

        Category         | Description                                                                                                    | Toolbox Group
        Assert           | Internal Use Only                                                                                              | Advanced
        Conversion       | Converts characters to and from numerics and converts numbers from one base to another.                       | Conversion
        Count            | Internal Use Only                                                                                              | Advanced
        Cumulative       | Performs accumulations of the value of a field that occurs multiple times in a source document and outputs a single output. | Cumulative
        DatabaseExtract  | Internal Use Only                                                                                              | Database
        DatabaseLookup   | Internal Use Only                                                                                              | Database
        DateTime         | Adds date, time, date and time, or adds days to a specified date, in output data.                              | Date/Time
        ExistenceLooping | Internal Use Only                                                                                              | Advanced
        Index            | Internal Use Only                                                                                              | Advanced
        Iteration        | Internal Use Only                                                                                              | Advanced
        Keymatch         | Internal Use Only                                                                                              | Advanced
        Logical          | Controls conditional behavior of other functoids to determine whether particular output data is created.      | Logical
        Looping          | Internal Use Only                                                                                              | Advanced
        MassCopy         | Internal Use Only                                                                                              | Advanced
        Math             | Performs specific numeric calculations such as addition, multiplication, and division.                        | Mathematical
        NilValue         | Internal Use Only                                                                                              | Advanced
        Scientific       | Performs specific scientific calculations such as logarithmic, exponential, and trigonometric functions.      | Scientific
        Scripter         | Internal Use Only                                                                                              | Advanced
        String           | Manipulates data strings by using well-known string functions such as concatenation, length, find, and trim.  | String
        TableExtractor   | Internal Use Only                                                                                              | Advanced
        TableLooping     | Internal Use Only                                                                                              | Advanced
        Unknown          | Internal Use Only                                                                                              | Advanced
        ValueMapping     | Internal Use Only                                                                                              | Advanced
        XPath            | Internal Use Only                                                                                              | Advanced

    Links:
    http://msdn.microsoft.com/en-us/library/microsoft.biztalk.basefunctoids.functoidcategory(BTS.20).aspx
    http://blog.eliasen.dk/CommentView,guid,d33b686b-b059-4381-a0e7-1c56e808f7f0.aspx
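    As an illustration, here is a minimal sketch of a custom functoid skeleton with the category set to String, the fix described above. The class name, resource identifiers, and lookup logic are assumptions; a real functoid also ships a resource assembly containing the name, tooltip, description, and bitmap.

    using Microsoft.BizTalk.BaseFunctoids;

    public class ConfigLookupFunctoid : BaseFunctoid
    {
        public ConfigLookupFunctoid() : base()
        {
            this.ID = 6001; // custom functoid IDs must be 6000 or above

            // Resource identifiers are placeholders; they must exist in your resource file.
            SetupResourceAssembly("MyFunctoids.Resources", typeof(ConfigLookupFunctoid).Assembly);
            SetName("IDS_CONFIGLOOKUP_NAME");
            SetTooltip("IDS_CONFIGLOOKUP_TOOLTIP");
            SetDescription("IDS_CONFIGLOOKUP_DESC");
            SetBitmap("IDS_CONFIGLOOKUP_BITMAP");

            this.SetMinParams(1);
            this.SetMaxParams(1);

            // String, not XPath: XPath is internal use only, so the functoid would
            // never fire; the trade-off is appearing in the String toolbox group.
            this.Category = FunctoidCategory.String;

            SetExternalFunctionName(GetType().Assembly.FullName,
                "MyFunctoids.ConfigLookupFunctoid", "LookupValue");

            AddInputConnectionType(ConnectionType.AllExceptRecord);
            OutputConnectionType = ConnectionType.AllExceptRecord;
        }

        // The method the generated map calls at runtime.
        public string LookupValue(string key)
        {
            // Placeholder body: a real implementation would query the config file via XPath.
            return System.Configuration.ConfigurationManager.AppSettings[key] ?? string.Empty;
        }
    }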

    Read the article

  • XNA: Networking gone totally out of sync

    - by MesserChups
    I'm creating a multiplayer interface for a 2D game some of my friends made, and I'm stuck with a huge latency/sync problem. I started by adapting my game to the MSDN XNA networking tutorial, and right now when I join a SystemLink network session (one host on PC and one client on Xbox) I can move two players and everything is OK, but a few minutes later the two machines start getting totally out of synchronization. When I move one player it takes 10 or 20 seconds (increasing with time) to take effect on the second machine. I've tried to:
    - Create a thread which calls NetworkSession.Update() continuously, as suggested on this forum; didn't work.
    - Call the Send() method one frame in 10, and the Receive() method every frame; didn't work either.
    I've cleaned my code, flushed all buffers at each call, and switched the host and client, but the problem still remains... I hope you have a solution because I'm running out of ideas... Thanks
    SendPackets() code:

        protected override void SendPackets()
        {
            if ((NetworkSessionState)m_networkSession.SessionState == NetworkSessionState.Playing) // Only while playing
            {
                // Write in the packet manager
                m_packetWriter.Write(m_packetManager.PacketToSend.ToArray(), 0, (int)m_packetManager.PacketToSend.Position);
                m_packetManager.ResetPacket(); // flush

                // Send the packets to all remote gamers
                foreach (NetworkGamer l_netGamer in m_networkSession.RemoteGamers)
                {
                    if (m_packetWriter.Length != 0)
                    {
                        FirstLocalNetGamer.SendData(m_packetWriter, SendDataOptions.None, l_netGamer);
                    }
                }
                m_packetWriter.Flush();
                m_packetWriter.Seek(0, 0);
            }
        }

    ReceivePackets() code:

        public override void ReceivePackets()
        {
            base.ReceivePackets();
            if ((NetworkSessionState)m_networkSession.SessionState == NetworkSessionState.Playing) // Only while playing
            {
                if (m_networkSession.LocalGamers.Count > 0) // Verify that there's at least one local gamer
                {
                    foreach (LocalNetworkGamer l_localGamer in m_networkSession.LocalGamers)
                    {
                        // Every LocalNetworkGamer must read to flush their stream.
                        // Keep reading while packets are available.
                        NetworkGamer l_oldSender = null;
                        while (l_localGamer.IsDataAvailable)
                        {
                            // Read a single packet; even if we are the host, we must read to clear the queue.
                            NetworkGamer l_newSender;
                            l_localGamer.ReceiveData(m_packetReader, out l_newSender);
                            if (l_newSender != l_oldSender)
                            {
                                if ((!l_newSender.IsLocal) && (l_localGamer == FirstLocalNetGamer))
                                {
                                    // Parse the PacketReader into a MemoryStream
                                    m_packetManager.Receive(new MemoryStream(m_packetReader.ReadBytes(m_packetReader.Length)));
                                }
                            }
                            l_oldSender = l_newSender;
                            m_packetReader.BaseStream.Flush();
                            m_packetReader.BaseStream.Seek(0, SeekOrigin.Begin);
                        }
                    }
                    m_packetManager.ParsePackets();
                }
            }
        }
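    For what it's worth, the pattern the MSDN tutorial expects is a single pump of NetworkSession.Update() from the game's own Update loop rather than a separate thread, since the XNA framework is generally not thread-safe. A minimal sketch, assuming a Game subclass that holds the session; the class name and the placement of the send/receive calls are assumptions:

    using Microsoft.Xna.Framework;
    using Microsoft.Xna.Framework.Net;

    public class NetGame : Game
    {
        NetworkSession m_networkSession; // assumed to be created/joined elsewhere

        protected override void Update(GameTime gameTime)
        {
            if (m_networkSession != null)
            {
                // Send and receive first, then pump the session exactly once
                // per frame, on the same thread that created it.
                SendPackets();
                ReceivePackets();
                m_networkSession.Update();
            }
            base.Update(gameTime);
        }

        void SendPackets() { /* as in the question above */ }
        void ReceivePackets() { /* as in the question above */ }
    }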

    Read the article

  • How to develop "Client script library" for ASP.net controls and how do these work?

    - by Niranjan Kala
    I have been working on the .NET platform for 2 years, and for the last 6 months I have been working with DevExpress controls. All these controls have client-side events which live under a ClientScript namespace for the particular control, specifying the ClientInstanceName and the methods and properties accessible on the client side. For example, Button1 is a ClientInstanceName and Button1.Text is a property, with methods like these:

        Button1.SetValue();
        Button1.GetValue();

    Among standard ASP.NET controls, buttons have the ClientClick event that fires before the server-side Click event. I have inspected and googled how to extend client-side functionality in ASP.NET controls; for example, to create a ClientInstanceName property for controls, or a CheckedChanged event for the CheckBox/RadioButton control. I have tried using these MSDN articles:
    - Injecting Client-Side Script from an ASP.NET Server Control
    - Working with Client-Side Script
    I got much information and many ideas from these articles on how to implement and extend these. Everything works on the client side:

        protected override void AddAttributesToRender(HtmlTextWriter writer)
        {
            base.AddAttributesToRender(writer);
            string script = @"return confirm(""%%POPUP_MESSAGE%%"");";
            script = script.Replace("%%POPUP_MESSAGE%%", this.PopupMessage.Replace("\"", "\\\""));
            writer.AddAttribute(HtmlTextWriterAttribute.Onclick, script);
        }

    Here it is just setting an attribute on the button: all the interaction is client-side, with no control from the server.
    Here is what I want to know: how can I implement such functionality, creating methods, properties, etc. on the client side? For example, I am creating a PopControl, as in the code snippet above, with behavior like the AJAX ModalPopupExtender, which has OK-button-related properties. AJAX controls can be directed to perform work from server-side code, e.g. Popup1.show(); how can I get the same server-side control over my own client-enabled controls, the way DevExpress controls do? I am learning how AJAX controls are created, but I do not want to depend on a ScriptManager or another control; just some extension to the standard controls. I am expecting ideas and implementation methods for such functionality.
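    One common pattern (a sketch under stated assumptions, not DevExpress's actual implementation) is to have the server control emit a small JavaScript object named after a ClientInstanceName property, so the page gets Popup1.Show()-style client methods. The property, class name, and script shape here are all hypothetical:

    using System.Web.UI;
    using System.Web.UI.WebControls;

    public class PopControl : WebControl
    {
        // Hypothetical property mirroring DevExpress's ClientInstanceName idea.
        public string ClientInstanceName { get; set; }

        protected override void OnPreRender(System.EventArgs e)
        {
            base.OnPreRender(e);
            if (!string.IsNullOrEmpty(ClientInstanceName))
            {
                // Emit a tiny client-side object exposing Show/Hide methods
                // that toggle this control's rendered element.
                string script =
                    "var " + ClientInstanceName + " = {" +
                    "  Show: function() { document.getElementById('" + ClientID + "').style.display = 'block'; }," +
                    "  Hide: function() { document.getElementById('" + ClientID + "').style.display = 'none'; }" +
                    "};";
                Page.ClientScript.RegisterStartupScript(
                    GetType(), ClientID + "_instance", script, true);
            }
        }
    }

    Server-side code can then "call" a client method by registering a startup script, e.g. Page.ClientScript.RegisterStartupScript(GetType(), "show", ClientInstanceName + ".Show();", true);, which runs when the page loads, with no ScriptManager required.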

    Read the article

< Previous Page | 102 103 104 105 106 107 108 109 110 111 112 113  | Next Page >