Search Results

Search found 59278 results on 2372 pages for 'time estimation'.


  • Is it possible to access/write an MS Access 2003 .mdb database at the same time?

    - by tintincute
    Hi, I have a problem. I have a user who created a database using MS Access 2003. The problem is, if he has the database open and is making changes, the other users can open the db but can't work in it; only after he exits the program can another user make changes. I would like to know if it's possible for them to work at the same time when they open the database. Thanks. I attached a .jpg here to show the program: www.freeimagehosting.net/image.php?ed11af4cc5.jpg additional jpg: http://www.freeimagehosting.net/image.php?3c60d8e046.jpg Additional question: I tried to do the "Splitting of Database" here, and after I clicked on Split I got an error: "The database engine couldn't lock the table, because it is already in use by another person or process"... what does that mean? Did I lock the table? www.freeimagehosting.net/image.php?fc52cfc486.jpg

    Read the article

  • Why does changing a truecrypt password take such a long time?

    - by Alex
    I am changing the password of a TrueCrypt file container. This takes around 1 minute. Why? time truecrypt --text --change /tmp/user1.tc --keyfiles= --new-keyfiles= --password=known --new-password=known --random-source=/dev/null If I use strace I see that it basically does not do anything: it simply reads lots of random data from /dev/urandom (even if I specified /dev/null as the random source) and finally changes the password: open("/dev/urandom", O_RDONLY) = 6 read(6, "\36&{\351\212\212\343\202\34\313\242\312I\326\235\245\224\300\354O)\270Q\200 \201J\227\224\311_\212\367"..., 640) = 640 close(6) = 0

    Read the article

  • Is it bad to put your computer in sleep mode every time?

    - by Ivo Flipse
    Often I have a lot of stuff open and don't feel like shutting down my laptop, so I just put it in sleep mode when I'm carrying it around. But I have no idea if this might have any disadvantages. So my question is: is it bad to put your computer in sleep mode every time? Things I'm wondering: Should I turn off my computer every once in a while? Will continuous use of sleep mode slow down my system in any way? Are there any bad side effects (in the long term)? Any thoughts? FYI, I'm using Windows 7 on a laptop.

    Read the article

  • SQLAuthority News – TechEd India – April 12-14, 2010 Bangalore – An Unforgettable Experience – An Op

    - by pinaldave
    TechEd India was one of the largest technology events in India led by Microsoft. This event was attended by more than 3,000 technology enthusiasts, making it one of the most well-organized events of the year. Though I have attended almost all the technology events here, I have not seen a bigger or better event in the Indian subcontinent than this. There are 21 Technical Tracks at Tech·Ed India 2010 that span more than 745 learning opportunities. I was fortunate enough to be a part of this whole event as a speaker as well as a delegate. TechEd India Speaker Badge and A Token of Lifetime Hotel Selection I presented three different sessions at TechEd India and was also a part of a panel discussion. (The details of the sessions are given at the end of this blog post.) Due to extensive traveling, I occasionally stay away from my family. For this reason, I took my wife Nupur and daughter Shaivi (8 months old) to the event along with me. We stayed at the same hotel where the event was organized, so as to maximize my time bonding with my family and, at the same time, have more time for networking with the technology community. The hotel, Lalit Ashok, is the largest and most luxurious venue one can find in Bangalore, located in the middle of the city. The hotel was a bit pricey, but looking at all the advantages, I decided to book there. Hotel Lalit Ashok Nupur Dave and Shaivi Dave Arrival Day – DAY 0 – April 11, 2010 I reached the event a day earlier, and that was a wise decision, for I was able to relax a bit and go over my presentation for the next day's course. I am the kind of person who likes to get everything ready ahead of time. I was also able to enjoy a pleasant evening with several Microsoft employees and my family friends. I even checked out the location where I would be doing presentations the next day. I was fortunate enough to meet Bijoy Singhal from Microsoft, who helped me out with a few of the logistics issues that occurred the day before. I was not aware of the fact that the very next day he was going to be “The Man” of the TechEd 2010 event. Vinod Kumar from Microsoft was really very kind as he talked to me regarding my upcoming session. He gave me some really helpful suggestions that I was able to incorporate into my presentation. Finally, I was able to meet Abhishek Kant from Microsoft; his valuable suggestions and unlimited passion have inspired many people like me to work with the Community. Pradipta from Microsoft was also around, extremely busy with logistics; however, even in those busy times, he found some spare time to chat with me and the other Community leaders. I also met Harish Ranganathan and Sachin Rathi, both from Microsoft. It was so interesting to listen to both of them talking about SharePoint. I just have no words to express how overwhelmed I was by all these passionate young guys: Pradipta, Vinod, Bijoy, Harish, Sachin and Abhishek (of course!). Map of TechEd India 2010 Event Day 1 – April 12, 2010 From morning until night, today was truly a very busy day for me. I had two presentations and one panel discussion for the day. Needless to say, I had a few meetings to attend as well. The day started with a keynote from S. Somasegar, where he announced the launch of Visual Studio 2010. The keynote area was really eye-catching because of the very large, bigger-than-life uniform screen. It was truly something to see. 
The title music of the keynote was very interesting, and it featured Bijoy Singhal as the model. It was fun to talk to him afterwards, when we laughed at jokes together about his modeling assignment. TechEd India Keynote Opening Featuring Bijoy TechEd India 2010 Keynote – S. Somasegar Time: 11:15am – 11:45am Session 1: True Lies of SQL Server – SQL Myth Buster Following the excellent keynote, I had my very first session on the subject of SQL Server Myth Buster. At first, I was a bit nervous, as this was my very first session right after the keynote, and during my presentation I saw lots of Microsoft Product Team members. Well, it really went well and I had a really good discussion with the attendees of the session. I felt that well begun was half done, and my confidence was regained. Right after the session, I met a few of my Community friends and had meaningful discussions with them on many subjects. The abstract of the session is as follows: In this 30-minute demo session, I am going to briefly demonstrate a few SQL Server myths and their resolutions, backing them up with demos. This demo presentation is a must-attend for all developers and administrators who come to the event. This is going to be a very quick yet fun session. Pinal Presenting session at TechEd India 2010 Time: 1:00 PM – 2:00 PM Lunch with Somasegar After the session I went to see my daughter, and then I headed right away to the lunch with S. Somasegar – the keynote speaker and senior vice president of the Developer Division at Microsoft. I really thank Abhishek, who made it possible for us. Because of his efforts, all the MVPs had the opportunity to meet such a legendary person and to talk with him about Microsoft technology. Though Somasegar holds such a high position at Microsoft, he is very polite and a real gentleman, and how I wish that everybody in the industry were like him. Believe me, if you spread love and kindness, then that is what you will receive back. As soon as lunch time was over, I ran to the session hall as my second presentation was about to start. Time: 2:30pm – 3:30pm Session 2: Master Data Services in Microsoft SQL Server 2008 R2 Business Intelligence was a subject widely talked about at TechEd. Everybody was interested in this subject, and I did not excuse myself from this great concept either. I consider myself fortunate that I was presenting on the subject of Master Data Services at TechEd. When I initially learned this subject, I had a bit of confusion about the usage of this tool. Later on, I decided that I would tackle how we developers and DBAs fail to understand something so simple, and even worse, create confusion about the technology. During system design, it is very important to have reference material or master lookup tables. Well, I talked about that subject and presented the session keeping it as the center of my talk. The session went very well and I received lots of interesting questions. I got many compliments for talking about this subject in a real-life scenario. I really thank Rushabh Mehta (CEO, Solid Quality Mentors India) for his supportive suggestions that helped me prepare the slide deck, as well as the subject. Pinal Presenting session at TechEd India 2010 The abstract of the session is as follows: SQL Server Master Data Services will ship with SQL Server 2008 R2 and will improve Microsoft’s platform appeal. 
This session provides an in-depth demonstration of MDS features and highlights important usage scenarios. Master Data Services enables a consistent decision-making process by allowing you to create, manage and propagate changes from a single master view of your business entities. Also, the MDS Master Data hub, which is a vital component, helps ensure the consistency of reporting across systems and delivers faster and more accurate results across the enterprise. We will talk about establishing the basis for a centralized approach to defining, deploying, and managing master data in the enterprise. Pinal Presenting session at TechEd India 2010 The day was still not over for me. I ran into several friends, but we were not able to keep our enthusiasm under control about all the rumors saying that SQL Server 2008 R2 was about to be launched tomorrow in the keynote. I then ran to my third and final technical event for the day: a panel discussion with the top technologists of India. Time: 5:00pm – 6:00pm Panel Discussion: Harness the Power of the Web – SEO and Technical Blogging As I had delivered two technical sessions by this time, I was a bit tired, but no less enthusiastic when I had to talk about blogging and technology. We discussed many different topics there. I told them that the most important aspect of any blog is its content. We discussed in depth the issues with plagiarism and how to avoid it. Another topic of discussion was how we technology bloggers can create awareness in the Community about what the right kind of blogging is and what morally and technically wrong acts are. A couple of questions were raised about what kind of liberty a person can have in terms of writing blogs. Well, it was generally agreed that a blog is mainly a representation of our ideas and thoughts; it should not be governed by external entities. As long as one is writing what they really want to say, not providing incorrect information and not practicing plagiarism, a blogger should be allowed to express himself. This panel discussion was supposed to be over in an hour, but the interest of the participants was remarkable, and so it was extended for 30 minutes more. Finally, we decided to bring the discussion to a close and agreed that we will continue the topic next year. TechEd India Panel Discussion on Web, Technology and SEO Surprisingly, the day was just beginning after all of this. By this time, I had met almost all the MVPs who arrived at the event, as well as many Microsoft employees. There were lots of Community folks present, too. I decided that I would go to meet several friends from the Community who continue to communicate with me on SQLAuthority.com. I also met Abhishek Baxi and had a good talk with him regarding Win Mobile and Twitter. He also took a very quick video of me wherein I spoke in my mother tongue, Gujarati. It was funny that I talked in Gujarati almost all day, but when I was talking in the interview I could not find the right Gujarati words to speak. I think we all think in English when we think about technology, so as to address universality. After meeting them, I headed towards the Speakers’ Dinner. Time: 8:00 PM – onwards Speakers Dinner The Speakers’ dinner was indeed a wonderful opportunity for all the speakers to get together and relax. We talked about so many different things, from XBOX to Hindi movies, and from SQL to samosas. I just cannot express how much fun I had. After a long evening, when I returned to my room and met Shaivi, I just felt instantly relaxed. 
Kids are really gifts from God. Today was a really long but exciting day. So many things happened in just one day: the Visual Studio launch, lunch with Somasegar, 2 technical sessions, 1 panel discussion, a community leaders meeting, the speakers dinner and, last but not least, playing with my child! A perfect day! Day 2 – April 13, 2010 Today started with a bang with the excellent keynote by Kamal Hathi, who launched SQL Server 2008 R2 in India and demonstrated the power of PowerPivot to all of us. 101 million rows in Excel brought lots of applause from the audience. Kamal Hathi Presenting Keynote at TechEd India 2010 The day was a bit easier for me. I had no sessions today and no events planned. I had a few meetings planned for the second day of the event. I sat in the speaker’s lounge for half a day and met many people there. I attended nearly 9 different meetings today. The subjects of the meetings were very different. Here is a list of the topics of the Community-related meetings: SQL PASS and its involvement in India and the subcontinent; How to start community blogging; Forums and developing aptitude towards technology; Ahmedabad/Gandhinagar User Groups and their developments; SharePoint and SQL; Business Meeting – a client meeting; Business Meeting – a potential performance tuning project; Business Meeting – Solid Quality Mentors (SolidQ); And family friends. Pinal Dave at TechEd India The day passed by so quickly during these meetings. In the evening, I headed to the Partners Expo with friends and checked out a few of the booths. I really wanted to talk about some of the products, but due to the freebies there was such a crowd that I finally decided to just take the contact details of the partners. I will now start sending them my queries and, hopefully, I will have my questions answered. Nupur and Shaivi also had one meeting to attend; it was with our family friend Vijay Raj. Vijay is also a person who loves technology, and loves it more than anybody. I see him growing and learning every day, but still remaining ‘human’. I believe that if someone acquires as much knowledge as him, that person will become either a computer or a cyborg. Yet Vijay is still a kind gentleman and remains our close family friend. Shaivi was really happy to play with Uncle Vijay. Pinal Dave and Vijay Raj Renuka Prasad, a Microsoft MVP, impressed me with his passion and knowledge of SQL. Every time he gives me credit for his success, I believe that he is being very humble. He has way more certifications than me and has worked many more years with SQL compared to me. He is an excellent photographer as well. Most of the photos in this blog post have been taken by him. I told him that if he ever wants a part-time job, he could do photography very well. Pinal Dave and Renuka Prasad I also met L Srividya from Microsoft, whom I was looking forward to meeting. She is such a bundle of knowledge that everyone would surely learn a lot from her. I was able to get a few minutes with her and, well, I felt confident. She enlightened me with SQL Server BI concepts, domain management, SQL Server security and a few other interesting details. I also had a wonderful time talking about SharePoint with fellow Solid Quality Mentor Joy Rathnayake. He is very passionate about SharePoint, but when you talk .NET and SQL with him, he is still overwhelmingly knowledgeable. In fact, while talking to him, I figured out that the recent training he delivered was on SQL Server 2008 R2. 
I joked with him that it hurts my ego that he is now more popular in SQL training and consulting than me. I am sure all of you agree that working with good people is a gift from God. I am fortunate enough to work with the best of the best industry experts. It was a great pleasure to hang out with my Community friends – Ahswin Kini, HimaBindu Vejella, Vasudev G, Suprotim Agrawal, Dhananjay, Vikram Pendse, Mahesh Dhola, Mahesh Mitkari, Manu Zacharia, Shobhan, Hardik Shah, Ashish Mohta, Manan, Subodh Sohani and Sanjay Shetty (of course!). (Please let me know if I met you at the event and forgot to list your name here.) Time: 8:00 PM – onwards Community Leaders Dinner After lots of meetings, I headed towards the Community Leaders dinner meeting and met almost all the folks I had met in the morning. The discussion was almost the same, but the really good thing was that we were enjoying it. The food was really good. Nupur was invited to the event, but Shaivi could not come. When Nupur tried to enter the event, she was stopped, as Shaivi did not have a pass to enter the dinner. Nupur explained that Shaivi is only 8 months old, does not eat outside food and could not stay by herself at this age, but the doorkeeper did not agree and said that without the entry details Shaivi could not go in, though Nupur could. Nupur called me on the phone and asked me to help her out. By the time I was outside, the organizer of the event had reached the door and happily approved Shaivi to join the party. Once in the party, Shaivi had lots of fun meeting so many people. Shaivi Dave and Abhishek Kant Dean Guida (Infragistics President and CEO) and Pinal Dave (SQLAuthority.com) Day 3 – April 14, 2010 Though it was the last day, I was very excited today, as I was about to present my very favorite session. Query Optimization and Performance Tuning is my domain expertise, and I make my living by consulting and training on the same. Today’s session was on the same subject and, as an additional twist, another subject, Spatial Database, was presented. I was always intrigued by Spatial Database and I have enjoyed learning about it; however, I had never thought about Spatial Indexing before it was decided that I would do this session. I really thank Solid Quality Mentor Dr. Greg Low for his assistance in helping me prepare the slide deck and also review the content. Furthermore, today was really what I call my ‘learning day’. So far I had not attended any session at TechEd and I felt a bit down about that. Everybody spends their valuable time and money to learn something new and exciting at TechEd, and I had not attended a single session, even though it was already the last day of the event. I did have a plan for the day, and I attended two technical sessions before my session on spatial database. I attended 2 sessions of Vinod Kumar. Vinod is a natural storyteller and there was no doubt that his sessions would be jam-packed. People attended his sessions simply because Vinod is the speaker. He has never once disappointed an audience; he is truly a good speaker. He knows his stuff very well. I personally do not think that in India he can be compared to anyone for SQL. Time: 12:30pm-1:30pm SQL Server Query Optimization, Execution and Debugging Query Performance I really had a fun time attending this session. Vinod made this session very interactive. The entire audience really got into the presentation and started participating in the event. 
Vinod presented a small Query Tuning problem, which any developer would have encountered, and solved it with their help in such a fashion that each developer felt he or she had already resolved it. For one question, I was the only one who was ready to answer, and Vinod told me in a light tone that I was not allowed to answer it! The audience really found it very amusing. There was a huge crowd around Vinod after the session. Vinod – A master storyteller! Time: 3:45pm-4:45pm Data Recovery / Consistency with CheckDB This session was much heavier than the earlier one, and I must say it is the most favorite session I have EVER attended in India. In this TechEd I only attended two sessions, but in my career I have attended numerous technical sessions, not only in India but all over the world. This session took my breath away. One by one, Vinod took different databases and started to corrupt them in different ways. Each database has some unique ways of getting corrupted. Once that was done, Vinod started to show DBCC CHECKDB and demonstrated how it can solve your problem. He finally fixed all the databases with this single tool. I do have a good knowledge of this subject, but let me honestly admit that I learned a lot from this session. I enjoyed and cheered during this session along with the other attendees. I had total satisfaction that, just like everyone, I took advantage of the event and learned something. I am now TECHnically EDucated. Pinal Dave and Vinod Kumar After two very interactive and informative SQL sessions from Vinod Kumar, it was next my turn to present on Spatial Database and Indexing. I got nervous once again, but Vinod told me to stay natural and do my presentation. Well, once I got a huge stage with a total of four projectors and a large crowd, I felt better. Time: 5:00pm-6:00pm Session 3: Developing with SQL Server Spatial and Deep Dive into Spatial Indexing Pinal Presenting session at TechEd India 2010 Pinal Presenting session at TechEd India 2010 I kicked off this session with Michael J Swart‘s beautiful spatial image. This session was the last one of the day but, to my surprise, I had more than 200 attendees. Slowly, the rain was starting outside and I was worried that the hall would not be full; despite this, there was not a single seat available in the first five minutes of the session. Thanks to all of you for attending my presentation. I demonstrated the map of the world (and India) and quickly explained what the Geography and Geometry data types in Spatial Database are. This session had an interesting story of indexing and comparison, as well as how different traditional indexes are from spatial indexing. Pinal Presenting session at TechEd India 2010 Due to the heavy rain during this event, the power went off for about 22 minutes (just an accident – nobody’s fault). During these minutes, there was no audio, no video and no light. I continued to address the mass of 200+ people without any audio device or PowerPoint. I must thank the audience, because not a single person left the session. They all stayed in their places; some moved closer to listen to me properly. I noticed that the curiosity and eagerness to learn new things was at its peak even though it was the very last session of TechEd. Everybody wanted to get the maximum knowledge out of this whole event. I was touched by the support from the audience. They listened and participated in my session even without any kind of technology (no ppt, no mike, no AC, nothing). 
During these 22 minutes, I completed my theory verbally. Pinal Presenting session at TechEd India 2010 After a while, we got the projector back online and we continued with some exciting demos. Many thanks to the Microsoft people who worked energetically in the background to get the backup power for the projector up. I had a very interesting demo wherein I overlaid Bangalore and Hyderabad on the India map and found the aerial distance between them. After finding the aerial distance, we browsed online and found that SQL Server estimates the aerial distance between these two cities quite precisely, as compared to the factual distance. There was huge applause from the crowd about the fact that SQL Server takes into account the curvature of the earth and finds precise distances based on those details. During the process of finding the distance, I demonstrated a few examples of the indexes, where I explained how one can use those indexes to find these distances and how they can improve the performance of similar queries. I also demonstrated a few examples wherein we were able to see for which data type the index is most useful. We finished the demos with a bit more internal stuff. Pinal Presenting session at TechEd India 2010 Despite all the issues, I was mostly satisfied with my presentation. I think it was the best session I have ever presented at any conference. There was no help from technology for a while, but I still got lots of appreciation at the end. When we ended the session, the applause from the audience was so loud that for a moment, the rain was not audible. I was truly moved by the dedication of the technology enthusiasts. Pinal Dave After Presenting session at TechEd India 2010 The abstract of the session is as follows: Microsoft SQL Server 2008 delivers new spatial data types that enable you to consume, use, and extend location-based data through spatial-enabled applications. Attend this session to learn how to use spatial functionality in the next version of SQL Server to build and optimize spatial queries. This session outlines the new geography data type to store geodetic spatial data and perform operations on it, the new geometry data type to store planar spatial data and perform operations on it, how to take advantage of new spatial indexes for high performance queries, how to use the new spatial results tab to quickly and easily view spatial query results directly from within Management Studio, how to extend spatial data capabilities by building or integrating location-enabled applications through support for spatial standards and specifications, and much more. Time: 8:00 PM – onwards Dinner by Sponsors After the lively sessions during the day, there was another dinner party courtesy of one of the sponsors of TechEd. All the MVPs and several Community leaders were present at the dinner. I would like to express my gratitude to Abhishek Kant for organizing this wonderful event for us. It was a blast and really relaxing from all angles. We all stayed there for a long time and talked about our sweet and unforgettable memories of the event. Pinal Dave and Bijoy Singhal It was really one wonderful event. After writing this much, I can say that I have no words to express how much I enjoyed TechEd. However, it is true that I have shared with you only 1% of the total activities I did at the event. There were so many people I met who are not mentioned here, although I wanted to write their names here, too. 
Anyway, I have learned so many things and up until now, I am not able to get over all the fun I had in this event. Pinal Dave at TechEd India 2010 The Next Days – April 15, 2010 – till today I am still not able to get my mind out of the whole experience I had at TechEd India 2010. It was like a whole Microsoft Family working together to celebrate a happy occasion. TechEd India – Truly An Unforgettable Experience! Reference : Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, MVP, Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, SQLServer, T SQL, Technology Tagged: TechEd, TechEdIn

    Read the article

  • 256 Windows Azure Worker Roles, Windows Kinect and a 90's Text-Based Ray-Tracer

    - by Alan Smith
    For a couple of years I have been demoing a simple render farm hosted in Windows Azure using worker roles and the Azure Storage service. At the start of the presentation I deploy an Azure application that uses 16 worker roles to render a 1,500-frame 3D ray-traced animation. At the end of the presentation, when the animation was complete, I would play the animation and delete the Azure deployment. The standing joke with the audience was that it was a “$2 demo”, as the compute charges for running the 16 instances for an hour were $1.92; factor in the bandwidth charges and it’s a couple of dollars. The point of the demo is that it highlights one of the great benefits of cloud computing: you pay for what you use, and if you need massive compute power for a short period of time, using Windows Azure can work out very cost effective. The “$2 demo” was great for presenting at user groups and conferences in that it could be deployed to Azure, used to render an animation, and then removed in a one-hour session. I have always had the idea of doing something a bit more impressive with the demo, and scaling it from a “$2 demo” to a “$30 demo”. The challenge was to create a visually appealing animation in high-definition format and keep the demo time down to one hour. This article will take a run through how I achieved this. Ray Tracing Ray tracing, a technique for generating high-quality photorealistic images, gained popularity in the 90’s with companies like Pixar creating feature-length computer animations, and also with the emergence of shareware text-based ray tracers that could run on a home PC. In order to render a ray-traced image, the ray of light that would pass from the viewpoint must be tracked until it intersects with an object. At the intersection, the color, reflectiveness, transparency, and refractive index of the object are used to calculate whether the ray will be reflected or refracted. Each pixel may require thousands of calculations to determine what color it will be in the rendered image. Pin-Board Toys Having very little artistic talent and a basic understanding of maths, I decided to focus on an animation that could be modeled fairly easily and would look visually impressive. I’ve always liked the pin-board desktop toys that became popular in the 80’s, and when I was working as a 3D animator back in the 90’s I always had the idea of creating a 3D ray-traced animation of a pin-board, but never found the energy to do it. Even if I had had a go at it, the render time to produce an animation that would look respectable on a 486 would have been measured in months. PolyRay Back in 1995 I landed my first real job, after spending three years being a beach-ski-climbing-paragliding bum, and was employed to create 3D ray-traced animations for a CD-ROM that school kids would use to learn physics. I got into the strange and wonderful world of text-based ray tracing, and was using a shareware ray tracer called PolyRay. PolyRay takes a text file describing a scene as input and, after a few hours of processing on a 486, produces a high-quality ray-traced image. The following is an example of a basic PolyRay scene file. 
background Midnight_Blue   static define matte surface { ambient 0.1 diffuse 0.7 } define matte_white texture { matte { color white } } define matte_black texture { matte { color dark_slate_gray } } define position_cylindrical 3 define lookup_sawtooth 1 define light_wood <0.6, 0.24, 0.1> define median_wood <0.3, 0.12, 0.03> define dark_wood <0.05, 0.01, 0.005>     define wooden texture { noise surface { ambient 0.2  diffuse 0.7  specular white, 0.5 microfacet Reitz 10 position_fn position_cylindrical position_scale 1  lookup_fn lookup_sawtooth octaves 1 turbulence 1 color_map( [0.0, 0.2, light_wood, light_wood] [0.2, 0.3, light_wood, median_wood] [0.3, 0.4, median_wood, light_wood] [0.4, 0.7, light_wood, light_wood] [0.7, 0.8, light_wood, median_wood] [0.8, 0.9, median_wood, light_wood] [0.9, 1.0, light_wood, dark_wood]) } } define glass texture { surface { ambient 0 diffuse 0 specular 0.2 reflection white, 0.1 transmission white, 1, 1.5 }} define shiny surface { ambient 0.1 diffuse 0.6 specular white, 0.6 microfacet Phong 7  } define steely_blue texture { shiny { color black } } define chrome texture { surface { color white ambient 0.0 diffuse 0.2 specular 0.4 microfacet Phong 10 reflection 0.8 } }   viewpoint {     from <4.000, -1.000, 1.000> at <0.000, 0.000, 0.000> up <0, 1, 0> angle 60     resolution 640, 480 aspect 1.6 image_format 0 }       light <-10, 30, 20> light <-10, 30, -20>   object { disc <0, -2, 0>, <0, 1, 0>, 30 wooden }   object { sphere <0.000, 0.000, 0.000>, 1.00 chrome } object { cylinder <0.000, 0.000, 0.000>, <0.000, 0.000, -4.000>, 0.50 chrome }   After setting up the background and defining colors and textures, the viewpoint is specified. The “camera” is located at a point in 3D space, and it looks towards another point. The angle, image resolution, and aspect ratio are specified. Two lights are present in the image at defined coordinates. The three objects in the image are a wooden disc to represent a table top, and a sphere and cylinder that intersect to form a pin that will be used for the pin board toy in the final animation. When the image is rendered, the following image is produced. The pins are modeled with a chrome surface, so they reflect the environment around them. Note that the scale of the pin shaft is not correct; this will be fixed later. Modeling the Pin Board The frame of the pin-board is made up of three boxes and six cylinders; the front box is modeled using a clear, slightly reflective solid with the same refractive index as glass. The other shapes are modeled as metal. object { box <-5.5, -1.5, 1>, <5.5, 5.5, 1.2> glass } object { box <-5.5, -1.5, -0.04>, <5.5, 5.5, -0.09> steely_blue } object { box <-5.5, -1.5, -0.52>, <5.5, 5.5, -0.59> steely_blue } object { cylinder <-5.2, -1.2, 1.4>, <-5.2, -1.2, -0.74>, 0.2 steely_blue } object { cylinder <5.2, -1.2, 1.4>, <5.2, -1.2, -0.74>, 0.2 steely_blue } object { cylinder <-5.2, 5.2, 1.4>, <-5.2, 5.2, -0.74>, 0.2 steely_blue } object { cylinder <5.2, 5.2, 1.4>, <5.2, 5.2, -0.74>, 0.2 steely_blue } object { cylinder <0, -1.2, 1.4>, <0, -1.2, -0.74>, 0.2 steely_blue } object { cylinder <0, 5.2, 1.4>, <0, 5.2, -0.74>, 0.2 steely_blue }   In order to create the matrix of pins that make up the pin board I used a basic console application with a few nested loops to create two intersecting matrices of pins, which models the layout used in the pin boards. The resulting image is shown below. The pin board contains 11,481 pins, with the scene file containing 23,709 lines of code. 
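A minimal sketch of that pin-matrix generation, written here in Python purely for illustration (the original was a small .NET console application); the pin dimensions, grid size and spacing below are assumptions, not values taken from the article:

    # Emit two interleaved grids of pins as PolyRay "object" declarations.
    # Illustrative sketch only: sizes, spacing and file name are assumed.
    def pin(x, y, z=0.0):
        """Return the PolyRay declarations for one pin (sphere head plus cylinder shaft)."""
        return (
            f"object {{ sphere <{x:.3f}, {y:.3f}, {z:.3f}>, 0.10 chrome }}\n"
            f"object {{ cylinder <{x:.3f}, {y:.3f}, {z:.3f}>, "
            f"<{x:.3f}, {y:.3f}, {z - 0.4:.3f}>, 0.05 chrome }}\n"
        )

    def write_pin_board(path, cols=100, rows=58, spacing=0.1):
        with open(path, "w") as scene:
            for row in range(rows):
                for col in range(cols):
                    # First matrix of pins.
                    scene.write(pin(col * spacing, row * spacing))
                    # Second matrix, offset by half the spacing so the two grids interleave.
                    scene.write(pin((col + 0.5) * spacing, (row + 0.5) * spacing))

    write_pin_board("pinboard.pi")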
For the complete animation 2,000 scene files will be created, which is over 47 million lines of code. Each pin in the pin-board will slide out a specific distance when an object is pressed into the back of the board. This is easily modeled by setting the Z coordinate of the pin to a specific value. In order to set all of the pins in the pin-board to the correct position, a bitmap image can be used: the position of each pin is set based on the color of the pixel at the corresponding position in the image. When the Windows Azure logo is used to set the Z coordinate of the pins, the following image is generated.

The challenge now was to make a cool animation. The Azure logo is fine, but it is static. Using a normal video to animate the pins would not work; the colors in the video would not correspond to the depth of the objects from the camera. In order to simulate the pin board accurately, a series of frames from a depth camera could be used.

Windows Kinect

The Kinect controllers for the X-Box 360 and Windows feature a depth camera. The Kinect SDK for Windows provides a programming interface for Kinect, providing easy access for .NET developers to the Kinect sensors. The Kinect Explorer provided with the Kinect SDK is a great starting point for exploring Kinect from a developer's perspective. Both the X-Box 360 Kinect and the Windows Kinect will work with the Kinect SDK; the Windows Kinect is required for commercial applications, but the X-Box Kinect can be used for hobby projects. The Windows Kinect has the advantage of providing a mode that allows depth capture with objects closer to the camera, which makes for a more accurate depth image for setting the pin positions.

Creating a Depth Field Animation

The depth field animation used to set the positions of the pins in the pin board was created using a modified version of the Kinect Explorer sample application. In order to simulate the pin board accurately, a small section of the depth range from the depth sensor will be used. Any part of the object in front of the depth range will result in a white pixel; anything behind the depth range will be black. Within the depth range the pixels in the image will be set to RGB values from 0,0,0 to 255,255,255. A screen shot of the modified Kinect Explorer application is shown below.

The Kinect Explorer sample application was modified to include slider controls that are used to set the depth range that forms the image from the depth stream. This allows the fine tuning of the depth image that is required for simulating the position of the pins in the pin board. The Kinect Explorer was also modified to record a series of images from the depth camera and save them as a sequence of JPEG files that will be used to animate the pins in the animation; the Start and Stop buttons are used to start and stop the image recording. An example of one of the depth images is shown below.

Once a series of 2,000 depth images has been captured, the task of creating the animation can begin.

Rendering a Test Frame

In order to test the creation of frames and get an approximation of the time required to render each frame, a test frame was rendered on-premise using PolyRay. The output of the rendering process is shown below. The test frame contained 23,629 primitive shapes, most of which are the spheres and cylinders that are used for the 11,800 or so pins in the pin board.
The 1280x720 image contains 921,600 pixels, but as anti-aliasing was used the number of rays that were calculated was 4,235,777, with 3,478,754,073 object boundaries checked. The test frame of the pin board with the depth field image applied is shown below.

The tracing time for the test frame was 4 minutes 27 seconds, which means rendering the 2,000 frames in the animation would take over 148 hours, or a little over 6 days. Although this is much faster than an old 486, waiting almost a week to see the results of an animation would make it challenging for animators to create, view, and refine their animations. It would be much better if the animation could be rendered in less than one hour.

Windows Azure Worker Roles

The cost of creating an on-premise render farm to render animations increases in proportion to the number of servers. The table below shows the cost of servers for creating a render farm, assuming a cost of $500 per server.

Number of Servers    Cost
1                    $500
16                   $8,000
256                  $128,000

As well as the cost of the servers, there would be additional costs for networking, racks, etc. Hosting an environment of 256 servers on-premise would require a server room with cooling, and some pretty hefty power cabling.

The Windows Azure compute services provide worker roles, which are ideal for performing processor intensive compute tasks. With the scalability available in Windows Azure, a job that takes 256 hours to complete could be performed using different numbers of worker roles. The time and cost of using 1, 16 or 256 worker roles is shown below.

Number of Worker Roles    Render Time    Cost
1                         256 hours      $30.72
16                        16 hours       $30.72
256                       1 hour         $30.72

Using worker roles in Windows Azure results in the same cost for the 256 hour job, irrespective of the number of worker roles used. Provided the compute task can be broken down into many small units, and the worker role compute power can be used effectively, it makes sense to scale the application so that the task is completed quickly, making the results available in a timely fashion. The task of rendering 2,000 frames in an animation is one that can easily be broken down into 2,000 individual pieces, which can be performed by a number of worker roles.

Creating a Render Farm in Windows Azure

The architecture of the render farm is shown in the following diagram. The render farm is a hybrid application with the following components:

On-Premise
- Windows Kinect – used in combination with the Kinect Explorer to create a stream of depth images.
- Animation Creator – this application uses the depth images from the Kinect sensor to create scene description files for PolyRay. These files are then uploaded to the jobs blob container, and job messages added to the jobs queue.
- Process Monitor – this application queries the role instance lifecycle table and displays statistics about the render farm environment and render process.
- Image Downloader – this application polls the image queue and downloads the rendered animation files once they are complete.

Windows Azure
- Azure Storage – queues and blobs are used for the scene description files and completed frames. A table is used to store the statistics about the rendering environment.

The architecture of each worker role is shown below. The worker role is configured to use local storage, which provides file storage on the worker role instance that can be used by the applications to render the image and transform the format of the image.
The service definition for the worker role, with the local storage configuration highlighted, is shown below.

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="CloudRay" >
  <WorkerRole name="CloudRayWorkerRole" vmsize="Small">
    <Imports>
    </Imports>
    <ConfigurationSettings>
      <Setting name="DataConnectionString" />
    </ConfigurationSettings>
    <LocalResources>
      <LocalStorage name="RayFolder" cleanOnRoleRecycle="true" />
    </LocalResources>
  </WorkerRole>
</ServiceDefinition>

The two executable programs, PolyRay.exe and DTA.exe, are included in the Azure project, with Copy Always set as the property. PolyRay will take the scene description file and render it to a Truevision TGA file. As the TGA format has not seen much use since the mid 90’s, it is converted to a JPG image using Dave's Targa Animator, another shareware application from the 90’s.

Each worker role will use the following process to render the animation frames.

1. The worker process polls the job queue; if a job is available, the scene description file is downloaded from blob storage to local storage.
2. PolyRay.exe is started in a process with the appropriate command line arguments to render the image as a TGA file.
3. DTA.exe is started in a process with the appropriate command line arguments to convert the TGA file to a JPG file.
4. The JPG file is uploaded from local storage to the images blob container.
5. A message is placed on the images queue to indicate a new image is available for download.
6. The job message is deleted from the job queue.
7. The role instance lifecycle table is updated with statistics on the number of frames rendered by the worker role instance, and the CPU time used.

The code for this is shown below.

public override void Run()
{
    // Set environment variables
    string polyRayPath = Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), PolyRayLocation);
    string dtaPath = Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), DTALocation);

    LocalResource rayStorage = RoleEnvironment.GetLocalResource("RayFolder");
    string localStorageRootPath = rayStorage.RootPath;

    JobQueue jobQueue = new JobQueue("renderjobs");
    JobQueue downloadQueue = new JobQueue("renderimagedownloadjobs");
    CloudRayBlob sceneBlob = new CloudRayBlob("scenes");
    CloudRayBlob imageBlob = new CloudRayBlob("images");
    RoleLifecycleDataSource roleLifecycleDataSource = new RoleLifecycleDataSource();

    Frames = 0;

    while (true)
    {
        // Get the render job from the queue
        CloudQueueMessage jobMsg = jobQueue.Get();

        if (jobMsg != null)
        {
            // Get the file details
            string sceneFile = jobMsg.AsString;
            string tgaFile = sceneFile.Replace(".pi", ".tga");
            string jpgFile = sceneFile.Replace(".pi", ".jpg");

            string sceneFilePath = Path.Combine(localStorageRootPath, sceneFile);
            string tgaFilePath = Path.Combine(localStorageRootPath, tgaFile);
            string jpgFilePath = Path.Combine(localStorageRootPath, jpgFile);

            // Copy the scene file to local storage
            sceneBlob.DownloadFile(sceneFilePath);

            // Run the ray tracer.
            string polyrayArguments =
                string.Format("\"{0}\" -o \"{1}\" -a 2", sceneFilePath, tgaFilePath);
            Process polyRayProcess = new Process();
            polyRayProcess.StartInfo.FileName =
                Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), polyRayPath);
            polyRayProcess.StartInfo.Arguments = polyrayArguments;
            polyRayProcess.Start();
            polyRayProcess.WaitForExit();

            // Convert the image
            string dtaArguments =
                string.Format(" {0} /FJ /P{1}", tgaFilePath, Path.GetDirectoryName(jpgFilePath));
            Process dtaProcess = new Process();
            dtaProcess.StartInfo.FileName =
                Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), dtaPath);
            dtaProcess.StartInfo.Arguments = dtaArguments;
            dtaProcess.Start();
            dtaProcess.WaitForExit();

            // Upload the image to blob storage
            imageBlob.UploadFile(jpgFilePath);

            // Add a download job.
            downloadQueue.Add(jpgFile);

            // Delete the render job message
            jobQueue.Delete(jobMsg);

            Frames++;
        }
        else
        {
            Thread.Sleep(1000);
        }

        // Log the worker role activity.
        roleLifecycleDataSource.Alive
            ("CloudRayWorker", RoleLifecycleDataSource.RoleLifecycleId, Frames);
    }
}

Monitoring Worker Role Instance Lifecycle

In order to get more accurate statistics about the lifecycle of the worker role instances used to render the animation, data was tracked in an Azure storage table. The following class was used to track the worker role lifecycles in Azure storage.

public class RoleLifecycle : TableServiceEntity
{
    public string ServerName { get; set; }
    public string Status { get; set; }
    public DateTime StartTime { get; set; }
    public DateTime EndTime { get; set; }
    public long SecondsRunning { get; set; }
    public DateTime LastActiveTime { get; set; }
    public int Frames { get; set; }
    public string Comment { get; set; }

    public RoleLifecycle()
    {
    }

    public RoleLifecycle(string roleName)
    {
        PartitionKey = roleName;
        RowKey = Utils.GetAscendingRowKey();
        Status = "Started";
        StartTime = DateTime.UtcNow;
        LastActiveTime = StartTime;
        EndTime = StartTime;
        SecondsRunning = 0;
        Frames = 0;
    }
}

A new instance of this class is created and added to the storage table when the role starts. It is then updated each time the worker renders a frame to record the total number of frames rendered and the total processing time. These statistics are used by the monitoring application to determine how effectively the resources in the render farm are being used.

Rendering the Animation

The Azure solution was deployed to Windows Azure with the service configuration set to 16 worker role instances. This allows the application to be tested in the cloud environment and its performance to be determined. When I demo the application at conferences and user groups I often start with 16 instances, and then scale up the application to the full 256 instances. The configuration to run 16 instances is shown below.
<?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="CloudRay" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*">   <Role name="CloudRayWorkerRole">     <Instances count="16" />     <ConfigurationSettings>       <Setting name="DataConnectionString"         value="DefaultEndpointsProtocol=https;AccountName=cloudraydata;AccountKey=..." />     </ConfigurationSettings>   </Role> </ServiceConfiguration>     About six minutes after deploying the application the first worker roles become active and start to render the first frames of the animation. The CloudRay Monitor application displays an icon for each worker role instance, with a number indicating the number of frames that the worker role has rendered. The statistics on the left show the number of active worker roles and statistics about the render process. The render time is the time since the first worker role became active; the CPU time is the total amount of processing time used by all worker role instances to render the frames.   Five minutes after the first worker role became active the last of the 16 worker roles activated. By this time the first seven worker roles had each rendered one frame of the animation.   With 16 worker roles u and running it can be seen that one hour and 45 minutes CPU time has been used to render 32 frames with a render time of just under 10 minutes.     At this rate it would take over 10 hours to render the 2,000 frames of the full animation. In order to complete the animation in under an hour more processing power will be required. Scaling the render farm from 16 instances to 256 instances is easy using the new management portal. The slider is set to 256 instances, and the configuration saved. We do not need to re-deploy the application, and the 16 instances that are up and running will not be affected. Alternatively, the configuration file for the Azure service could be modified to specify 256 instances.   <?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="CloudRay" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*">   <Role name="CloudRayWorkerRole">     <Instances count="256" />     <ConfigurationSettings>       <Setting name="DataConnectionString"         value="DefaultEndpointsProtocol=https;AccountName=cloudraydata;AccountKey=..." />     </ConfigurationSettings>   </Role> </ServiceConfiguration>     Six minutes after the new configuration has been applied 75 new worker roles have activated and are processing their first frames.   Five minutes later the full configuration of 256 worker roles is up and running. We can see that the average rate of frame rendering has increased from 3 to 12 frames per minute, and that over 17 hours of CPU time has been utilized in 23 minutes. In this test the time to provision 140 worker roles was about 11 minutes, which works out at about one every five seconds.   We are now half way through the rendering, with 1,000 frames complete. This has utilized just under three days of CPU time in a little over 35 minutes.   The animation is now complete, with 2,000 frames rendered in a little over 52 minutes. The CPU time used by the 256 worker roles is 6 days, 7 hours and 22 minutes with an average frame rate of 38 frames per minute. The rendering of the last 1,000 frames took 16 minutes 27 seconds, which works out at a rendering rate of 60 frames per minute. 
The frame counts in the server instances indicate that the use of a queue to distribute the workload has been very effective in spreading the load across the 256 worker role instances. The 16 instances that were deployed first have rendered between 11 and 13 frames each, whilst the 240 instances that were added when the application was scaled have rendered between 6 and 9 frames each.

Completed Animation

I’ve uploaded the completed animation to YouTube; a low resolution preview is shown below.

Pin Board Animation Created using Windows Kinect and 256 Windows Azure Worker Roles

The animation can be viewed in 1280x720 resolution at the following link: http://www.youtube.com/watch?v=n5jy6bvSxWc

Effective Use of Resources

According to the CloudRay monitor statistics, the animation took 6 days, 7 hours and 22 minutes of CPU time to render; this works out at 152 hours of compute time, rounded up to the nearest hour. As the worker role instances are billed for the full hour, it may have been possible to render the animation using fewer than 256 worker roles. When deciding the optimal usage of resources, the time required to provision and start the worker roles must also be considered. In the demo I started with 16 worker roles, and then scaled the application to 256 worker roles. It would have been more optimal to start the application with maybe 200 worker roles, and utilize the full hour that I was being billed for. This would, however, have prevented showing the ease of scalability of the application.

The new management portal displays the CPU usage across the worker roles in the deployment. The average CPU usage across all instances is 93.27%, with over 99% used when all the instances are up and running. This shows that the worker role resources are being used very effectively.

Grid Computing Scenarios

Although I am using this scenario for a hobby project, there are many scenarios where a large amount of compute power is required for a short period of time. Windows Azure provides a great platform for developing these types of grid computing applications, and can work out very cost effective.

- Windows Azure can provide massive compute power, on demand, in a matter of minutes.
- The use of queues to manage the load balancing of jobs between role instances is a simple and effective solution.
- Using a cloud-computing platform like Windows Azure allows proof-of-concept scenarios to be tested and evaluated on a very low budget.
- No charges for inbound data transfer make the uploading of large data sets to Windows Azure Storage services cost effective. (Transaction charges still apply.)

Tips for using Windows Azure for Grid Computing Scenarios

I found a render farm a fairly simple scenario to implement using Windows Azure. I was impressed by the ease of scalability that Azure provides, and by the short time that the application took to scale from 16 to 256 worker role instances. In this case it was around 13 minutes; in other tests it took between 10 and 20 minutes. The following tips may be useful when implementing a grid computing project in Windows Azure.

- Using an Azure Storage queue to load-balance the units of work across multiple worker roles is simple and very effective; a minimal sketch of the submission side is shown after these tips. The design I have used in this scenario could easily scale to many thousands of worker role instances.
- Windows Azure accounts are typically limited to 20 cores. If you need to use more than this, a call to support and a credit card check will be required.
- Be aware of how the billing model works. You will be charged for worker role instances for the full clock hour in which the instance is deployed. Schedule the workload to start just after the clock hour has started.
- Monitor the utilization of the resources you are provisioning; ensure that you are not paying for worker roles that are idle.
- If you are deploying third party applications to worker roles, you may well run into licensing issues. Purchasing software licenses on a per-processor basis when using hundreds of processors for a short time period would not be cost effective.
- Third party software may also require installation onto the worker roles, which can be accomplished using start-up tasks. Bear in mind that adding a startup task and a possible re-boot will add to the time required for the worker role instance to start and activate. An alternative may be to use a prepared VM and use VM roles.
- Consider using the Windows Azure Autoscaling Application Block (WASABi) to autoscale the worker roles in your application. When using a large number of worker roles, the utilization must be carefully monitored; if the scaling algorithms are not optimal it could get very expensive!
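To illustrate the queue-based job distribution pattern mentioned in the tips above, the following is a rough sketch of what the submission side (the Animation Creator uploading scene files and queuing jobs) might look like. It assumes the Windows Azure StorageClient library of that era; the container, queue, and folder names are assumptions and are not taken from the original CloudRay source.

// Rough sketch of the job submission loop, assuming the Windows Azure
// StorageClient library: one blob and one queue message per scene file.
using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class AnimationJobSubmitter
{
    static void SubmitScenes(string connectionString, string sceneFolder)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);

        CloudBlobClient blobClient = account.CreateCloudBlobClient();
        CloudBlobContainer scenes = blobClient.GetContainerReference("scenes");
        scenes.CreateIfNotExist();

        CloudQueueClient queueClient = account.CreateCloudQueueClient();
        CloudQueue jobs = queueClient.GetQueueReference("renderjobs");
        jobs.CreateIfNotExist();

        // Upload each scene description file and queue a job message for it.
        foreach (string scenePath in Directory.GetFiles(sceneFolder, "*.pi"))
        {
            string sceneFile = Path.GetFileName(scenePath);
            scenes.GetBlobReference(sceneFile).UploadFile(scenePath);
            jobs.AddMessage(new CloudQueueMessage(sceneFile));
        }
    }
}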

    Read the article

  • ubuntu mount iso but some files are unreadable

    - by Chao
    I'm new to Linux and just installed ubuntu 12.04 amd64 this month. I had failed installing Texlive with texlive2012 iso image. I used the recommended command to mount: "mount -t iso9660 -o ro,loop,noauto /your/texlive2012.iso /mnt " But the installer failed to read some file. The iso is fine, I checked the md5. I extracted everything from iso with archive manager, and it installed successfully. So, why mount is not working? Thanks. UPDATE with furius iso mount tool, mount with Fuse, and it installed, with warning: Summary of warning messages during installation: Downloaded ./archive/calligra-type1.tar.xz, size equal, but md5sum differs; downloading again. While mount with Loop, it failed to install. Updated Error message from terminal, mounted with furius iso mount, loop. texlive2012-20120701_iso$ ./install-tl -gui Loading ./tlpkg/texlive.tlpdb Installing TeX Live 2012 from: . Platform: x86_64-linux = 'x86_64 with GNU/Linux' Distribution: inst (compressed) Directory for temporary files: /tmp Installing [0001/2481, time/total: ??:??/??:??]: 12many [3k] Installing [0002/2481, time/total: 00:00/00:00]: 2up [4k] Installing [0003/2481, time/total: 00:00/00:00]: Asana-Math [457k] Installing [0004/2481, time/total: 00:00/00:00]: ESIEEcv [2k] ... Installing [0265/2481, time/total: 00:10/01:09]: calctab [5k] Installing [0266/2481, time/total: 00:10/01:09]: calligra [42k] Installing [0267/2481, time/total: 00:10/01:09]: calligra-type1 [59k] Downloaded ./archive/calligra-type1.tar.xz, size equal, but md5sum differs; downloading again. ./tlpkg/installer/xz/xzdec.x86_64-linux: (stdin): File is corrupt tar: Unexpected EOF in archive tar: rmtlseek not stopped at a record boundary tar: Error is not recoverable: exiting now untar: untarring /home/lichao/ttt/temp/calligra-type1.tar failed (in /home/lichao/ttt/texmf-dist) untarring /home/lichao/ttt/temp/calligra-type1.tar failed, stopping install. Installation failed. Rerunning the installer will try to restart the installation. Or you can restart by running the installer with: install-tl --profile installation.profile [EXTRA-ARGS] ./install-tl: Could not write to install-tl.log, so flushing messages to stderr. Loading ./tlpkg/texlive.tlpdb Installing TeX Live 2012 from: . Platform: x86_64-linux = 'x86_64 with GNU/Linux' Distribution: inst (compressed) Directory for temporary files: /tmp Installer revision: 26794 Database revision: 26935 Installing [0001/2481, time/total: ??:??/??:??]: 12many [3k] Installing [0002/2481, time/total: 00:00/00:00]: 2up [4k] Installing [0003/2481, time/total: 00:00/00:00]: Asana-Math [457k] Installing [0004/2481, time/total: 00:00/00:00]: ESIEEcv [2k] Installing [0005/2481, time/total: 00:00/00:00]: FAQ-en [1k] ... Installing [0262/2481, time/total: 00:10/01:09]: c90 [2k] Installing [0263/2481, time/total: 00:10/01:09]: cachepic [5k] Installing [0264/2481, time/total: 00:10/01:09]: cachepic.x86_64-linux [1k] Installing [0265/2481, time/total: 00:10/01:09]: calctab [5k] Installing [0266/2481, time/total: 00:10/01:09]: calligra [42k] Installing [0267/2481, time/total: 00:10/01:09]: calligra-type1 [59k] Downloaded ./archive/calligra-type1.tar.xz, size equal, but md5sum differs; downloading again. untar: untarring /home/lichao/ttt/temp/calligra-type1.tar failed (in /home/lichao/ttt/texmf-dist) untarring /home/lichao/ttt/temp/calligra-type1.tar failed, stopping install. Installation failed. Rerunning the installer will try to restart the installation. 
Or you can restart by running the installer with: install-tl --profile installation.profile [EXTRA-ARGS] Segmentation fault (core dumped) I am sure that the iso is fine. I can open it with archive manager and all files are good. But after mounting it, even archive manager failed to open some files (which can be opened when the iso is opened in archive manager).

    Read the article

  • Why does Java's invokevirtual need to resolve the called method's compile-time class?

    - by Chris
    Consider this simple Java class: class MyClass { public void bar(MyClass c) { c.foo(); } } I want to discuss what happens on the line c.foo(). At the bytecode level, the meat of c.foo() will be the invokevirtual opcode, and, according to the documentation for invokevirtual, more or less the following will happen: Look up the foo method defined in compile-time class MyClass. (This involves first resolving MyClass.) Do some checks, including: Verify that c is not an initialization method, and verify that calling MyClass.foo wouldn't violate any protected modifiers. Figure out which method to actually call. In particular, look up c's runtime type. If that type has foo(), call that method and return. If not, look up c's runtime type's superclass; if that type has foo, call that method and return. If not, look up c's runtime type's superclass's superclass; if that type has foo, call that method and return. Etc.. If no suitable method can be found, then error. Step #3 alone seems adequate for figuring out which method to call and verifying that said method has the correct argument/return types. So my question is why step #1 gets performed in the first place. Possible answers seem to be: You don't have enough information to perform step #3 until step #1 is complete. (This seems implausible at first glance, so please explain.) The linking or access modifier checks done in #1 and #2 are essential to prevent certain bad things from happening, and those checks must be performed based on the compile-time type, rather than the run-time type hierarchy. (Please explain.)

    Read the article

  • In Ruby, what is a good way to filter all the methods of an object that contain the word "time" in them?

    - by Jian Lin
    I tried the following and it was partly working: >> s.methods.map {|n| n if n =~ /time/} => [nil, nil, nil, nil, nil, nil, nil, nil, "skip_time_zone_conversion_for_attri butes", nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, ni l, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, ni l, "timestamped_migrations", nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, n il, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, n il, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, n il, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, n il, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, "time_zone_aware _attributes", nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, n il, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, "default_timezone", nil, n il, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, n il, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, n il, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, n il, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, "recor d_timestamps", nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil] >> s.methods.each {|n| p n if n =~ /time/} "skip_time_zone_conversion_for_attributes" "timestamped_migrations" "time_zone_aware_attributes" "default_timezone" "record_timestamps" => ["extended_by", "before_create", "vote_ids=", "save_without_dirty", "_delete" , "touch", "daemonize", "after_destroy", "skip_time_zone_conversion_for_attribut es", "methods", "send", "to_query", "becomes", "after_validation", "store_full_s ti_class?", "save_with_transactions!", "autosave_associated_records_for_votes", "require_library_or_gem", "enum_for", "taint", "instance_variable_defined?", "ac [...] and the rest of the whole array >> s.methods.filter {|n| n =~ /time/} NoMethodError: undefined method `filter' for #<Array:0x4de6b00> from (irb):93

    Read the article

  • How do you obtain a formatted date and time for the current locale in C?

    - by jwaddell
    What C function should I call to obtain a formatted date and time for the locale where the program is being executed? I'm asking this question because I have run into a problem using the ClamAV daemon API. The VERSION command returns the date and time of the latest virus definitions, but the code uses a call to ctime to format it. As far as I can tell ctime does not format the datetime according to the current locale and uses the English abbreviations for days of the week and the month in the returned string. This causes problems as my Java program which uses the ClamAV API does respect the current locale and thus expects the day of the week and month name to have the local abbreviations. The datetime format would need to be in the same format as that produced by ctime: Www Mmm dd hh:mm:ss yyyy Where Www is the weekday, Mmm the month in letters, dd the day of the month, hh:mm:ss the time, and yyyy the year. I could rewrite the Java program to always assume English dates but I'd be happier to submit a patch to ClamAV as it seems like a bug on their side to me.

    Read the article

  • How accurately (in terms of time) does Windows play audio?

    - by MusiGenesis
    Let's say I play a stereo WAV file with 317,520,000 samples, which is theoretically 1 hour long. Assuming no interruptions of the playback, will the file finish playing in exactly one hour, or is there some occasional tiny variation in the playback speed such that it would be slightly more or slightly less (by some number of milliseconds) than one hour? I am trying to synchronize animation with audio, and I am using a System.Diagnostics.Stopwatch to keep the frames matching the audio. But if the playback speed of WAV audio in Windows can vary slightly over time, then the audio will drift out of sync with the Stopwatch-driven animation. Which leads to a second question: it appears that a Stopwatch - while highly granular and accurate for short durations - runs slightly fast. On my laptop, a Stopwatch run for exactly 24 hours (as measured by the computer's system time and a real stopwatch) shows an elapsed time of 24 hours plus about 5 seconds (not milliseconds). Is this a known problem with Stopwatch? (A related question would be "am I crazy?", but you can try it for yourself.) Given its usage as a diagnostics tool, I can see where a discrepancy like this would only show up when measuring long durations, for which most people would use something other than a Stopwatch. If I'm really lucky, then both Stopwatch and audio playback are driven by the same underlying mechanism, and thus will stay in sync with each other for days on end. Any chance this is true?
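    For reference, a minimal sketch of the kind of measurement I mean (comparing Stopwatch against the system clock over a long run) would be something like the following; the sampling interval is arbitrary.

    // Minimal sketch: periodically compare Stopwatch elapsed time with the
    // wall-clock time reported by DateTime to observe any long-term drift.
    using System;
    using System.Diagnostics;
    using System.Threading;

    class StopwatchDriftCheck
    {
        static void Main()
        {
            DateTime wallStart = DateTime.UtcNow;
            Stopwatch sw = Stopwatch.StartNew();

            while (true)
            {
                Thread.Sleep(TimeSpan.FromMinutes(10));   // arbitrary sampling interval

                TimeSpan wall = DateTime.UtcNow - wallStart;
                TimeSpan stopwatch = sw.Elapsed;

                Console.WriteLine("wall: {0}  stopwatch: {1}  drift: {2} ms",
                    wall, stopwatch, (stopwatch - wall).TotalMilliseconds);
            }
        }
    }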

    Read the article

  • How to parse time stamps with Unicode characters in Java or Perl?

    - by ram
    I'm trying to make my code as generic as possible. I'm trying to parse install time of a product installation. I will have two files in the product, one that has time stamp I need to parse and other file tells the language of the installation. This is how I'm parsing the timestamp public class ts { public static void main (String[] args){ String installTime = "2009/11/26 \u4e0b\u5348 04:40:54"; //This timestamp I got from the first file. Those unicode charecters are some Chinese charecters...AM/PM I guess //Locale = new Locale();//don't set the language yet SimpleDateFormat df = (SimpleDateFormat)DateFormat.getDateTimeInstance(DateFormat.DEFAULT,DateFormat.DEFAULT); Date instTime = null; try { instTime = df.parse(installTime); } catch (ParseException e) { // TODO Auto-generated catch block e.printStackTrace(); } System.out.println(instTime.toString()); } } The output I get is Parsing Failed java.text.ParseException: Unparseable date: "2009/11/26 \u4e0b\u5348 04:40:54" at java.text.DateFormat.parse(Unknown Source) at ts.main(ts.java:39) Exception in thread "main" java.lang.NullPointerException at ts.main(ts.java:45) It throws exception and at the end when I print it, it shows some proper date... wrong though. I would really appreciate if you could clarify me on these doubts How to parse timestamps that have unicode characters if this is not the proper way? If parsing is failed, how could instTime able to hold some date, wrong though? I know its some chinese,Korean time stamps so I set the locale to zh and ko as follows.. even then same error comes again Locale = new Locale("ko"); Locale = new Locale("ja"); Locale = new Locale("zh"); How can I do the same thing in Perl? I can't use Date::Manip package; Is there any other way?

    Read the article

  • Real time location tracking - windows program or browser based?

    - by mawg
    I want to track a few hundred, maybe a few thousand people in real time. Let's say that the hardware aspects are sorted out and I can get the data into a database. Now, I want to get it out and show it, in real-time. Weeeell ... "real-enough" time. Let's say that I want to draw a floorplan of a building and plot everyone every 1 to 5 seconds. (I might want to show only certain "kinds" of people at the click of a button; I will need datamining, etc, but let's stick with the worse case scenario). I am comfortable enough with PHP, though not this sort of thing. I personally would be happier with a windows app coded in Delphi, but the trend seems to be to make everything browser based. So, the question, I guess is whether a browser can handle this and whether there are compelling arguments for a windows-based or browser-based solution. If browser-based can handle this (displaying a few thousand data-points a second), and there are no overwhelming arguments for windows then I guess I will go for browser-based and learn a few new tricks. The obvious advantage being that I could also re-use a large part of my code for (vehicle) tracking on Google maps.

    Read the article

  • Creating serializeable unique compile-time identifiers for arbitrary UDT's.

    - by Endiannes
    I would like a generic way to create unique compile-time identifiers for any C++ user defined types. for example: unique_id<my_type>::value == 0 // true unique_id<other_type>::value == 1 // true I've managed to implement something like this using preprocessor meta programming, the problem is, serialization is not consistent. For instance if the class template unique_id is instantiated with other_type first, then any serialization in previous revisions of my program will be invalidated. I've searched for solutions to this problem, and found several ways to implement this with non-consistent serialization if the unique values are compile-time constants. If RTTI or similar methods, like boost::sp_typeinfo are used, then the unique values are obviously not compile-time constants and extra overhead is present. An ad-hoc solution to this problem would be, instantiating all of the unique_id's in a separate header in the correct order, but this causes additional maintenance and boilerplate code, which is not different than using an enum unique_id{my_type, other_type};. A good solution to this problem would be using user-defined literals, unfortunately, as far as I know, no compiler supports them at this moment. The syntax would be 'my_type'_id; 'other_type'_id; with udl's. I'm hoping somebody knows a trick that allows implementing serialize-able unique identifiers in C++ with the current standard (C++03/C++0x), I would be happy if it works with the latest stable MSVC and GNU-G++ compilers, although I expect if there is a solution, it's not portable.

    Read the article

  • Web pages that take a long time to load keep on reloading, just on Vista on my work n/w...

    - by Ralpharama
    I have a curious problem at work which I've been struggling with since the advent of Windows Vista. We send our own email newsletter out to 40,000+ people once a week. The sending code has been in place for years, it's in classic ASP/VBscript called through a browser and simply loops through each email address, sending it to them. The page takes 40 mins or more to run, so has a big timeout value to allow it to do so. All well and good, suddenly, after Windows Vista is installed on the work PCs, the email sending page behaved oddly - after a period of time it seems to reload the page, endlessly, so the first 20% of our users get multiple copies of the newsletter until we kill the process! If we run the code on an XP machine in the on the same office network, it works fine. If we run it on Vista outside the office, so, say, on my own ISP, then it also works fine! Note, same effect in IE and FF... So, something about my office network and Vista is causing this... I recently re-wrote the newsletter code so it would split the task into chunks of 100 users at a time, hoping this would fix it, but my most recent test shows that the office n/w vista machine once again reloads the same page over any over, even though it takes 1/10th of the time to run... Does anyone have any ideas what it might be, how I can prove it, or, better, how I can get round it? Thanks for your advice :)

    Read the article

  • If I write a framework that gets information from the Internet, should I make a delegate or use blocks?

    - by Time Machine
    Say I'm writing a publicly available framework for the Vimeo API. This framework needs to get information from the Internet. Because this can take some time, I need to use threading to prevent the UI from hanging. Foundation uses delegates for this, like NSURLConnectionDelegate. However, Game Kit uses blocks as callback functions. What is the recommended way of doing this? I know blocks aren't supported in standard GCC versions, but they require much less code for the one that uses my framework. Delegates, on the other hand, are real methods, and when protocols are used I'm sure the methods are implemented. Thanks.

    Read the article

  • How can I resolve "A clock interrupt was not received on a secondary processor within the allocated time interval."?

    - by dimas bimayakti
    I've been troubled by a laptop problem lately. It always shows a blue screen error every time I use it. Every time it blue screens, it shows: A clock interrupt was not received on a secondary processor within the allocated time interval. I don't understand what it means, and even though I bought a new HDD and RAM, and haven't installed any new programs, the blue screen always appears. Is there anyone who has the same problem as me and knows how to solve it?

    Read the article

  • Why so Long time span in creating Session Factory?

    - by vijay.shad
    Hi My project is web application running in the tomcat container. This application is a spring framework based hibernate application. The problem with this is it takes a lot of time when creates session factory. here is the logs 2010-04-15 23:05:28,053 DEBUG [SessionFactoryImpl] Session factory constructed with filter configurations : {} 2010-04-15 23:05:28,053 DEBUG [SessionFactoryImpl] instantiating session factory with properties: {java.vendor=Sun Microsystems Inc., sun.java.launcher=SUN_STANDARD, catalina.base=/usr/local/InstalledPrograms/apache-tomcat-6.0.20, sun.management.compiler=HotSpot Tiered Compilers, catalina.useNaming=true, os.name=Linux, sun.boot.class.path=/usr/java/jdk1.6.0_17/jre/lib/resources.jar:/usr/java/jdk1.6.0_17/jre/lib/rt.jar:/usr/java/jdk1.6.0_17/jre/lib/sunrsasign.jar:/usr/java/jdk1.6.0_17/jre/lib/jsse.jar:/usr/java/jdk1.6.0_17/jre/lib/jce.jar:/usr/java/jdk1.6.0_17/jre/lib/charsets.jar:/usr/java/jdk1.6.0_17/jre/classes, java.util.logging.config.file=/usr/local/InstalledPrograms/apache-tomcat-6.0.20/conf/logging.properties, java.vm.specification.vendor=Sun Microsystems Inc., hibernate.generate_statistics=true, java.runtime.version=1.6.0_17-b04, hibernate.cache.provider_class=org.hibernate.cache.EhCacheProvider, user.name=root, shared.loader=, tomcat.util.buf.StringCache.byte.enabled=true, hibernate.connection.release_mode=auto, user.language=en, java.naming.factory.initial=org.apache.naming.java.javaURLContextFactory, sun.boot.library.path=/usr/java/jdk1.6.0_17/jre/lib/i386, java.version=1.6.0_17, java.util.logging.manager=org.apache.juli.ClassLoaderLogManager, user.timezone=Canada/Pacific, sun.arch.data.model=32, java.endorsed.dirs=/usr/local/InstalledPrograms/apache-tomcat-6.0.20/endorsed, sun.cpu.isalist=, sun.jnu.encoding=UTF-8, file.encoding.pkg=sun.io, package.access=sun.,org.apache.catalina.,org.apache.coyote.,org.apache.tomcat.,org.apache.jasper.,sun.beans., file.separator=/, java.specification.name=Java Platform API Specification, java.class.version=50.0, user.country=US, java.home=/usr/java/jdk1.6.0_17/jre, java.vm.info=mixed mode, os.version=2.6.18-128.el5, path.separator=:, java.vm.version=14.3-b01, hibernate.jdbc.batch_size=25, java.awt.printerjob=sun.print.PSPrinterJob, sun.io.unicode.encoding=UnicodeLittle, package.definition=sun.,java.,org.apache.catalina.,org.apache.coyote.,org.apache.tomcat.,org.apache.jasper., java.naming.factory.url.pkgs=org.apache.naming, sun.rmi.dgc.client.gcInterval=3600000, user.home=/root, java.specification.vendor=Sun Microsystems Inc., java.library.path=/usr/java/jdk1.6.0_17/jre/lib/i386/server:/usr/java/jdk1.6.0_17/jre/lib/i386:/usr/java/jdk1.6.0_17/jre/../lib/i386:/usr/java/packages/lib/i386:/lib:/usr/lib, java.vendor.url=http://java.sun.com/, java.vm.vendor=Sun Microsystems Inc., hibernate.dialect=org.hibernate.dialect.MySQL5InnoDBDialect, sun.rmi.dgc.server.gcInterval=3600000, common.loader=${catalina.home}/lib,${catalina.home}/lib/*.jar, java.runtime.name=Java(TM) SE Runtime Environment, java.class.path=:/usr/local/InstalledPrograms/apache-tomcat-6.0.20/bin/bootstrap.jar, hibernate.bytecode.use_reflection_optimizer=false, java.vm.specification.name=Java Virtual Machine Specification, java.vm.specification.version=1.0, catalina.home=/usr/local/InstalledPrograms/apache-tomcat-6.0.20, sun.cpu.endian=little, sun.os.patch.level=unknown, hibernate.cache.use_query_cache=true, hibernate.connection.provider_class=org.springframework.orm.hibernate3.LocalDataSourceConnectionProvider, 
java.io.tmpdir=/usr/local/InstalledPrograms/apache-tomcat-6.0.20/temp, java.vendor.url.bug=http://java.sun.com/cgi-bin/bugreport.cgi, server.loader=, os.arch=i386, java.awt.graphicsenv=sun.awt.X11GraphicsEnvironment, java.ext.dirs=/usr/java/jdk1.6.0_17/jre/lib/ext:/usr/java/packages/lib/ext, user.dir=/, line.separator=, java.vm.name=Java HotSpot(TM) Server VM, hibernate.cache.use_second_level_cache=true, file.encoding=UTF-8, java.specification.version=1.6, hibernate.show_sql=true} 2010-04-15 23:08:53,516 DEBUG [AbstractEntityPersister] Static SQL for entity: com.vsd.model.Order There you can see a time delay of more than 3 minutes in executing these processes. My database is MySQL and the database server is running on the local machine only. The container environment is a CentOS Linux system. I am clueless about why it takes that much time to execute these processes, but when I do the same task from under Eclipse it does not take that much time. The development environment is Windows.

    Read the article

  • mysqld_safe Can't log to error log and syslog at the same time. Remove all --log-error configuration options for --syslog to take effect

    - by photon
    When I'm trying to install MySQL 5.5 community edition on my Ubuntu 10.04 by compiling the source code, I met the following problem: $ fg % 1 sudo ../bin/mysqld_safe --basedir=/usr/local/mysql_community_5.5/data --user=mysql --defaults-file=/etc/my.cnf [sudo] password for linnan: Sorry, try again. [sudo] password for linnan: 121023 09:26:21 mysqld_safe Can't log to error log and syslog at the same time. Remove all --log-error configuration options for --syslog to take effect. Internal program error (non-fatal): unknown logging method '/usr/local/mysql_community_5.5/log/mysql.log' 121023 09:26:21 mysqld_safe Logging to '/var/log/mysql/error.log'. Internal program error (non-fatal): unknown logging method '/usr/local/mysql_community_5.5/log/mysql.log' 121023 09:26:22 mysqld_safe Starting mysqld daemon with databases from /var/lib/mysql 121023 09:26:23 mysqld_safe mysqld from pid file /var/lib/mysql/ubuntu.pid ended It seems that the problem is related to log configuration. I've noticed a bugfix related to this problem: http://bugs.mysql.com/bug.php?id=50083 But I still have no idea how to solve it. The relative content in /etc/my.cnf: [mysqld] port = 3306 socket = /tmp/mysql.sock skip-external-locking key_buffer_size = 384M max_allowed_packet = 1M table_open_cache = 512 sort_buffer_size = 2M read_buffer_size = 2M read_rnd_buffer_size = 8M myisam_sort_buffer_size = 64M thread_cache_size = 8 query_cache_size = 32M # Try number of CPU's*2 for thread_concurrency thread_concurrency = 8 character-set-server=utf8 [mysqld-safe] basedir=/usr/local/mysql_community_5.5 datadir=/usr/local/mysql_community_5.5/data mysqld_safe_syslog.cnf: /etc/mysql/conf.d/mysqld_safe_syslog.cnf: [mysqld_safe] syslog

    Read the article

  • How can I make Cygwin open a new window each time I use a Window 7 keyboard shortcut?

    - by Michael Gundlach
    [Update: The short answer is, if an application in the 3rd thing on your taskbar, press WindowsKey+Shift+3 to open a new instance. Hooray!] I have Chrome and Cygwin on my taskbar. Chrome's shortcut is Ctrl-Alt-C (as set through right clicking the icon and putting Ctrl-Alt-C into Chrome - Properties - Shortcut Key). Cygwin's shortcut is Ctrl-Alt-T. When I press Ctrl-Alt-C, I get a new Chrome window. Great! It's as if I had shift-clicked on the Chrome icon. When I press Ctrl-Alt-T, I get a Cygwin window the first time, but after that I just get focused on the Cygwin window. As if I had simply clicked on the Cygwin icon and not shift-clicked. As if Cygwin wasn't able to have more than one instance open. I can still shift-click the icon to get more instances. I've tried with different keys than Ctrl-Alt-T and got the same behavior. Strangely, I've twice managed to get it into a state (via just clearing and setting the shortcut key over and over) where a shortcut WOULD open multiple instances -- but it was Ctrl-Alt-G both times, which doesn't make sense to my brain which has been trained to use Ctrl-Alt-T for years. Ctrl-Alt-G usually behaves just as poorly as Ctrl-Alt-T, except for the two times when magically it started behaving properly. So I'm thinking this is a Windows 7 bug (which has existed since Windows XP at least), but I'm hoping someone knows something I don't :)

    Read the article

  • SQL Server 2008 second instance times out when logging in -- but only the first time?

    - by Kromey
    This is a strange one that has plagued me for a while now. When logging in to the second instance of SQL Server 2008 on one of our database servers, we get a timeout error: Cannot connect to servername\mssqlserver2. Additional information: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. (Microsoft SQL Server)") (This is the error message when trying to connect with Microsoft SQL Server Management Studio; other tools experience the same error, but of course say it in different ways.) Immediately re-attempting to log in works just fine, so whatever the cause it's ephemeral! This happens regardless of user or authentication type (both Windows and SQL Server authentication methods are supported on this instance). What's even weirder, though, is that the first instance on this server has never once demonstrated this problem. Server is a Windows Server 2008 R2 virtual server, hosted in Microsoft Hyper-V (host is likewise Server 2008 R2). The server has 2GB of RAM, and seems to regularly be using 90% of that -- could low memory be the cause of this issue? I could see this second instance -- which is not used very often yet -- being swapped out to disk, and then taking too long to load back into memory to respond in time to the connection request, but I'd rather have more than just my own hunch before I go scheduling a downtime for this server (the first instance is used regularly) and then just throwing extra resources at it in the blind hope that the problem goes away.

    Read the article

  • Why does my Xcode default font start to look ugly after some time, until I restart?

    - by mystify
    I plugged in an external monitor. All resolutions match perfectly. MacBookPro LCD is closed. After restarting Xcode editor fonts look very bad. Only in Xcode. When I restart the mac and DON'T use an external monitor, fonts look all right again. When I attach the monitor and close the LCD of the MacBookPro, fonts look nice. Then I close Xcode and reopen it: Fonts suck. Only way to get fonts look good is to disconnect external monitor and reboot, then reconnect external monitor, close LCD, wait, hit any key and let the external monitor be the only one. Fonts look nice - until I restart Xcode. I think it happens any time Xcode is launched with external monitor attached and ugly fonts survive until reboot. Unplugging external monitor and restarting Xcode doesn't help. It seems like Xcode isn't antialiasing them properly after something happens. Is there a fix for this problem? EDIT: After trying a few more times, it seems it is possible to get fonts to look nice by disconnecting external monitor and reopening xcode. Here are some little snapshots: GOOD FONT: UGLY FONT: You can see how dirty the ugly font looks. It's very hard to read and hurts in the eyes. Believe me. It sucks. Sometimes the little "i" are almost invisible. I make use of the very eye-friendly Dusk style of Xcode (go to preferences and choose that, if you haven't already. A real pleasure for your eyes)

    Read the article

  • Is there a program to show programs loading during the boot process in real time?

    - by Gary M. Mugford
    Hi all, There are any number of programs that will show me WHAT will run during the boot process for Windows XP. I've always been partial to Mike Lin's version but there are several others, some of which are quite possibly superior. That's not the issue. What I'd really like is a program that would load first and then would list the programs that were about to load and then check them off as the programs loaded. This isn't something I necessarily need for myself. But certain family members get click happy as soon as they see the icon they eventually want to run and end up clicking on it. THIRTY TWO TIMES in one memorable crash-inducing spasm. If there was some way for 'progress' to be shown during the loading of from the various spots Windows auto-loads from, PLUS a BIG BANNER saying "Please do not move the mouse or click on anything until done.", I think I might cut down on my early morning family support calls significantly. I've tried a variety of searches, but I couldn't find the ones that show in real time in the forest of links to programs that will show the list after the fact. Any leads? If not, do any of you who write the after-the-fact listers want to take a shot at producing a utility to do what I think would be a relatively popular utility? Best of the season to all of you and yours. Thanks in advance for any replies, GM

    Read the article

  • Why does my DSL modem now need a reboot each time for my laptop to connect?

    - by msorens
    I have a rather peculiar home networking issue. For sometime my home network was purring along fine. I could turn on either of my laptops and they would quickly find and connect to my DSL modem (and thence the internet). Several days ago I unplugged my DSL modem for the first time in months. Upon turning it back on and waiting for the boot to finish, the lights on the panel indicated the DSL modem was fully operational, just as before. But that's not what happened. Not at all. Now when I turn on my Win7 laptop, the network icon in my system tray shows a small starburst; hovering over it the tooltip states "Not connected; connections are available". Clicking it lists several nearby networks including my own network showing a strong signal. If I click to connect, it attempts a connection but then I get a dialog stating "Windows was unable to connect to MyNet.". Turning off wireless on my laptop and turning it back on yields no difference. Running the network troubleshooter (which includes doing a repair on the network connection) yields no difference. The only remedy is to reboot the DSL modem (i.e. unplug it, wait a few seconds, then plug it back in). As soon as it goes online my laptop finds it and connects properly. To add one more twist to the story, this happened to me once before, several months ago. After a couple weeks, the situation resolved itself(!). Everything started working properly again, due to nothing I did. Final note: this problem only affects the wireless connection to the DSL modem. My desktop computer, connected via hardline to the DSL modem, connects fine when I turn it on. Any thoughts on why this is happening or how to fix it?

    Read the article

  • Inactive users in windows server after some time according to first login instead of defining a solid expiration date

    - by smhnaji
    We want to give access to some Windows Server users so they can remotely access our server and download from a special folder on it. The licenses we give to users are time based. There should be 1 month, 2 month, ..., 1 year, ... licenses. CURRENT SITUATION (WHAT I DON'T WANT): When users are created and added to the OS, a fixed expiration date is given. WHAT I WANT: Users' expiration dates should be calculated automatically from first login. The user might not need his account right when he purchases the license. In other words: when a user's license is purchased on Jan 1st, he can currently use the license until Feb 1st, no matter whether he really logs in or not. He cannot come on Feb 5th and begin using his license because it has expired by then. What I want is that when he comes on Feb 5th and begins using it, the license runs until March 5th. CLARIFICATION (Update after MDMarra's comment): The working environment is Windows Server 2012. By the word 'user', I mean native Windows Server users. Whenever a new person purchases a license from me, I create them manually using the net user command like this: net user ali pass /add /expires:2013-12-25

    Read the article

< Previous Page | 162 163 164 165 166 167 168 169 170 171 172 173  | Next Page >