Search Results

Search found 21349 results on 854 pages for 'thought space designs'.


  • POV Christmas Tree Is a Holiday-Themed DIY Electronics Project

    - by Jason Fitzpatrick
    If you’re looking for an electronics project with a bit of holiday cheer, this clever POV Christmas tree combines LEDs, motors, and a simple vision hack to create a glowing Christmas tree. POV (Persistence of Vision) hacks rely on your visual system's lag time; by taking advantage of that lag, POV displays can create the illusion of shapes and words where there are none. In the case of this Christmas tree hack, a spinning set of LED lights creates the illusion of a Christmas tree when, in reality, there are just a few LEDs suspended in space by wire. It's not a beginner-level project by any means, but it is a great way to practice surface-mounting electronics and polish up your PCB-making skills. Hit up the link below for the full tutorial. POV Christmas Tree [Instructables]
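
    (For the curious, the timing idea behind any POV display boils down to redrawing one angular "slice" of a bitmap as the LED arm sweeps past it. The rough sketch below is only a hypothetical illustration of that principle, not code from the tutorial.)

    ```python
    # Persistence-of-vision timing: light one column of the image per angular
    # step of the spinning arm; the eye blends the slices into a full picture.
    import time

    IMAGE = [            # 1 = LED on, 0 = LED off; each column is one slice
        "00100",
        "01110",
        "11111",
        "00100",
    ]
    ROTATION_PERIOD_S = 0.05           # time for one full revolution of the arm
    SLICES = len(IMAGE[0])             # angular resolution per revolution

    def set_leds(bits):
        # Placeholder for real hardware I/O (e.g. GPIO writes on a microcontroller).
        print("".join("*" if b == "1" else " " for b in bits))

    def spin_once():
        slice_time = ROTATION_PERIOD_S / SLICES
        for col in range(SLICES):
            set_leds([row[col] for row in IMAGE])   # light this angular slice
            time.sleep(slice_time)                  # until the arm reaches the next one

    spin_once()
    ```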

    Read the article

  • The way I think about Diagnostic tools

    - by Daniel Moth
    Every piece of software has issues, or as we like to call them, "bugs". That is not a discussion point, just a mere fact. It follows that an important skill for developers is being able to diagnose issues in their code. Of course we need to advance our tools and techniques so we can prevent bugs getting into the code in the first place (e.g. unit testing), but beyond designing great software, diagnosing bugs is an equally important skill. To diagnose issues, the most important assets are good techniques, skill, experience, and maybe talent. What also helps is having good diagnostic tools, and what helps further is knowing all the features they offer and how to use them. The following classification is how I like to think of diagnostics. Note that, as with any attempt to bucketize anything, you run into overlapping areas and blurry lines. Nevertheless, I will continue sharing my generalizations ;-)

    It is important to identify at the outset whether you are dealing with a performance issue or a correctness issue. If you have a performance issue, use a profiler. I hear people saying "I am using the debugger to debug a performance issue", and that is fine, but do know that a dedicated profiler is the tool for that job. Just because you don't need one all the time, profilers typically cost more, and you are not as familiar with them as you are with the debugger, doesn't mean you shouldn't invest in one instead of exclusively using the wrong tool for the job. Visual Studio has a profiler and a concurrency visualizer (for profiling multi-threaded apps). If you have a correctness issue, then you have several options - that's next :-)

    This is how I think of identifying a correctness issue:
    - Do you want a tool to find the issue for you at design time? The compiler is such a tool - it gives you an exact list of errors. Compilers now also offer warnings, which is their way of saying "this may be an error, but I am not smart enough to know for sure". There are also static analysis tools, which go a step further than the compiler in identifying issues in your code, sometimes with the aid of code annotations and other times just by pointing them at your raw source. An example is FxCop, and much more in Visual Studio 11 Code Analysis.
    - Do you want a tool to find the issue for you with code execution? Just like static tools, there are dynamic analysis tools that, instead of statically analyzing your code, analyze what your code does at runtime. Whether you have to set up some unit tests to invoke your code, manually run your app (and interact with it) under the tool, or use a script to execute your binary under the tool varies. The result is still a list of issues for you to address after the analysis is complete, or a pause of the execution when the first issue is encountered. If a code path was not taken, no analysis for it will exist, obviously. An example is the GPU race detection tool that I'll be talking about on the C++ AMP team blog. Another example is the MSR concurrency CHESS tool.
    - Do you want to find the issue yourself at design time, aided by a tool? Perform a code walkthrough on your own or with colleagues. There are code review tools that go beyond just diffing sources and help you with that aspect too. For example, there is a new one in Visual Studio 11, and searching with my favorite search engine yielded this article based on the Developer Preview.
    - Do you want to find the issue yourself with code execution? Use a debugger - let's break this down further next.

    This is how I think of debugging:
    - There is post-mortem debugging. That means your code has executed and you did something in order to examine what happened during its execution. This can vary from manual printf and other tracing statements, to trace events (e.g. ETW), to taking dumps. In all cases, you are left with some artifact that you examine after the fact (after code execution) to discern what took place, hoping it will help you find the bug. Learn how to debug dump files in Visual Studio.
    - There is live debugging, where you inspect the state of your program during its execution and try to find what the problem is. More from me in a separate post on live debugging.
    - There is a hybrid of live plus post-mortem debugging. This is, for example, what tools like IntelliTrace offer.

    If you are a tools vendor interested in the diagnostics space, it helps to understand where in the above classification your tool excels - where its primary strength is - so you can market it as such. Then it helps to see which of the other areas your tool touches on, and how you can make it even better there. Finally, see which areas your tool doesn't help with at all, and evaluate whether it should, or whether it should continue to stay clear of them. Even though the classification helps us think about this space, the reality is that the best tools are either extremely good in only one of these areas or, more often, very good across a number of them. Another approach is to offer a toolset covering all areas, with appropriate integration and hand-off points from one to the other. Anyway, with that brain dump out of the way, in follow-up posts I will dive into live debugging, and specifically live debugging in Visual Studio - stay tuned if that interests you. Comments about this post by Daniel Moth welcome at the original blog.
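
    To make the "post-mortem" bucket concrete, here is a minimal, hypothetical sketch (in Python rather than .NET, purely for illustration) of the simplest artifact-producing technique mentioned above: tracing to a log during execution and examining the file after the fact.

    ```python
    # Post-mortem tracing in its crudest form: timestamped events written during
    # execution, inspected after the run. The richer options in the post (ETW
    # events, dump files, IntelliTrace) follow the same examine-after-the-fact idea.
    import logging

    logging.basicConfig(filename="run.log", level=logging.DEBUG,
                        format="%(asctime)s %(levelname)s %(message)s")

    def process(order):
        logging.debug("processing order %s", order["id"])
        try:
            total = sum(item["price"] * item["qty"] for item in order["items"])
            logging.debug("order %s total=%.2f", order["id"], total)
            return total
        except Exception:
            # The stack trace lands in run.log - the artifact examined later.
            logging.exception("order %s failed", order["id"])
            raise

    process({"id": 42, "items": [{"price": 9.99, "qty": 2}]})
    ```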

    Read the article

  • Lego Sport Champions: Soccer – An 80s Style LEGO Video

    - by Asian Angel
    Are you a fan of LEGO and soccer? Then watch as these two teams use some fancy LEGO footwork to try and win the championship game in this nicely done retro-look video. Wait!! Is that player building a brick wall?? Lego Sport Champions: Soccer [YouTube]

    Read the article

  • Modelling photo-realistic grass in realtime

    - by sebf
    Hello, I see a number of tutorials on how to create good-looking grass for 3D renders, but I can't think how to model it for realtime use in a game's scenery. Sure, simple models with alpha cutouts can be used to create plants and trees in really awesome scenery, but what about a lawn? Are there any good tricks to achieve this effect? I tried with a simple 4-sided box and a small texture, and the number of objects needed for a decent appearance made Max grind to a halt. (I am thinking it may be possible with a shader, but that is a whole other area, so I thought I would just ask about anyone's experience with modelling it here.) Thanks!

    Read the article

  • Pyglet vs. PyQt

    - by L. De Leo
    I need to implement a really simple card game. As the game logic is written in Python, I chose to stick with a Python framework even though my goal is to develop a Windows-only version. I also don't like working with .NET, so I ruled out IronPython + WPF. I tried to write a simple prototype with Pyglet but soon discovered that I would have to do a lot of stuff by hand: things like detecting mouseover events, finding which card was clicked on, moving it, etc... very low level and unnecessary for my use case. So I thought it might be easier to do things in PyQt. Do you reckon it would be feasible to use PyQt for implementing a simple card game? Will I have higher-level events I can work with?
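
    For what it's worth, here is a quick sketch of the kind of higher-level events PyQt's Graphics View framework provides (assuming PyQt5; the class and names below are illustrative, not from the question): items get their own hover, press and release callbacks, and dragging comes for free via a flag, so there is no manual hit-testing.

    ```python
    # A card you can hover over and drag, using QGraphicsScene item events
    # instead of manual mouse math. Assumes PyQt5; all names are illustrative.
    import sys
    from PyQt5.QtCore import QRectF, Qt
    from PyQt5.QtGui import QBrush
    from PyQt5.QtWidgets import (QApplication, QGraphicsItem, QGraphicsRectItem,
                                 QGraphicsScene, QGraphicsView)

    class CardItem(QGraphicsRectItem):
        def __init__(self, x, y):
            super().__init__(QRectF(0, 0, 60, 90))
            self.setPos(x, y)
            self.setBrush(QBrush(Qt.white))
            self.setFlag(QGraphicsItem.ItemIsMovable)   # dragging handled for us
            self.setAcceptHoverEvents(True)             # mouse-over callbacks

        def hoverEnterEvent(self, event):
            self.setBrush(QBrush(Qt.yellow))            # visual mouse-over feedback

        def hoverLeaveEvent(self, event):
            self.setBrush(QBrush(Qt.white))

        def mouseReleaseEvent(self, event):
            print("card dropped at", self.pos())        # hook for the game logic
            super().mouseReleaseEvent(event)

    if __name__ == "__main__":
        app = QApplication(sys.argv)
        scene = QGraphicsScene(0, 0, 400, 300)
        for i in range(3):
            scene.addItem(CardItem(20 + i * 70, 100))
        view = QGraphicsView(scene)
        view.show()
        sys.exit(app.exec_())
    ```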

    Read the article

  • Sprite Kit - containsPoint for SKPhysicsBody?

    - by gj15987
    I have a ball bouncing around the screen. I can pick it up and drag it onto a "bucket". When my touches finish, I use the containsPoint function to check whether I have dropped the ball onto the bucket. This works fine; however, I actually want to check whether the ball was dropped onto the bucket node's physics body. My "bucket" is really just an oval, so I've given it a physics body of the same shape to keep the white space around the oval out of the physics simulation. I can't seem to find a "containsPoint" function for physics bodies. Can anyone advise on how I'd check for this? To summarise, I want to drop a node onto a specific part of another node (or its physics body) and trigger an event. Thanks in advance.
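
    One common workaround (a hypothetical sketch, not a Sprite Kit API): since the bucket is an oval, test the drop point against the ellipse equation directly, using the bucket node's position and size. Shown in Python for brevity; the same arithmetic ports straight into a touches-ended handler.

    ```python
    def point_in_ellipse(px, py, cx, cy, rx, ry):
        """True if (px, py) lies inside the axis-aligned ellipse centred at
        (cx, cy) with half-width rx and half-height ry."""
        nx = (px - cx) / rx
        ny = (py - cy) / ry
        return nx * nx + ny * ny <= 1.0

    # e.g. with the bucket node's position and half its frame size:
    # if point_in_ellipse(ball.x, ball.y, bucket.x, bucket.y, bucket_w / 2, bucket_h / 2):
    #     trigger_drop_event()
    ```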

    Read the article

  • Ubuntu 11 and 12 initially fast but later bogs down, CPU pegged

    - by uos??
    I started with Ubuntu 11 a few weeks ago. It's on a Dell M4300 with an OCZ SSD, default setup except that I've installed the proprietary NVIDIA graphics and Broadcom wireless drivers. Dual boot with Windows. If I cold boot into Ubuntu, it is very fast, just like the Windows experience that I'm used to. But SOMETHING happens (I haven't yet determined what) and the system gets incredibly slow and stays that way.

    At first I thought it had to do with Adobe Flash, because it seemed to be triggered by sites with Flash, but then I removed Flash and the problem remains. I thought it was just an overheating problem, but I've now upgraded to 12.04, which supposedly fixes the overheating problems I've read about. Perhaps the heat situation was brought on by Flash in my early cases? So I installed Jupiter for CPU management, but the thermometer reports a familiar Windows-side temperature of 53 degrees Celsius. Switching Jupiter to lower performance doesn't help. When I check the System Monitor application, sorting by CPU usage, there are no obvious problem processes. However, in the graphs tab, both CPU cores are pegged at 100%! I notice that the slowness seems similar to the extremely bad performance I got prior to installing the NVIDIA drivers; I'm not sure if that helps.

    This is the strangest part to me: although the temperature seems OK, even after rebooting the system remains slow, starting with GRUB2 (which is very noticeably delayed) all the way through to either Ubuntu or Windows! That's right, even the Windows side suffers and takes several minutes to finish booting, whereas normally (with my SSD) it's ready to use in 15 seconds. The only way to fix it is to shut down and let the parts cool down. Or maybe it just needs a complete power-off and boot rather than a soft reboot, and temperature has nothing to do with it - is that possible? But know that I have never had this problem in Windows: even when Windows gets very hot (135 F), a reboot is enough time for it to recover. For this reason, I don't think it's a heat thing, but I can't imagine what else could be surviving the reboot. I'm entirely updated - there are no pending updates - and I have the post-release NVIDIA updates too, btw. If this sounds CLOSE to something you know about but one of the details doesn't line up exactly, it might be a mistake in my perception. Are there tests you can suggest to rule something out? Thanks!
    processor       : 0
    vendor_id       : GenuineIntel
    cpu family      : 6
    model           : 23
    model name      : Intel(R) Core(TM)2 Duo CPU T9500 @ 2.60GHz
    stepping        : 6
    microcode       : 0x60c
    cpu MHz         : 800.000
    cache size      : 6144 KB
    physical id     : 0
    siblings        : 2
    core id         : 0
    cpu cores       : 2
    apicid          : 0
    initial apicid  : 0
    fpu             : yes
    fpu_exception   : yes
    cpuid level     : 10
    wp              : yes
    flags           : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx lm constant_tsc arch_perfmon pebs bts rep_good nopl aperfmperf pni dtes64 monitor ds_cpl vmx est tm2 ssse3 cx16 xtpr pdcm sse4_1 lahf_lm ida dts tpr_shadow vnmi flexpriority
    bogomips        : 5187.00
    clflush size    : 64
    cache_alignment : 64
    address sizes   : 36 bits physical, 48 bits virtual
    power management:

    processor       : 1 (identical to processor 0, except core id : 1, apicid : 1, initial apicid : 1, bogomips : 5186.94)

    (Redundant figures removed. You can view them in the edits if they are still relevant)

    ps:

    %CPU   PID  USER   COMMAND
     9.4  2399  jason  gnome-terminal
     6.2  2408  jason  bash
    17.3  1117  root   /usr/bin/X :0 -auth /var/run/lightdm/root/:0 -nolisten tcp vt7 -novtswitch -background none
    13.7  1667  jason  compiz
     1.3  1960  jason  /usr/lib/unity/unity-panel-service
     1.3  1697  jason  python /usr/bin/jupiter
     0.9  1964  jason  /usr/lib/indicator-appmenu/hud-service
     0.6  1689  jason  nautilus -n
     0.4  1458  jason  //bin/dbus-daemon --fork --print-pid 5 --print-address 7 --session

    I should highlight specifically that GRUB2 can also be very slow. I don't know in which scenarios GRUB2 is slow, but WHEN it is slow, it is slow both before the menu appears and after the selection is made - although with GRUB2 it is harder for me to tell what the normal speeds should be. With an SSD, I would expect GRUB2 to load instantly, and the GRUB2 purple to disappear instantly after the selection. The only delay to be expected is the change in graphics modes (though I couldn't guess why that ever requires any noticeable time).

    Read the article

  • jQuery Selector Tester and Cheat Sheet

    - by SGWellens
    I've always appreciated these tools: Expresso and XPath Builder. They make designing regular expressions and XPath selectors almost fun! Did I say fun? I meant less painful. Being able to paste/load text and then interactively play with the search criteria is infinitely better than the code/compile/run/test cycle. It's faster and you get a much better feel for how the expressions work. So, I decided to make my own interactive tool to test jQuery selectors: jQuery Selector Tester. Here's a sneak peek:

    Note: There are some existing tools you may like better:
    http://www.woods.iki.fi/interactive-jquery-tester.html
    http://www.w3schools.com/jquery/trysel.asp?filename=trysel_basic&jqsel=p.intro,%23choose

    My tool is different:
    - It is one page. You can save it and run it locally without a web server.
    - It shows the results as a list of iterated objects instead of highlighted HTML.
    - A cheat sheet is on the same page as the tester, which is handy.

    I couldn't upload an .htm or .html file to this site, so I hosted it on my personal site here: jQuery Selector Tester.

    Design Highlights: To make the interactive search work, I added a hidden div to the page:

        <!-- Hidden div holds DOM elements for jQuery to search -->
        <div id="HiddenDiv" style="display: none"></div>

    When ready to search, the searchable HTML text is copied into the hidden div... this renders the DOM tree in the hidden div:

        // get the html to search, insert it into the hidden div
        var Html = $("#TextAreaHTML").val();
        $("#HiddenDiv").html(Html);

    When doing a search, I modify the search pattern to look only in the HiddenDiv. To do that, I put a space between the patterns. The space is the Ancestor operator (see the Cheat Sheet):

        // modify the search string to only search in our
        // hidden div, and do the search
        var SearchString = "#HiddenDiv " + SearchPattern;
        try
        {
            var $FoundItems = $(SearchString);
        }

    Big Fat Stinking Faux Pas: I was about to publish this article when I made a big mistake: I tested the tool with Mozilla Firefox. It blowed up... it blowed up real good. In the past I've only had to target IE, so this was quite a revelation. When I started to learn JavaScript, I was disgusted to see all the browser-dependent code. Who wants to spend their time testing against different browsers and versions of browsers? Adding a bunch of 'if-else' code is a tedious and thankless task. I avoided client code as much as I could. Then jQuery came along and all was good. It was browser independent and freed us from the tedium of worrying about version N of the Acme browser. Right? Wrong!

    I had used outerHTML to display the selected elements. The problem is Mozilla Firefox doesn't implement outerHTML. I replaced this:

        // encode the html markup
        var OuterHtml = $('<div/>').text(this.outerHTML).html();

    With this:

        // encode the html markup
        var Html = $('<div>').append(this).html();
        var OuterHtml = $('<div/>').text(Html).html();

    Another problem was that Mozilla Firefox doesn't implement srcElement. I replaced this:

        var Row = e.srcElement.parentNode;

    With this:

        var Row = e.target.parentNode;

    Another problem was the indexing. The browsers have different ways of indexing. I replaced this:

        // this cell has the search pattern
        var Cell = Row.childNodes[1];

        // put the pattern in the search box and search
        $("#TextSearchPattern").val(Cell.innerText);

    With this:

        // get the correct cell and the text in the cell,
        // place the text in the search box and search
        var Cell = $(Row).find("TD:nth-child(2)");
        var CellText = Cell.text();
        $("#TextSearchPattern").val(CellText);

    So much for the myth of browser independence. Was I overly optimistic and gullible? I don't think so. And when I get my millions from the deposed Nigerian prince I sent money to, you'll see that having faith is not futile.

    Notes:
    - My goal was to have a single standalone file. I tried to keep the features and CSS to a minimum, adding only enough to make it useful and visually pleasing.
    - When testing, I often thought there was a problem with the jQuery selector. Invariably it was invalid HTML code. If your results aren't what you expect, don't assume it's the jQuery selector pattern: the HTML may be invalid.
    - To help in development and testing, I added a double-click handler to the rows in the Cheat Sheet table. If you double-click a row, the search pattern is put in the search box, a search is performed, and the page is scrolled so you can see the results. I left the test HTML and code in the page.
    - If you are using a CDN (non-local) version of the jQuery library, the designer in Visual Studio becomes extremely slow. That's why there are two versions of the library in the header, with one commented out.

    For reference, here is the jQuery documentation on selectors: http://api.jquery.com/category/selectors/
    Here is a much more comprehensive list of CSS selectors (which jQuery uses): http://www.w3.org/TR/CSS2/selector.html

    I hope someone finds this useful.

    Steve Wellens
    CodeProject

    Read the article

  • Local Events | Azure Bootcamp

    - by Jeff Julian
    Coming to Kansas City April 8th and 9th is the Microsoft Azure Bootcamp. This event looks very promising for those developers who are looking into Azure for themselves or their companies. It covers the wide range of topics required to understand what Azure really is and is not. Space is limited, so if you are considering Azure, register for this event today.

    Agenda:
    Module 1: Introduction to cloud computing and Azure - How it works; Key scenarios; The development environment and SDK
    Module 2: Using Web Roles - Basic ASP.NET; Basic configuration
    Module 3: Blobs - File storage in the cloud
    Module 4: Tables - Scalable hierarchical storage
    Module 5: Queues - Decoupling your systems
    Module 6: Basic Worker Roles - Executing backend processes; Consuming a queue; Leveraging local storage
    Module 7: Advanced Worker Roles - External endpoints; Inter-role communication
    Module 8: Building a business with Azure - Using Azure as an ISV or a partner; Advantages to delivering value; BPOS; Pricing
    Module 9: SQL Azure - Setting it up; SQL Azure firewall; Remote management; Migrating data
    Module 10: AppFabric - Service Bus; Access Control System; Identity in the cloud
    Module 11: Cloud Scenarios - App migration strategies; Disposable computing; Dynamic scale; Shunting; Prototyping; Multitenant applications

    (This is my second attempt at this post after MacJournal decided to crash and not save my work. Authoring tools all need auto-save features by now; that is a requirement set in stone by Microsoft Word 97.)

    Related Tags: Azure, Microsoft, Kansas City

    Read the article

  • https & ajax crawling

    - by Christoph Gassauer
    We made a transition to SSL on our webpage https://www.1point618.com, and we now load nearly all of the content via AJAX. As a result, the URLs of all existing pages have changed. We used 301 redirects as recommended, and we have also implemented Google's specification so that the webpage is still crawlable. We thought it might take a month or so to regain the same ranking in Google's search results, but the results are still much worse than before these changes. Most of the content (artist profiles) isn't indexed anymore. For example, of the roughly 450 URLs in the submitted sitemap, only 3 are indexed; before, almost all URLs were indexed. My question is: does Google's AJAX crawling work together with SSL? (Judging by the access log file, it looks like it should.)
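
    For context, the (since deprecated) AJAX crawling scheme referred to here works by having the crawler re-request an opted-in page with an _escaped_fragment_ query parameter, to which the server should return a pre-rendered HTML snapshot; the scheme itself is protocol-agnostic, so HTTPS should not change it. Below is a hypothetical server-side sketch (Flask chosen only for illustration; route and helper names are made up):

    ```python
    # Serve an HTML snapshot to crawlers that request ?_escaped_fragment_=...,
    # and the normal AJAX shell to everyone else. Illustrative only.
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/artist/<slug>")
    def artist_profile(slug):
        if "_escaped_fragment_" in request.args:
            return render_snapshot(slug)          # crawler: pre-rendered content
        return ("<html><body><div id='app'></div>"
                "<script src='/static/app.js'></script></body></html>")

    def render_snapshot(slug):
        # In practice this might come from a headless browser or a server-side
        # template; a static string keeps the sketch self-contained.
        return "<html><body><h1>Artist: %s</h1><p>Full profile markup.</p></body></html>" % slug

    if __name__ == "__main__":
        app.run()
    ```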

    Read the article

  • How does one qualify as a Web UI Developer?

    - by Duralumin
    I have about 20 years of experience with programming, most of it on the job, and right now I define myself as a Web Developer, because I think about half my expertise lies in the all-too-broad "web" field, both server side and client side, and because in recent years I have mostly been doing web development. I know my JavaScript, jQuery, jQuery UI, HTML 4-5, CSS 2-3, and some frameworks like Backbone.js and AngularJS. Since university I've always been interested in man-machine interaction, UI and UX. Recently I saw the label "Web UI Developer" tossed around, and I thought that would be something I would like to qualify for - and I'd really like to qualify with confidence. I didn't find any certification or similar, and I don't think one exists. Is the only way to qualify as a Web UI Developer having a job as one? What skills do I need to have, and what resources can I use to acquire them?

    Read the article

  • Nebula Filled Skies Above a City Wallpaper

    - by Asian Angel
    Note: To view and download other color variations of this wallpaper, visit welshdragon's gallery. Nebula Skies 5 [deviantART]

    Read the article

  • Simplified INotifyPropertyChanged Implementation with WeakReference Support and Typed Property Access

    - by Daniel Cazzulino
    I've grown a bit tired of implementing INotifyPropertyChanged. I've tried ways to improve it before (like this "ViewModel" custom tool, which even generates strongly typed event accessors), but my fellow Clarius teammate Mariano thought it was overkill and didn't like that tool much. He also mentioned an alternative approach, which I didn't like too much because it relies on the consumer changing their typical interaction with the object's events, but also because it has a substantial design flaw that causes handlers not to be called at all after a garbage collection happens. A very simple unit test will showcase this bug....Read full article
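
    The garbage-collection pitfall alluded to here is easy to reproduce outside .NET as well. The following is a hypothetical Python analogy (weakref standing in for WeakReference, with made-up class names): if the event source holds only a weak reference to the handler and nothing else keeps it alive, the handler is collected and the event silently stops firing.

    ```python
    # Weak-reference-only event subscriptions: the handler can be garbage
    # collected, after which the event appears to fire into the void.
    import gc
    import weakref

    class Publisher:
        def __init__(self):
            self._handlers = []

        def subscribe(self, handler):
            self._handlers.append(weakref.ref(handler))   # weak, not strong

        def fire(self):
            for ref in self._handlers:
                handler = ref()            # None once the handler is collected
                if handler is not None:
                    handler()

    def make_handler():
        def on_changed():
            print("property changed")
        return on_changed

    pub = Publisher()
    pub.subscribe(make_handler())   # nothing else references the handler
    gc.collect()
    pub.fire()                      # prints nothing: the handler was collected
    ```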

    Read the article

  • Analysing and measuring the performance of a .NET application (survey results)

    - by Laila
    Back in December last year, I asked myself: could it be that .NET developers think you need three days and a PhD to do performance profiling on their code? What if developers are shunning profilers because they perceive them as too complex to use? If so, then what method do they use to measure and analyse the performance of their .NET applications? Do they even care about performance? So, a few weeks ago, I decided to get a 1-minute survey up and running in the hope that some good, hard data would clear the matter up once and for all. I posted the survey on Simple Talk and got help from a few people to promote it. The survey consisted of 3 simple questions. Amazingly, 533 developers took the time to respond, which means I had enough data to get representative results! So before I go any further, I would like to thank all of you who contributed, because I now have some pretty good answers to the troubling questions I was asking myself. To thank you properly, I thought I would share some of the results with you.

    First of all, application performance is indeed important to most of you. In fact, performance is an intrinsic part of the development cycle for a good 40% of you, which is much higher than I had anticipated, I have to admit. (I know, "Have a little faith, Laila!") When asked what tool you use to measure and analyse application performance, I found that nearly half of the respondents use logging statements, a third use performance counters, and 70% of respondents use a profiler of some sort (a third-party performance profiler, the CLR profiler or the Visual Studio profiler). The importance attributed to logging statements did surprise me a little. I am still not sure why somebody would go to the trouble of manually instrumenting code in order to measure its performance, instead of just using a profiler. I personally find the process of annotating code, calculating times from log files, and relating it all back to your source terrifyingly laborious. Not to mention that you then need to remember to turn it all off later! Even when you have logging in place throughout all your code anyway, you still have a fair amount of potentially error-prone calculation to sift through the results; in addition, you'll only get method-level rather than line-level timings, and you won't get timings from any framework or library methods you don't have source for. To top it all, we all know that bottlenecks are rarely where you would expect them to be, so you could be wasting time looking for a performance problem in the wrong place. On the other hand, profilers do all the work for you: they automatically collect the CPU and wall-clock timings, and present the results from method timing all the way down to individual lines of code. Maybe I'm missing a trick. I would love to know about the types of scenarios where you actively prefer to use logging statements.

    Finally, while a third of the respondents didn't have a strong opinion about code performance profilers, those who had an opinion thought that they were mainly complex to use and time-consuming. Three respondents in particular summarised this perfectly: "sometimes, they are rather complex to use, adding an additional time-sink to the process of trying to resolve the existing problem"; "they are simple to use, but the results are hard to understand"; "Complex to find the more advanced things, easy to find some low hanging fruit".

    These results confirmed my suspicions: profilers are seen as designed for more advanced users who can use them effectively and make sense of the results. I found yet more interesting information when I started comparing the sample of "developers for whom performance is an important part of the dev cycle" with those "to whom performance is only looked at in times of crisis" and "developers to whom performance is not important, as long as the app works". See the three graphs below:
    - Sample of developers to whom performance is an important part of the dev cycle: [graph]
    - Sample of developers to whom performance is important only in times of crisis: [graph]
    - Sample of developers to whom performance is not important, as long as the app works: [graph]

    As you can see, there is a strong correlation between the usage of a profiler and the importance attributed to performance: indeed, the more important performance is to a development team, the more likely they are to use a profiler. In addition, developers to whom performance is an important part of the dev cycle have a higher tendency to use a much wider range of methods for performance measurement and analysis. And, unsurprisingly, the less important performance is, the less varied the methods of measurement are. So, all in all, to come back to my random questions: .NET developers do care about performance. Those who care the most use a wider range of performance measurement methods than those who care less. But overall, logging statements, performance counters and third-party performance profilers are the performance measurement methods of choice for most developers. Finally, although most of you find code profilers complex to use, those of you who care the most about performance tend to use profilers more than those of you to whom performance is not so important.
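
    As a side note for readers outside .NET, the "logging statements" approach the respondents describe amounts to hand-rolled timing like the sketch below (a hypothetical Python illustration, not from the article): it yields method-level numbers only and has to be maintained and switched off by hand, which is exactly the laboriousness the author points at.

    ```python
    # Hand-rolled timing via logging: the manual alternative to a profiler.
    # Purely illustrative; function names here are made up for the example.
    import functools, logging, time

    logging.basicConfig(level=logging.INFO, format="%(message)s")

    def timed(func):
        """Log how long each call to `func` takes, at method-level granularity."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                elapsed_ms = (time.perf_counter() - start) * 1000
                logging.info("%s took %.2f ms", func.__name__, elapsed_ms)
        return wrapper

    @timed
    def load_report(rows=100_000):
        return sum(i * i for i in range(rows))

    load_report()   # logs something like: load_report took 12.34 ms
    ```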

    Read the article

  • Some AdSense domains' ads are causing document.write() statements that remove the HTML from the page

    - by er1234
    All that is output on the page is the domain name of the advertiser, for example 'www.solar-aid.org'. The rest of the content is stripped, I believe because of a document.write() statement. I'd like to know if this is a common issue or something wrong with our setup. There are three domains causing the issue, which we've blocked from Adsense as a result. solar-aid.org kiva.org grameenfoundation.org Given the type of organizations I think they may be within the default group of 'public service ads' within the Backup Ads setting. If the issue doesn't completely resolve itself soon (one customer of ours complained today, even though I blocked them 5+ days ago), I'll disable public service ads and select the 'fill space with a solid color' option.

    Read the article

  • Virtual Brown Bag Recap: JB's New Gem, Patterns 101, Killing VS, CodeMav

    - by Brian Schroer
    At this week's Virtual Brown Bag meeting: JB showed off his new SpeakerRate Ruby gem Claudio alerted us to the Refactoring Manifesto We answered the question "How do I get started with Design Patterns?" Ever had to kill a frozen instance of Visual Studio? Yeah, I thought so. Claudio showed us how to do it with PowerShell. (It's faster) JB previewed his new CodeMav web site, which will be a social network for developers (integration with Speaker Rate, slide share, github, StackOverflow, etc.) For detailed notes, links, and the video recording, go to the VBB wiki page: https://sites.google.com/site/vbbwiki/main_page/2011-01-06

    Read the article

  • Visualize flowchart diagram with multiple end symbols

    - by platzhirsch
    I am looking for a standardized way to visualize the following hierarchical logic: the state of the thread is represented by the answers to a hierarchical set of questions. You can read this listing like a flowchart: you iterate over the questions, decide, go one step deeper, and so on. Therefore I thought the best way to visualize it would be a flowchart. The problem is that in this hierarchical set it is possible to end in more than one state, and that is totally valid. I have never seen a flowchart that ends in more than one state. Is it still possible, and am I just missing the right symbol to represent this logic, or are flowcharts simply not a good fit? What other graphical representation could I use - is there something fitting in UML? A non-deterministic state machine does not seem intuitive enough, and transforming it into a deterministic state machine would result in too many states, and so on.

    Read the article

  • 'Development dashboard' web application

    - by espais
    Hi all, I am not sure if something like this exists ready out of the box. I currently have some web space that I use for various projects, and I would like to set up an area for some friends and me to develop web applications together. My ideal setup would be to create a folder, say webdev.domain.com. We could all go to this domain, log in, and then be able to set up new applications, pick which language will be used, set up database tables, allow HTML-based file uploading, and create sub-folders, to basically have a test bed for the applications. In retrospect, it seems like I'm describing a limited version of cPanel. I could come up with something in Drupal, I'm sure, but I don't want to spend much time configuring anything. Like I said, I want to install it and have minimal configuration. Does something like this exist (preferably open source)?

    Read the article

  • Get More From Your Kindle: Tips, Tricks, Hacks, and Free Books

    - by Jason Fitzpatrick
    If you have an ebook reader, chances are it's a Kindle. Today we're taking a look at ways you can get more from your Kindle using built-in tools, experimental features, and third-party software. Read on to supercharge your Kindle experience. You might have bought your Kindle, used it to buy some titles from the Kindle store, and thought that's all there was to Kindle ownership. Millions of Kindle owners are perfectly happy with that arrangement, but you can squeeze much more life and enjoyment out of your Kindle by digging into the device, employing third-party hacks and software bundles, and more.

    Read the article

  • Slide Creation Checklist

    - by Daniel Moth
    PowerPoint is a great tool for conference (large audience) presentations, which is the context for the advice below. The #1 thing to keep in mind when you create slides (at least for conference sessions) is that they are there to help you remember what you were going to say (the flow and key messages) and to give the audience a visual reminder of the key points. Slides are not there for the audience to read what you are going to say anyway. If they were, what would be the point of you being there? Slides are not holders for complete sentences (unless you are quoting) - use Microsoft Word for that purpose, either as a physical handout or as a URL link that you share with the audience. When you dry run your presentation, if you find yourself reading the bullets on your slide, you have missed the point. You have a message to deliver that can be done regardless of your slides - remember that. The focus of your audience should be on you, not the screen.

    Based on that premise, I have created a checklist that I go over before I start a new deck and also once I think my slides are ready.
    - Turn AutoFit OFF. I cannot stress this enough.
    - For each slide, explicitly pick a slide layout. In my presentations, I only use one Title Slide, a Section Header per demo slide, and for the rest of my slides one of three: Title and Content, Title Only, Blank. Most people that are newbies to PowerPoint get whatever default layout the New Slide command creates for them and then start deleting and adding placeholders to that. You can do better than that (and you'll be glad you did if you also follow the 'Reset' item below).
    - Every slide must have an image.
    - Remove all punctuation (e.g. periods, commas) other than exclamation points and question marks (! ?).
    - Don't use color or other formatting (e.g. italics, bold) for text on the slide.
    - Check your animations. Avoid animations that hide elements that were on the slide (instead use a new slide and a transition). Ensure that animations that bring new elements in bring them into white space instead of over other existing elements. A good test is to print the slide and see that it still makes sense even without the animation.
    - Print the deck in black and white choosing the "6 slides per page" option. Can you still read each slide without losing any information? If the answer is "no", go back and fix the slides so the answer becomes "yes".
    - Don't have more than 3 bullet levels/indents. In other words: you type some text on the slide, hit 'Enter', hit 'Tab', type some more text, and repeat that sequence at most one final time. Ideally your outer bullets have only one level of sub-bullets (i.e. one level of indentation beneath them).
    - Don't have more than 3-5 outer bullets per slide. Space them out evenly, e.g. with blank lines in between.
    - Don't wrap. For each bullet on all slides check: does the text for that bullet wrap to a second line? If it does, change the wording so it doesn't. Or create a terser bullet and make the original long text a sub-bullet of that one (thus decreasing the font size, but still being consistent) and have no wrapping.
    - Use the same consistent fonts (i.e. font face, font size, etc.) throughout the deck for each level of bullet. In other words, don't deviate from the PowerPoint template you chose (or that was chosen for you).
    - Go on each slide and hit 'Reset'. 'Reset' is a button on the 'Home' tab of the ribbon, or you can find the 'Reset Slide' menu when you right-click a slide in the left 'Slides' list. If your slides can survive doing that without you "fixing" things after the Reset action, you are golden!
    - For each slide ask yourself: if I had to replace this slide with a single sentence that conveys the key message, what would that sentence be? This exercise leads you to merge slides (where the key message is split) or split a slide into many, if there were too many key messages on the slide in the first place. It can also lead you to redesign a slide so the text on it really is just explanation or evidence for the key message you are trying to convey.
    - Get the length right. Is the length of this deck suitable for the time you have been given to present? If not, cut content! It is far better to deliver less in a relaxed, polished, engaging, memorable way than to deliver more content in great haste. As a rule of thumb, multiply 2 minutes by the number of slides you have, add the time you need for each demo, and check whether that adds up to more than the time you have been allotted. If it does, start cutting content - we've all been there and it has to be done.

    As always, rules and guidelines are there to be bent and even broken some times. Start with the above and on a slide-by-slide basis decide which rules you want to bend. That is smarter than throwing all the rules out from the start, right? Comments about this post welcome at the original blog.

    Read the article

  • Billboard shader without distortion

    - by Nick Wiggill
    I use the standard approach to billboarding within Unity, which is OK but not ideal: transform.LookAt(camera). The problem is that this introduces distortion toward the edges of the viewport, especially as the field-of-view angle grows larger. This is unlike the perfect billboarding you'd see in, e.g., Doom, where an enemy looks the same from any angle and irrespective of where it is located in screen space. Obviously there are ways to blit an image directly to the viewport, centred around a single vertex, but I'm not hot on shaders. Does anyone have any samples of this approach (GLSL if possible), or any suggestions as to why it isn't typically done this way (vs. the aforementioned quad transformation method)? EDIT: I was confused - thanks, Nathan, for the heads-up. Of course, making the quads look at the camera does not make them parallel to the view plane, which is what I need.
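
    For reference, the usual fix is to build the quad from the camera's right and up vectors (screen-aligned billboarding) rather than rotating it to face the camera position. Below is a small, language-agnostic sketch of the vertex math (Python with numpy, illustrative only; in a vertex shader the same right/up vectors can be taken from the view matrix):

    ```python
    import numpy as np

    def billboard_corners(center, cam_right, cam_up, width, height):
        """World-space corners of a quad kept parallel to the view plane."""
        r = cam_right * (width * 0.5)
        u = cam_up * (height * 0.5)
        return [center - r - u,   # bottom-left
                center + r - u,   # bottom-right
                center + r + u,   # top-right
                center - r + u]   # top-left

    # cam_right / cam_up are the camera's world-space basis vectors (rows or
    # columns of the view rotation, depending on convention).
    corners = billboard_corners(np.array([0.0, 1.0, 5.0]),   # sprite centre
                                np.array([1.0, 0.0, 0.0]),   # camera right
                                np.array([0.0, 1.0, 0.0]),   # camera up
                                2.0, 2.0)
    ```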

    Read the article

  • Bring the Whole Ubuntu Gang Home to Your Desktop with this Mascots Wallpaper

    - by Asian Angel
    This wonderful wallpaper features all of the Ubuntu Mascots together as stuffed animals and will make a perfect addition to your Ubuntu desktop. Ubuntu Wallpaper [via Web Upd8]

    Read the article

  • Get your content off Blogger.com

    Due to blogger.com deprecating FTP users I've decided to move my blog. When I think of the content of a blog, 4 items come to mind: blog posts, comments, binary files that the blog posts linked to (e.g. images, ZIP files) and the CSS+structure of the blog. 1. Binaries The binary files you used in your blog posts are sitting on your own web space, so really blogger.com is not involved with that. Nothing for you to do at this stage, I'll come back to these in another post. 2. CSS and structure...

    Read the article

  • DNS add-on domain setup and redirect

    - by brian
    I have several domains which I'd like to point to another (I'll call it foo.com). A couple of things aren't entirely clear to me. First, the DNS. I'm using Kloxo/HyperVM. Do I need to create separate DNS entries for each domain? Or do I just create separate CNAME or other records under foo.com? I thought it was the latter, but when I click on "Add CNAME" I'm prompted to fill in the subdomain portion of foo.com. The nameservers have already been set to point to my VPS. For the redirect, would the following be appropriate within the vhost conf for foo.com?

        ServerName www.foo.com
        ServerAlias foo.com foo.net foo.org bar.com bar.net bar.org
        RewriteCond %{HTTP_HOST} ^foo.com [NC]
        RewriteCond %{HTTP_HOST} *foo.net [NC,OR]
        RewriteCond %{HTTP_HOST} *foo.org [NC,OR]
        RewriteCond %{HTTP_HOST} *bar.com [NC,OR]
        RewriteCond %{HTTP_HOST} *bar.net [NC,OR]
        RewriteCond %{HTTP_HOST} *bar.org [NC]
        RewriteRule ^(.*)$ http://www.foo.com/$1 [R=301,NC]

    (The first condition is just to force the "www" part)

    Read the article

  • Azure Boot Camp

    - by Brian Schroer
    Belated thanks to Perficient for sponsoring (and providing lunch, which was a nice unadvertised surprise) and to Avichal Jain and Brian Blanchard for presenting at the St. Louis Azure Boot Camp May 13-14. There was a little more upfront discussion of "What is Cloud Computing and Why is it important?" than I thought necessary (I would think that people signing up for a two-day Azure event would already be convinced that it's a worthwhile thing), but we put on our boots and fired up Visual Studio soon enough. The good news for developers, as with most of Microsoft's recent initiatives (e.g. Silverlight and Windows Phone 7 development), is that you can leverage the skills you already have. If you've developed service-oriented applications, you've got a big head start. If a free Azure Boot Camp event is coming to your area (here's the schedule), be sure to check it out. If not, you can download the slides and labs from their web site and "throw your own".

    Read the article
