Search Results

Search found 22625 results on 905 pages for 'must do better'.


  • What hinders Ubuntu from getting traction in the professional field? [closed]

    - by Prasad
    If this is not the place to ask this, please forgive this Ubuntu cub. I want to ask: what do people do with Ubuntu? As an Ask Ubuntu user I can see that most of the users (including myself) are asking questions about entertainment-related problems. Is that all? Is there no commercial use for it? Do people make fun of Ubuntu, or are they just pretending to be Ubuntu users while secretly using Windows? Please don't hate me or make fun of me; I know lots of people are trying to make Ubuntu even better, and I know it's better than Windows (if Adobe software just worked on Ubuntu, I wouldn't see the Windows logo on my monitor anymore). What hinders Ubuntu from getting traction in the professional field?

    Read the article

  • Alternatives to Project Euler for improving Excel ability

    - by Jonathan Deamer
    I've recently been enjoying using the mathematical problems listed at Project Euler to learn Python. My Excel ability is better than my Python, but I think I'd still benefit from the sort of inductive learning that comes with solving a series of increasingly difficult puzzles using a particular tool. I know Project Euler can be completed using Excel, but are there any other puzzle series similar to this or The Python Challenge specifically tailored for people trying to increase their knowledge of Excel and what it can do? NB. I'm not looking for a "tutorial", I know there are plenty of these. And apologies if this isn't completely appropriate for programmers.SE.com - some of the folks at SuperUser suggested it was a better fit here than there!

    Read the article

  • How To Initialize Object Which May Be Used In Catch Clause?

    - by Onorio Catenacci
    I've seen this sort of pattern in code before:

        // pseudo C# code
        ExceptionMessage exInfo = null;                               // Line A
        try
        {
            var p = SomeProperty;                                     // Line B
            exInfo = new ExceptionMessage("The property was " + p);   // Line C
        }
        catch (Exception ex)
        {
            exInfo.SomeOtherProperty = SomeOtherValue;                // Line D
        }

    Usually the code is structured in this fashion because exInfo has to be visible outside of the try clause. The problem is that if an exception occurs on Line B, then exInfo will be null at Line D. The issue arises when something happens on Line B that must occur before exInfo is constructed. But if I set exInfo to a new object at Line A, then memory may get leaked at Line C (due to "new"-ing the object there). Is there a better pattern for handling this sort of code? Is there a name for this sort of initialization pattern? By the way, I know I could check for exInfo == null before Line D, but that seems a bit clumsy and I'm looking for a better approach.
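
    One hedged sketch of an alternative (my own illustration, not from the question): since C# 8 the null-coalescing assignment operator ??= lets the catch clause fall back to a placeholder ExceptionMessage without a separate null check. The ExceptionMessage type, SomeProperty, and the values below are stand-ins for the hypothetical ones in the question.

        // Minimal, self-contained sketch (assumes C# 8+ for ??=); the types are
        // stand-ins for the question's hypothetical ExceptionMessage/SomeProperty.
        using System;

        class ExceptionMessage
        {
            public ExceptionMessage(string text) { Text = text; }
            public string Text { get; }
            public string SomeOtherProperty { get; set; }
        }

        class InitPatternSketch
        {
            // Simulates Line B failing before exInfo is ever constructed.
            static string SomeProperty => throw new InvalidOperationException("boom");

            static void Main()
            {
                ExceptionMessage exInfo = null;                                  // Line A
                try
                {
                    var p = SomeProperty;                                        // Line B (throws)
                    exInfo = new ExceptionMessage("The property was " + p);      // Line C (never reached)
                }
                catch (Exception ex)
                {
                    // Line D: create a fallback only if construction never happened.
                    exInfo ??= new ExceptionMessage("Property unavailable: " + ex.Message);
                    exInfo.SomeOtherProperty = "SomeOtherValue";
                }
                Console.WriteLine(exInfo.Text);
            }
        }

    This is only one option; in managed code the object overwritten at Line C simply becomes unreachable and is reclaimed by the garbage collector, so eager initialization at Line A is also a reasonable choice.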

    Read the article

  • Sharing VBO with multiple objects and fixed size buffer data

    - by Mark Ingram
    I'm just messing around with OpenGL and getting some basic structures in place, and my first attempt resulted in each SceneObject class (which just contains vertex information right now) having its own VBO inside it. However, I've read that it might be better to share VBOs across multiple objects. Also, I read that you should avoid resizing a VBO (repeated calls to glBufferData with different size parameters), and instead choose a fixed size for a VBO and just use a range of the buffer. I don't think changing the size of the buffer data would happen too often, but surely it would be better to only allocate the data you need? Choosing an arbitrary value seems risky. I'm looking for some advice on working with individual objects in a scene and their associated buffer data.

    Read the article

  • SQLIO Writes

    - by Grant Fritchey
    SQLIO is a fantastic utility for testing the abilities of the disks in your system. It has a very unfortunate name though, since it's not really a SQL Server testing utility at all. It really is a disk utility. They ought to call it DiskIO, because they'd get more people using it, I think. Anyway, branding is not the point of this blog post. Writes are the point of this blog post. SQLIO works by slamming your disk. It performs as many reads as it can, or it performs as many writes as it can, depending on how you've configured your tests. There are much smarter people than me who will get into all the various types of tests you should run. I'd suggest reading a bit of what Jonathan Kehayias (blog|twitter) has to say or wading into Denny Cherry's (blog|twitter) work. They're going to do a better job than I can describing all the benefits and mechanisms around using this excellent piece of software.
    My concerns are very focused. I needed to set up a series of tests to see how well our product SQL Storage Compress worked. I wanted to know the effects it would have on a system: the disk for sure, but also memory and CPU. How to stress the system? SQLIO, of course. But when I set it up and ran it, following the documentation that comes with it, I was seeing better than 99% compression on the files. Don't get me wrong. Our product is magnificent, wonderful, all things great and beautiful, gets you coffee in the morning and is made mostly from bacon. But 99% compression? No, it's not that good.
    So what's up? Well, it's the configuration. The default mechanism is to load up a file, something large that will overwhelm your disk cache. You're instructed to load the file with the character 0x0. I never got a computer science degree. I went to film school. Because of this, I didn't memorize ASCII tables, so when I saw this, I thought it was zeros or something. Nope. It's NULL. That's right, you're making a very large file, but you're filling it with NULL values. That's actually OK when all you're testing is the disk sub-system. But when you want to test compression and decompression, that can be an issue. I got around this fairly quickly. Instead of generating a file filled with NULL values, I just copied a database file for my tests. And to test it with SQL Storage Compress, I used a database file that had already been run through compression (about 40% compression on that file, if you're interested). Now the reads were taken care of. I am seeing very realistic performance from decompressing the information for reads through SQLIO.
    But what about writes? Well, the issue is, what does SQLIO write? I don't have access to the code. But I do have access to the results. I did two different tests, just to be sure of what I was seeing. First test: use the .DAT file as described in the documentation. I opened the .DAT file after I was done with SQLIO, using WordPad. Guess what? It's a giant file full of air. SQLIO writes NULL values. What does that do to compression? I did the test again on a copy of an uncompressed database file. Then I ran the original and the SQLIO-modified copy through ZIP to see what happened. I got better than 99% compression out of the SQLIO-modified file (the original file of 624,896 KB went to 275,871 KB compressed; after SQLIO it went to 608 KB compressed). So, what does SQLIO write? It writes air. If you're trying to test it with compression or maybe some other type of file storage mechanism like dedupe, you need to know this, because your tests really won't be valid.
Should I find some other mechanism for testing? Yeah, if all I'm interested in is establishing performance to my own satisfaction, yes. But, I want to be able to compare my results with other people's results and we all need to be using the same tool in order for that to happen. SQLIO is the common mechanism that most people I know use to establish disk performance behavior. It'd be better if we could get SQLIO to do writes in some other fashion. Oh, and before I go, I get to brag a bit. Measuring IOPS, SQL Storage Compress outperforms my disk alone by about 30%.
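
    As a quick way to reproduce the "file full of air" check without WordPad, a small sketch like the one below (my own illustration, not part of the original post) counts how much of a test file consists of 0x0 bytes; the file path is just a placeholder.

        // Sketch: report what fraction of a SQLIO test file consists of NULL (0x00) bytes.
        // The path is a placeholder; point it at your own .DAT or copied database file.
        using System;
        using System.IO;

        class NullByteCheck
        {
            static void Main()
            {
                const string path = @"C:\temp\sqlio-testfile.dat";   // placeholder path
                long zeroBytes = 0, totalBytes = 0;
                var buffer = new byte[1 << 20];                       // read in 1 MB chunks

                using (var stream = File.OpenRead(path))
                {
                    int read;
                    while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        totalBytes += read;
                        for (int i = 0; i < read; i++)
                            if (buffer[i] == 0) zeroBytes++;
                    }
                }

                Console.WriteLine($"{zeroBytes:N0} of {totalBytes:N0} bytes are NULL " +
                                  $"({100.0 * zeroBytes / Math.Max(1, totalBytes):F1}%)");
            }
        }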

    Read the article

  • What criteria would I use to choose between SQL Stream Insight and TPL Dataflow? [closed]

    - by makerofthings7
    There is an add-in to the Task Parallel Library (TPL) called TPL Dataflow that supports a variety of data-processing scenarios. It seems there are some parallels to the SQL Stream Insight product; however, since SQL's Stream Insight has some interesting licensing around it, and its performance depends on which license I get, I found myself asking whether I should use TPL Dataflow instead, avoid any licensing issues, and possibly get better performance. Can anyone tell me if performance is a valid criterion for comparing SQL Stream Insight vs TPL Dataflow? What other criteria should I be looking at when comparing the two?
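
    For readers unfamiliar with the library, here is a minimal TPL Dataflow pipeline (a sketch of my own, not tied to any Stream Insight scenario); it assumes the System.Threading.Tasks.Dataflow package is referenced.

        // Minimal TPL Dataflow sketch: a two-stage parse -> print pipeline.
        // Requires the System.Threading.Tasks.Dataflow NuGet package.
        using System;
        using System.Threading.Tasks;
        using System.Threading.Tasks.Dataflow;

        class DataflowSketch
        {
            static async Task Main()
            {
                var parse = new TransformBlock<string, int>(s => int.Parse(s));
                var print = new ActionBlock<int>(n => Console.WriteLine($"Got {n}"));

                // Link the stages and let completion propagate down the pipeline.
                parse.LinkTo(print, new DataflowLinkOptions { PropagateCompletion = true });

                foreach (var item in new[] { "1", "2", "3" })
                    parse.Post(item);

                parse.Complete();
                await print.Completion;   // wait until everything has flowed through
            }
        }

    Licensing aside, this kind of in-process pipeline and Stream Insight's standing-query model solve overlapping but not identical problems, which is part of what makes the comparison tricky.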

    Read the article

  • Java Developers: Open-source Modules, Great Tools, Opportunity.

    - by Paul Sorensen
    The role of Java developer may just be better than ever. An excellent article in Java Magazine discusses the availability of web-based tools that help development teams more effectively manage their projects and modules. If you are a Java developer, you should definitely read this article. I especially like the Expert Opinions scattered throughout the article. These highlight real-world usage of the latest and greatest development tools. As you consider steps to move your career forward, consider Java certification. Oracle has over 15 unique Java certification credentials available. The process of becoming certified in Java and preparing for your exams will require you to study, learn and practice (code). All of this activity will help you sharpen your skills and increase your working knowledge of Java, making you a better developer and a more valuable member of your team. You can use the Certification Finder on the Oracle certification homepage to find a Java certification that is right for you. Thanks!

    Read the article

  • Trim on encrypted SSD--Urandom first?

    - by cb474
    My understanding (I'm not sure I'm getting this all right) is that if one uses Trim on an encrypted SSD, it defeats some of the security benefits, because the drive will write zeros to empty space (as files are deleted). See: http://www.askubuntu.com/questions/115823/trim-on-an-encrypted-ssd And: http://asalor.blogspot.com/2011/08/trim-dm-crypt-problems.html My question is: From the perspective of the performance of the SSD and the functioning of Trim, would it therefore be better to simply zero out the SSD, before setting up an encrypted system, rather than writing random data to the drive, with urandom, as one usually does? Would this basically leave one with the same level of security anyway? And more importantly, would it better enable the Trim functionality to work as intended, with the encrypted SSD?

    Read the article

  • Create associations between information

    - by Andrea Girardi
    I deployed a project some days ago that extracts medical articles using the results of a questionnaire completed by a user. For instance, if I reply on the questionnaire that I'm affected by Diabetes type 2 and that I'm a smoker, my algorithm extracts all articles related to diabetes, bubbling up the articles that contain information about Diabetes type 2 and smoking. Basically, we created a list of topics and, for every topic, we defined a kind of "guideline" that allows us to extract and order information for a user. I'm quite sure there are better ways to relate two pieces of content, but I was not able to find them online. Could you suggest a model, algorithm or paper to better understand this kind of problem and to help me find a faster and more accurate way to extract information for a user?
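
    To make the "bubbling up" concrete, a tag-overlap score is one simple way to model it; the sketch below is my own illustration (C# 9+, with made-up article titles and tags), not the questioner's actual algorithm.

        // Sketch of tag-overlap ranking: articles that share more tags with the
        // user's questionnaire answers float to the top. All names are made up.
        using System;
        using System.Collections.Generic;
        using System.Linq;

        record Article(string Title, HashSet<string> Tags);

        class TagRanking
        {
            static void Main()
            {
                var userTags = new HashSet<string> { "diabetes-type-2", "smoking" };

                var articles = new List<Article>
                {
                    new("Managing type 2 diabetes", new HashSet<string> { "diabetes-type-2" }),
                    new("Smoking and blood sugar",  new HashSet<string> { "diabetes-type-2", "smoking" }),
                    new("General nutrition tips",   new HashSet<string> { "nutrition" }),
                };

                // Score = number of shared tags; keep only articles with at least one match.
                var ranked = articles
                    .Select(a => (Article: a, Score: a.Tags.Intersect(userTags).Count()))
                    .Where(x => x.Score > 0)
                    .OrderByDescending(x => x.Score);

                foreach (var (article, score) in ranked)
                    Console.WriteLine($"{score}: {article.Title}");
            }
        }

    More principled directions the question could look into include TF-IDF or topic-model similarity, which generalize the same idea beyond exact tag matches.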

    Read the article

  • Database for survey

    - by zfm
    One of my jobs now is to design a database for a survey. Let's say we have a series of questions (web-based), in which one page contains one question. Not every person will be given the same questions; those are based on their previous answers and also on randomness. I would like to know whether it is better to have a database like this:

        user    question    answer
        userX   question1   answer1A
        userX   question2   answer2C
        userX   question5   answer5F
        userY   question1   answer1B
        userY   question3   answer3B
        userY   question6   answer6D
        ...

    or

        user    q1    q2    q3    q4    q5    q6
        userX   1A    2C    null  null  5F    null
        userY   1B    null  3B    null  null  6D
        ...

    My idea here is that the second approach seems better; however, I would like to know whether updating the table is (much) slower than inserting a new row. Also, with the first approach I can omit having some null answers. The total number of questions is fixed; the client won't add any more questions later on. So my question is, what would you do if you were me?

    Read the article

  • JavaFX Makeover for JFugue Music NotePad

    - by Geertjan
    Bengt-Erik Fröberg from Sweden, one of the developers working on ProSang, the leading Scandinavian blood bank system (and based on the NetBeans Platform), is reworking the user interface of the JFugue Music NotePad. In particular, the Score window (named ScoreFX window below) contains components that are now quite clearly JavaFX, instead of Swing. Looks a lot better and also performs better. The sliders in the Keyboard window are candidates for being similarly redone to use JavaFX instead of Swing. Want to do something similar? Here's all the info you need: http://platform.netbeans.org/tutorials/nbm-javafx.html

    Read the article

  • Wheel Joint Implementation in AndEngine

    - by Siddharth
    I am currently developing a car game in AndEngine, in which I was using a revolute joint for the car wheel and chassis attachment. But my friend suggested that I use a wheel joint for that purpose, for better behavior of the car. In AndEngine I didn't find a wheel joint implementation. So what do I have to do to implement a wheel joint? I think I have to manually update the Box2D library for this purpose, but I don't know how many things get updated. Please give me some guidance on achieving better car behavior in AndEngine.

    Read the article

  • How many lines of code can a C# developer produce per month?

    - by lox
    An executive at my workplace asked me and my group of developers the question: how many lines of code can a C# developer produce per month? An old system was to be ported to C#, and he would like this measure as part of the project planning. From some (apparently credible) source he had the answer of "10 SLOC/month", but he was not happy with that. The group agreed that this was nearly impossible to specify, because it would depend on a long list of circumstances. But we could tell that the man would not leave (or would be very disappointed in us) if we did not come up with an answer that suited him better. So he left with the many-times-better answer of "10 SLOC/day". Can this question be answered? (Offhand, or even with some analysis?)

    Read the article

  • Getting Links from High PR Forums to Promote Websites

    - by Akito
    I have started [link removed] regarding Apple and its products. It's been about 3 months and the blog is running fine. It's PR 2 for now. I need some backlinks from high-PR websites so that its SERP position improves. I tried an SEO service but it wasn't good, so now I am thinking of contacting people on high-PR forums and asking them to help me by putting my website in their signatures. I have the following forums in mind:

        SitePoint Forums
        DigitalPoint Forums
        Adobe Forums
        Apple Forums

    Now, as my website is in the Apple niche, would it be better to prefer the Apple Forums over the other forums?

    Read the article

  • Few New Features Added to Geekswithblogs.net

    - by Staff of Geeks
    After reviewing some of the feedback from our bloggers, we added a couple of new features to Geekswithblogs.net, and there are still more to come. Here is a list of the features we added:

        - Fixed the Twitter parser to better support URLs and hash tags
        - Added some hooks behind the scenes to tag posts with common keywords automatically
        - Added Facebook Likes and Tweets to the bottom of every post
        - Cleaned up a few skins
        - Images on the main page for bloggers who use Gravatar or Twitter integration
        - Random bug fixes based on the log

    We are definitely working to make Geekswithblogs.net faster and better. If you have any suggestions, please feel free to share them with the team. On a side note, if that suggestion is "move to WordPress", I will reply with "stop writing ASP.NET for your day job and move to PHP". That request is the equivalent in my eyes. If we have enough bloggers leave the Microsoft .NET Platform for their main source of income, we might consider it. Technorati Tags: Geekswithblogs.net, Features, Version 4.0

    Read the article

  • Services - Separate Sites or One Site - Impact on SEO

    - by Lynda
    I have a client who is a lawyer specializing in Criminal Defense and DUI; however, he does not show up well in Google. In researching this, I found that the sites that rank better have much more content for those specialties than his site does, and my thought is that he needs to add more quality content to rank better for those searches. On his site he mentions his specialties, but he also has various personal things on his site that reflect his interests. These are clearly separated from the business portion. My questions are: 1) should he separate his personal information into a new domain, and 2) should he have a separate URL for each of his specialties? Or would one URL work as long as everything is clearly separated? I read once that for legal services to rank well you should make a separate site for each specialty and have that site focus solely on that service.

    Read the article

  • Update drivers for TL-WN851ND

    - by Tony_GPR
    Today I bought a new PCI wireless card, a TP-Link WN851ND with an Atheros AR9227 chipset. It has 2 antennas and is compatible with Wifi N, so I thought it would improve the quality of the signal. But after installing it on my computer, the result is the opposite of what I expected. It doesn't connect to my network, while my old Wifi B/G card connects without problems. I created an access point from my smartphone to try the card, and it works, but it is very slow loading pages. In Windows 7 it works perfectly, so I think the problem is the driver. I have Ubuntu 12.04 LTS with kernel 3.2.0.31. Is there a way to update the driver, or can I apply a patch to improve the performance of the card? Otherwise, does anyone know if there is work in progress to improve compatibility with this chipset, or is it better to change the card and buy one with better driver compatibility? And finally, which wireless-N-compatible chipsets have good support under Linux/Ubuntu?

    Read the article

  • Google ranking - Modal views - google analytics events [duplicate]

    - by minchiya
    This question already has an answer here: How to diagnose a search engine ranking drop? I modified a site recently: I added many Google Analytics events, to better understand user behaviour, and I also added two buttons on almost all of the pages of the site. Those buttons show modal views (I am using Bootstrap) with questions about user opinion. These modal views are on almost all pages of the site. After this modification, the ranking of the site dropped in Google search from second place to the second page :( Is it the collected events or the added modal views? If the modal views are the reason, then how can I better do similar surveys? Have you had a similar experience, or an explanation for this? Perhaps it is the effect of the Panda 4 update. In this case, what can I look at to improve the site? How can I debug the problem/reasons?

    Read the article

  • How to recognize my performance plateau?

    - by Dat Chu
    A performance plateau happens right after one becomes "adequately" proficient at a certain task. For example, you learn a new language/framework/technology and you become better progressively. Then all of a sudden you realize that you have spent quite some time on this technology and you are not getting better at it. As a programmer who is conscious of my performance/knowledge/skill, how do I detect when I am in a performance plateau? What can I do to jump out of it (and keep going upward)?

    Read the article

  • large product image structured data and visibility

    - by Mark Resølved
    On an eCommerce site we have two images for a product: one medium-sized image shown at the top of the page, and one large photo shown on click in an overlay. We use http://schema.org/Product microdata on the page. We'd like the large, initially hidden photo to be the main image for the product, as it's the better-looking one, so it's also referenced in the XML sitemap as <image:image>. So we also put the itemprop="image" attribute on the hidden large image. But I'm wondering: is it a bad idea to use a microdata attribute on a hidden style="display:none;" element? Is there a better way to embed the main image in terms of SEO, without showing it initially?

    Read the article

  • Integrated ads in phone apps - how to avoid wasting battery?

    - by Jarede
    Considering the PCWorld review that came out in March, Free Android Apps Packed with Ads are Major Battery Drains: "...Researchers from Purdue University in collaboration with Microsoft claim that third-party advertising in free smartphone apps can be responsible for as much as 65 percent to 75 percent of an app's energy consumption..." Is there a best practice for integrating advert support into mobile applications, so as to not drain the user's battery too much? "...When you fire up Angry Birds on your Android phone, the researchers found that the core gaming component only consumes about 18 percent of total app energy. The biggest battery suck comes from the software powering third-party ads and analytics, accounting for 45 percent of total app energy, according to the study..." Has anyone found better ways of keeping away from the "3G Tail", as the report puts it? Is it better (or even possible) to download a large set of adverts that are cached for a few hours, and use them to populate your ad space, to avoid constant use of the Wi-Fi/3G radios? Are there any best practices for the inclusion of adverts in mobile apps?
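
    One hedged sketch of the batch-and-cache idea mentioned above (my own, platform-agnostic C# illustration; the ad endpoint URL and newline-separated payload format are made up): fetch a batch of adverts once, keep them for a few hours, and rotate through the cache so the radio is not woken up for every impression.

        // Sketch of "prefetch a batch of ads, then serve from cache for a few hours".
        // The endpoint and payload format are hypothetical placeholders.
        using System;
        using System.Collections.Generic;
        using System.Net.Http;
        using System.Threading.Tasks;

        class AdCache
        {
            private static readonly HttpClient Http = new HttpClient();
            private readonly TimeSpan _ttl = TimeSpan.FromHours(3);   // refresh interval
            private List<string> _ads = new List<string>();
            private DateTime _fetchedAt = DateTime.MinValue;
            private int _next;

            public async Task<string> GetAdAsync()
            {
                // Hit the network only when the cached batch has expired (or is empty),
                // so the Wi-Fi/3G radio is used a few times a day instead of per impression.
                if (_ads.Count == 0 || DateTime.UtcNow - _fetchedAt > _ttl)
                {
                    var payload = await Http.GetStringAsync("https://ads.example.com/batch"); // hypothetical endpoint
                    _ads = new List<string>(payload.Split('\n'));
                    _fetchedAt = DateTime.UtcNow;
                    _next = 0;
                }
                return _ads[_next++ % _ads.Count];
            }
        }

    Whether this actually saves energy (and whether cached impressions are allowed) depends on the ad network's terms, which is outside the scope of the sketch.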

    Read the article
