Search Results

Search found 22625 results on 905 pages for 'must do better'.

Page 238/905

  • I can't get a native resolution of 1920x1080 on 11.10 (AOC f22 on an Nvidia GeForce GTS 450)

    - by Mikeeeee
    I have a problem where the highest resolution I can get is 1360x769. This is a 22 inch LCD monitor with a native resolution of 1920x1080_60. I have tried numerous drivers but nothing changed, and I tried editing the xorg.conf script with no success (I am a noob with Linux though). Running many commands in the terminal which I got from people with similar problems only gives me errors like "Failed to get size of gamma for output default". I also get an "EDID checksum is invalid" error at shutdown. I think there may be a communication problem between my screen's EDID and Ubuntu, although XP and Windows 7 detect my screen without any errors and automatically set the native resolution. Also, when I am installing Ubuntu I get a horrible screen flashing every few seconds until I have installed the Nvidia driver. PC specs if it helps: x64 OS, mainboard N68PV-GS, 4 GB RAM, AMD Phenom(tm) 9350e Quad-Core Processor × 4, Nvidia GeForce GTS 450 512 MB, hard drives set up in an onboard Nvidia RAID array, striped. I really need to get a better resolution, 1360x769 does not look nice on a 22 inch screen. Thanks.

    Read the article

  • My website's Google index suddenly increased and then suddenly dropped

    - by Jeg Bagus
    Yesterday before I went to sleep, I checked my site index: I had about 50 pages indexed on Google. This morning when I woke up, I had 250 pages indexed on Google, and my pages ranked better on several keywords. Then I added 1 page and 2 canonical links, added a 404 page header, and resubmitted the sitemap. After 2 hours, it went down to 50 indexed pages again, and my page ranking rolled back to the previous day's. What actually happened? Is it because I resubmitted the sitemap? Google is still crawling my website now. Are they trying to refresh the index?

    Read the article

  • Links in my site have been hacked

    - by Funky
    In my site I prefix the images and links with the domain of the site for better SEO, using the code below:
        public static string GetHTTPHost()
        {
            string host = "";
            if (HttpContext.Current.Request["HTTP_HOST"] != null)
                host = HttpContext.Current.Request["HTTP_HOST"];
            if (host == "site.co.uk" || host == "site.com")
            {
                return "http://www." + host;
            }
            return "http://" + host;
        }
    This works great, but for some reason lots of links have now changed to http://www.baidu.com/... There is no sign of this anywhere in the code or the project, and the files on the server have a change date from when I last published at 11 yesterday, so all the files on there look fine. I am using ASP.NET and Umbraco 4.7.2. Does anyone have any ideas? Thanks

    Read the article

  • I've totally missed the point of distributed vcs [closed]

    - by NimChimpsky
    I thought the major benefit of it was that each developer's code gets stored within each other's repository. My impression was that each developer has their working directory, their own repository, and then a copy of the other developers' repositories, removing the need for a central server, since you have as many backups as you have developers/repositories. Turns out this is not the case, and your code is only backed up (somewhere other than locally) when you push, the same as a commit in Subversion. I am a bit disappointed... hopefully I will be pleasantly surprised when it handles merges better and there are fewer conflicts?

    Read the article

  • Is there any way to simulate a slow connection between my server and an iPad (without installing anything on the server)?

    - by Clay Nichols
    Some of our webapp users have difficulty on slower connections. I'm trying to get a better idea of what that "speed barrier" is, so I'd like to be able to test a variety of connection speeds. I've found ways to do this on Windows but not on the iPad, so I'm looking more for some sort of proxy service that will work with any device (not running ON that device). I did find an article about using Charles Proxy and providing a connection to another device, but I was hoping for something simpler (it need not be free). Constraints: we are on a shared server, so we can't install anything and we are limited in our control over that server; and I'd like to test an iPad, an Android tablet, and a Windows PC.

    Read the article

  • Upgrading an app to support iOS5, 6 and 7

    - by drekka
    We are looking at an app that needs an upgrade. Currently it runs on iOS 4, 5 & 6. The upgrade will move it to iOS 5, 6 & 7. It will also involve some UI changes and new features. I've been reading up on iOS 7 and looking at things like Auto Layout. What we are trying to figure out is the best way to handle the differences between the various iOS versions. Auto Layout seems like a good idea, but it's not available on iOS 5. There are also API changes to consider between all 3 versions, and other new features of iOS 7. So the questions: How would you handle Auto Layout given that iOS 5 does not have it? Are there any significant differences between the SDKs that you think would cause issues? Would we be better off with separate code bases?

    Read the article

  • How can you represent equippable items in a 2D game?

    - by ThePlan
    I've been working on an item system for a post-apocalyptic RPG, with Diablo as inspiration, and it would be awesome if I could visually represent an item that can be equipped on the player sprite. I was thinking you could have a player sprite with certain animations, and the equipped item would then be drawn as if it were on the player, with the same animations, so it syncs with the player's animations - but that couldn't work very smoothly, and I imagine there's a better system. How can you graphically represent an item worn on the player, which moves like he does and looks as if he's wearing it? I'm not asking how to do it in framework X or platform X (although if you REALLY need it, I'm using Allegro 5 with Code::Blocks on Windows XP); I'm asking how to generally program such an idea.
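    A minimal sketch of the paper-doll idea described above, in Python rather than Allegro 5 / C++; every name here (the attachment offsets, draw_sprite, the sheet naming scheme) is made up for illustration. The trick is that the equipped item reuses the player's current animation name and frame index, so the two sprite sheets stay in sync automatically, and a per-frame offset table says where the item attaches.

        # Paper-doll sketch: the equipped item follows the player's animation state.
        ATTACH_POINTS = {
            # (animation, frame) -> (dx, dy) offset of the item relative to the player sprite
            ("walk", 0): (12, 20),
            ("walk", 1): (13, 21),
            ("walk", 2): (12, 22),
        }

        def draw_sprite(surface, sheet_key, x, y):
            """Stand-in for the engine's blit call (al_draw_bitmap in Allegro 5)."""
            print(f"draw {sheet_key} at ({x}, {y}) on {surface}")

        class Player:
            def __init__(self):
                self.x, self.y = 100, 100
                self.animation, self.frame = "walk", 0
                self.equipped = None             # e.g. "pipe_wrench"

            def update(self):
                self.frame = (self.frame + 1) % 3    # advance the animation

            def draw(self, surface):
                draw_sprite(surface, f"player_{self.animation}_{self.frame}", self.x, self.y)
                if self.equipped:
                    dx, dy = ATTACH_POINTS[(self.animation, self.frame)]
                    # Same animation name and frame index as the player, so no extra sync logic.
                    draw_sprite(surface, f"{self.equipped}_{self.animation}_{self.frame}",
                                self.x + dx, self.y + dy)

        p = Player()
        p.equipped = "pipe_wrench"
        for _ in range(3):
            p.draw("screen")
            p.update()

    The design choice to copy is that item sheets mirror the player sheets frame-for-frame; if the artwork is drawn to line up, the offsets can even all be (0, 0).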

    Read the article

  • AWS CloudFormations, Oracle Assembly Builder, Chef and Puppet

    - by llaszews
    I blogged about the differences and similarities between AWS CloudFormation and Oracle Assembly Builder for packaging your software stack for deployment/provisioning to the cloud. However, these tools do not deal with software stack versioning and configuration management. This is where tools like Chef and Puppet come into play. Puppet and Chef points of interest:
    1. They can be used in any cloud environment (Rackspace, private cloud, etc.).
    2. There is a debate about which is better. I am not going to get into this debate other than to say Puppet is more mature.
    3. AWS CloudFormation can integrate with both Chef and Puppet.
    A good blog on AWS CloudFormation and the need for something more: AWS CloudFormation

    Read the article

  • I'm looking for websites with programming problems that I can practice [on hold]

    - by Spentak
    I want to become a more skilled programmer, and I want to do it through Objective-C and iOS. What websites or programs do you know of that have problems I can solve (with the answers)? I have failed some programming tests for jobs (such as "Given 2 values in a binary tree, how do you find the lowest common ancestor?") and I want to become a better engineer. I have developed 54 iOS/Android apps to date, but my core CS skills are apparently rusty/bad. I have looked at TopCoder, but there aren't very many competitions going on, the website is terrible, and there does not appear to be anything that really supports Objective-C/iOS. Any websites?
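    For the interview question quoted above, the standard textbook answer is a recursive lowest-common-ancestor search over a plain binary tree. A small Python sketch (node layout and values are illustrative):

        class Node:
            def __init__(self, value, left=None, right=None):
                self.value, self.left, self.right = value, left, right

        def lowest_common_ancestor(root, a, b):
            """Deepest node whose subtree contains both values (assumes both are present)."""
            if root is None:
                return None
            if root.value in (a, b):
                return root
            left = lowest_common_ancestor(root.left, a, b)
            right = lowest_common_ancestor(root.right, a, b)
            if left and right:        # a and b sit in different subtrees, so root is the LCA
                return root
            return left or right      # both are on one side (or missing)

        #        1
        #       / \
        #      2   3
        #     / \
        #    4   5
        tree = Node(1, Node(2, Node(4), Node(5)), Node(3))
        print(lowest_common_ancestor(tree, 4, 5).value)  # 2
        print(lowest_common_ancestor(tree, 4, 3).value)  # 1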

    Read the article

  • How to generate "language-safe" UUIDs?

    - by HappyDeveloper
    I always wanted to use randomly generated strings for my resources' IDs, so I could have shorter URLs like this: /user/4jz0k1. But I never did, because I was worried about the random string generation creating actual words, e.g.: /user/f*cker. This brings two problems: it might be confusing or even offensive for users, and it could mess with the SEO too. Then I thought all I had to do was set up a fixed pattern, like adding a number every 2 letters. I was very happy with my 'generate_safe_uuid' method, but then I realized it was only better for SEO, and worse for users, because it increased the ratio of actual words being generated, e.g.: /user/g4yd1ck5. Now I'm thinking I could create a method 'replace_numbers_with_letters' and check that it hasn't formed any words against a dictionary or something. Any other ideas? PS: As I write this, I also realized that checking for words in more than one language (e.g. English, French, Spanish, etc.) would be a mess, and I'm starting to love numbers-only IDs again.
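    One hedged variant of the 'fixed pattern' idea above: instead of interleaving digits, draw the ID from an alphabet with no vowels, so real words are very unlikely to form in the first place, and optionally reject anything that still hits a blocklist. The function names, ID length, and blocklist below are illustrative only, not a recommendation of a specific scheme.

        import secrets

        # No vowels and no 0/O or 1/l look-alikes, so the result can hardly spell a word
        # in any language that needs vowels.
        SAFE_ALPHABET = "bcdfghjkmnpqrstvwxz23456789"
        BLOCKLIST = {"fck", "sht"}   # extend with a real word list per language

        def generate_safe_id(length=6):
            """Random short ID drawn only from consonants and digits."""
            return "".join(secrets.choice(SAFE_ALPHABET) for _ in range(length))

        def generate_checked_id(length=6):
            """Retry until the ID contains no blocklisted substring."""
            while True:
                candidate = generate_safe_id(length)
                if not any(bad in candidate for bad in BLOCKLIST):
                    return candidate

        print(generate_checked_id())   # e.g. 'k7mq2x'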

    Read the article

  • I want to make a PC for a library, and have some problems ))

    - by Doroff
    I use Ubuntu 12.04. To make the .desktop I used these instructions: http://www.instructables.com/id/Setting-Up-Ubuntu-as-a-Kiosk-Web-Appliance/step4/Set-up-Chromium/
    Problem 1: no user can load kiosk.desktop - they load ubuntu.desktop, and that property is set in home/user/.dmrc. How can I fix that problem? Once I put all the properties that I made for kiosk.desktop into ubuntu.desktop and it started to work... but for every created user, and after that I reinstalled the system.
    Problem 2: can I specify in the .desktop which programs users can use? If yes, how?
    Problem 3: which program is better to use as a proxy for Ubuntu 12.04?
    Sorry for my English, and thanks. Yuri

    Read the article

  • What steps can I suggest to achieve the best Geolocation result [migrated]

    - by Matt
    We are using geolocation (getCurrentPosition()) on a website to determine a user's position when they use our site from a mobile device. I want to write an article explaining how the user can obtain the best results. Am I correct in assuming: enabling GPS will yield the best results in rural areas (fewer buildings to obscure line of sight to the satellites), and enabling Wi-Fi will yield the best results in urban areas (generally more Wi-Fi hotspots available)? Is it true that Android phones get better results from silently harvesting Wi-Fi hotspot details? Any links to reference material on this are appreciated.

    Read the article

  • VS2010 tip: Cannot find the HttpUtility on .NET 4.0?

    Now that Visual Studio 2010 is here with more options and better colors, there are a few things that have changed a little and might confuse you a bit. For example, the good old HttpUtility: if you are creating a Windows Console application or a Windows Service and you want to use that awesome class full of goodies, you would try to go to References and add System.Web.dll, yet that DLL will be missing from the list. So did they remove that DLL in .NET 4? Not really, it is still there; however, there are now profiles...

    Read the article

  • Linux OpenGL programming, should I use GLX or any other?

    - by pahnin
    I'm new to OpenGL and found that there are a lot of libraries for doing it in C. I also found that GLX is the most friendly with the Linux X server. I just want to do basic stuff, and I cannot find any tutorials for GLX. Is GLX a bad thing? I just want to do some small graphical things without installing many libraries and getting confused. Can anyone suggest something which has tutorials and is simple to compile? I found a link with an example using GLX and it worked perfectly with no errors. Can anyone please suggest where I can find nice documentation or any better libraries?

    Read the article

  • Database Partitioning and Multiple Data Source Considerations

    - by Jeffrey McDaniel
    With the release of P6 Reporting Database 3.0, partitioning was added as a feature to help with performance and data management. Careful investigation of requirements should be conducted prior to installation to help improve overall performance throughout the lifecycle of the data warehouse, preventing future maintenance that would result in data loss. Before installation, try to determine how many data sources and partitions will be required, along with the ranges. In P6 Reporting Database 3.0 any adjustments outside of the defaults must be made in the scripts, and changes will require new ETL runs for each data source. Considerations:
    1. Standard Edition or Enterprise Edition of Oracle Database. If you aren't using Oracle Enterprise Edition Database, the partitioning feature is not available. Multiple data sources are only supported on Enterprise Edition of Oracle Database.
    2. Number of data source IDs for partitioning during configuration. This setting specifies how many partitions will be allocated for tables containing data source information. It requires some evaluation prior to installation, as there are repercussions if you don't estimate correctly. For example, say you configured the software for only 2 data sources and the partition setting was set to 2, but along came a 3rd data source. The necessary steps to accommodate this change are as follows:
    a) By default, 3 partitions are configured in the Reporting Database scripts. Edit the create_star_tables_part.sql script located in <installation directory>\star\scripts and search for "partition". You’ll see P1, P2, P3. Add additional partitions and sub-partitions for P4 and so on. These will appear in several areas. (See the P6 Reporting Database 3.0 Installation and Configuration guide for more information on this and on how to adjust partition ranges.)
    b) Run starETL -r. This will recreate each table with the new partition key. The effect of this step is that all table data will be lost except for history-related tables.
    c) Run starETL for each of the 3 data sources (with the data source #, e.g. starETL.bat "-s2", as defined in the P6 Reporting Database 3.0 Installation and Configuration guide).
    The best strategy for this setting is to overestimate based on possible growth. If during implementation it is deemed that there are at least 2 data sources with the possibility of growth, it is a better idea to set this setting to 4 or 5, allowing room for the future and preventing a ‘start over’ scenario.
    3. The number of partitions and the number of months per partition are not specific to multi-data source. These settings work in accordance with a sub-partition of larger tables with regard to time-related data, and they are dataset specific for optimization. The number of months per partition is self-explanatory: optimally, the smaller the partition, the better the query performance, so if the dataset has an extremely large number of spread/history records, a lower number of months is optimal. Working in accordance with this setting is the number of partitions, which determines how many "buckets" will be created for the number-of-months setting. For example, if you kept the default of 3 partitions and selected 2 months for each partition, you would end up with:
    - 1st partition, 2 months
    - 2nd partition, 2 months
    - 3rd partition, all the remaining records
    Therefore, with regard to this setting, it is important to analyze your source database spread ranges and history settings when determining the proper number of months per partition and number of partitions to optimize performance. Also be aware that the DBA will need to monitor when these partition ranges fill up and when additional partitions need to be added. If you reach the final range partition and there are no additional range partitions, all data will be included in the last partition.
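    As an illustration of the months-per-partition / number-of-partitions interaction described above (this is not the P6 Reporting Database scripts' logic, only the bucketing rule restated in Python with a made-up base date):

        from datetime import date

        def partition_bucket(record_date, base=date(2012, 1, 1),
                             months_per_partition=2, num_partitions=3):
            """Which range partition (1-based) a record date falls into; overflow goes to the last one."""
            months_elapsed = (record_date.year - base.year) * 12 + (record_date.month - base.month)
            return min(months_elapsed // months_per_partition + 1, num_partitions)

        print(partition_bucket(date(2012, 2, 15)))  # 1  (months 0-1)
        print(partition_bucket(date(2012, 3, 10)))  # 2  (months 2-3)
        print(partition_bucket(date(2013, 8, 1)))   # 3  (everything beyond the defined ranges)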

    Read the article

  • What is a good university for computer science and game development?

    - by DukeYore
    I am starting my computer science degree at a local community college, in programming using C++. However, I will be transferring to a 4-year university. Does anyone have any insight on university programs? I know Cal State Fullerton has a degree with a minor in Game Development; however, is that as important as getting a degree from a really great school? If I could shoot for something like Cal Poly, would that be better? Or even Stanford or SF State, being so close to so many gaming companies up there in the Bay Area?

    Read the article

  • How to run ubuntu-tweak's janitor automatically?

    - by Eliran Malka
    My aim is to have the janitor run at startup, with a pre-configured profile (e.g. clear redundant packages and browser cache). The website is lacking any documentation or usage instructions, and I could not find any information on this here either. I naturally tried to start ubuntu-tweak from the command line, hoping an additional API exists that would help with this (allegedly) simple task. I only got as far as: ubuntu-tweak -f janitor, which is a step in the right direction, but what's still missing is a command for the clear action. Is such a command available, or is there any better way of achieving the desired behavior?

    Read the article

  • If you should only have one assertion per test, how do you test multiple inputs?

    - by speg
    I'm trying to build up some test cases, and have read that you should try to limit the number of assertions per test case. So my question is: what is the best way to go about testing a function with multiple inputs? For example, I have a function that parses a string from the user and returns the number of minutes. The string can be in the form "5w6h2d1m", where w, h, d, m correspond to the number of weeks, hours, days, and minutes. If I wanted to follow the 'one assertion per test' rule, I'd have to make multiple tests for each variation of input. That seems silly, so instead I just have something like:
        self.assertEqual(parse_date('5m'), 5)
        self.assertEqual(parse_date('5h'), 300)
        self.assertEqual(parse_date('5d'), 7200)
        self.assertEqual(parse_date('1d4h20m'), 1700)
    in the one test case. Is there a better way?
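    One hedged answer, sticking with the unittest style already used above: keep the single logical assertion but run it once per input with subTest, so each failing case is still reported separately. The tiny parse_date here is only a stand-in for the real parser.

        import re
        import unittest

        MINUTES = {"w": 7 * 24 * 60, "d": 24 * 60, "h": 60, "m": 1}

        def parse_date(text):
            """Stand-in parser: '1d4h20m' -> total minutes."""
            return sum(int(n) * MINUTES[u] for n, u in re.findall(r"(\d+)([wdhm])", text))

        class ParseDateTests(unittest.TestCase):
            def test_parse_date(self):
                cases = [("5m", 5), ("5h", 300), ("5d", 7200), ("1d4h20m", 1700)]
                for text, expected in cases:
                    with self.subTest(text=text):
                        self.assertEqual(parse_date(text), expected)

        if __name__ == "__main__":
            unittest.main()

    pytest's @pytest.mark.parametrize achieves the same thing if you are not tied to unittest.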

    Read the article

  • Will search engines ever change to allow longer title and description tags? [closed]

    - by guisasso
    I was just wondering: the standard title length is 64 characters, while meta description tags are 150-160. I was thinking that it was probably done like that originally because of screen resolutions back in the day, which could not really fit a lot of content. Google still displays search results in an incredibly small area fixed to the left side of the browser, and its simplicity is probably what makes it so popular. With websites such as Bing displaying a richer, more vivid search experience, in your opinion, will search engines ever change to accept better and longer meta description tags and titles? (I'm asking because we work to accommodate their standards, but what if they change?)

    Read the article

  • Given two sets of DNA, what does it take to computationally "grow" that person from a fertilised egg and see what they become? [closed]

    - by Nicholas Hill
    My question is essentially entirely in the title, but let me add some points to prevent some "why on earth would you want to do that" sort of answers: This is more of a thought experiment than an attempt to implement real software. For fun. Don't worry about computational speed or the number of available memory bytes. Computers get faster and better all the time. Imagine we have two data files: Mother.dna and Father.dna. What else would be required? (Bonus point for someone who tells me approximately how many GB each file would be, and whether the size of the files is exactly the same number of bytes for everyone alive on Earth!) There would ideally need to be a way to see what the egg becomes as it grows into a human adult. If you fancy, feel free to outline the design. I am initially thinking that there'd need to be some sort of volumetric, voxel-based 3D environment for simulation purposes.

    Read the article

  • Is there an equivalent of RDP?

    - by detly
    The "Desktop Sharing" settings that come installed by default seem to use VNC. VNC is a bit of a bandwidth hog, can only work at the resolution of whatever screen is attached to the host, and mirrors every action on the host. (It also seems to work poorly with compositing, but maybe that's been fixed.) I know about X tunnelling, but that's annoying to use and doesn't always work properly (or, more accurately, some apps don't work properly). Is there any kind of protocol in between the two, similar to RDP used for Windows? Specifically, something that can run at a different resolution to the host screen and is a little lighter on the network? (Ideally, the more the protocol could have in common with RDP, the better.)

    Read the article

  • Help us with our git workflow

    - by Brandon Cordell
    We have a web application that gets deployed to multiple regions around our state, with an instance of the application for each region. We maintain a staging and a production (master) branch in our repository, but we were wondering what the best way is of maintaining each instance's codebase. It's similar at the core, but we have to give each region the ability to make specific requests that may not make it into the core of the application. Right now we have branches for each region, like region_one_staging and region_one_production. At the rate we're growing, we'll have hundreds of branches within the next few years. Is there a better way to do this?

    Read the article

  • C# on ubuntu 12.04

    - by Deus Deceit
    Is C# a good choice for Ubuntu programming, e.g. Unity, or applications that will run on Ubuntu? Am I doing well in wanting to learn C# when I'm determined to stick with Ubuntu and develop on it or for it? If not, can you give me reasons why, and which languages would be better than C# for Ubuntu development? I already know C, C++, Java (basics), PHP, MySQL, and Python (basics). I like to learn new stuff, but stuff that's worth my time. Is C# worth my time? If C# is worth my time, here's what I have done and what I need: I installed all the Mono packages I could find in the standard Ubuntu repositories. Now I want a good tutorial to get me started. I'm a complete noob with C#, so a basic tutorial and how to compile and run under Ubuntu 12.04 would be great.

    Read the article

  • Graphics hardware

    - by Vanangamudi
    Which vendor provides better GPGPU? My requirements are confined to rendering, utilising the GPU for BSDF building, for example. Intel started providing Ivy Bridge chipset GPUs, which are comparably fast to HD5960 cards. I'm not that against Nvidia or AMD, but I'm a fan of Intel. How does it compare to Nvidia in price and performance? If possible, may I know how all of them perform with OpenCL? I'm not sure if it is right to ask this here, but I don't know where else to ask.

    Read the article

  • D-fense! D-fense! ...for Java technology

    - by hinkmond
    Who needs defense when computing with the Java platform? Isn't "the best defense is a good offense"? At least in football and volleyball... See: The Best Defense Here's a quote: "The other Oracle tester page, Verify Java Version, consistently reports whether the latest version is installed. Just click the big red button to see if the 'recommended' version of Java is installed." So, go ahead and use Java technology! There is "nothing to fear but fear itself". I like that quote better. Hinkmond

    Read the article
