Search Results

Search found 24350 results on 974 pages for 'bug a lot'.


  • Will USMT 4.0 in MDT 2010 Move/Migrate the .NK2 File for Outlook?

    - by Mitch
     We're about to begin a refresh project for about 100 XP Pro laptops and have a concern with regard to the .NK2 file, which holds Outlook's cached email addresses. If possible we'd like to have USMT move/migrate this, but I can't find anything confirming that it happens automatically or has been done before. I see lots of manual processes, but at this point I'm not sure we can use those. Has anyone done this or seen it done? Perhaps you can point me to a resource that gives an idea of how it's done? Any information would be appreciated. USMT seems to get a lot of the details, so missing this part seems odd. Thanks in advance for any responses.
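
     For reference, USMT 4.0 accepts custom migration XML passed to ScanState/LoadState with /i:. A minimal sketch of a rule that would pick up the autocomplete cache (assuming the default NK2 location under the user's Application Data; the urlid and display name here are arbitrary placeholders):

         <migration urlid="http://example.com/migrate-nk2">
           <component type="Documents" context="User">
             <displayName>Outlook NK2 autocomplete cache</displayName>
             <role role="Data">
               <rules>
                 <include>
                   <objectSet>
                     <!-- USMT pattern syntax: directory [filespec] -->
                     <pattern type="File">%CSIDL_APPDATA%\Microsoft\Outlook\* [*.NK2]</pattern>
                   </objectSet>
                 </include>
               </rules>
             </role>
           </component>
         </migration>

     You would then run ScanState and LoadState with an extra /i:nk2.xml argument; whether the default MigUser.xml already covers this file is exactly the open question here.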

    Read the article

  • Is the cooling fan on a 2004 graphics card really necessary?

    - by Andrew
     I have an old GeForce 6600 card in my computer, circa 2004. Recently the fan has started playing up and making loud, irritating noises, and I've tried oiling it with no luck. This is the second fan I've put on the card; the stock one broke ages ago. Is a card this old really likely to need a cooling fan, or can I remove it altogether? It has a decent heatsink on the chip, but there's not a lot of airflow in that part of the box. Edit: I should add that I seem to remember most mid-range graphics cards at the time I bought mine didn't have fans (pretty sure they had heatsinks only), which is why I'm wondering.

    Read the article

  • Ubuntu from console/command-line/shell

    - by Xolve
     Early Linux distros, though they required a lot of manual work, were quite good to use from the command line. If the X server didn't start, or you just wanted a shell to work in, they all supported it: the network was configured by init, sound was up and ready, and newly inserted devices were configured with their configuration placed in fstab. There were also small scripts on many distros that used windows under X but switched to ncurses on the console. Now all of this needs a GUI with a desktop manager (KDE, GNOME), because the new paradigms (NetworkManager, hal, etc.) require a GUI. So from the command line alone you have to be root (it seems they believe only geeky admins need that) and edit config files or type long commands. Is there any way to make this easy in Ubuntu from the shell again?
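
     For the network part at least, the old init-style path still exists on Ubuntu: a stanza in /etc/network/interfaces bypasses NetworkManager entirely. A minimal sketch, assuming the interface is eth0:

         auto eth0
         iface eth0 inet dhcp

         # or, for a static address:
         # iface eth0 inet static
         #     address 192.168.1.10
         #     netmask 255.255.255.0
         #     gateway 192.168.1.1

     After editing the file, sudo ifup eth0 brings the interface up from a plain console.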

    Read the article

  • OpenGL render to texture causing edge artifacts

    - by mysticalOso
     This is my first post here, so any help would be massively appreciated :) I'm using C++ with SDL and OpenGL 3.3. When rendering directly to the screen I get the correct result, but when I render to texture I get edge artifacts (screenshots were attached in the original post). Anti-aliasing is turned off for both. I'm guessing this has something to do with depth buffer accuracy, but I've tried a lot of different methods to improve the result with no success :( I'm currently using the following code to set up my FBO:

         GLuint frameBufferID;
         glGenFramebuffers(1, &frameBufferID);
         glBindFramebuffer(GL_FRAMEBUFFER, frameBufferID);

         glGenTextures(1, &coloursTextureID);
         glBindTexture(GL_TEXTURE_2D, coloursTextureID);
         glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, SCREEN_WIDTH, SCREEN_HEIGHT, 0, GL_RGB, GL_UNSIGNED_BYTE, NULL);
         glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
         glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
         glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
         glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

         // Depth buffer setup
         GLuint depthrenderbuffer;
         glGenRenderbuffers(1, &depthrenderbuffer);
         glBindRenderbuffer(GL_RENDERBUFFER, depthrenderbuffer);
         glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, SCREEN_WIDTH, SCREEN_HEIGHT);
         glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthrenderbuffer);

         glFramebufferTexture(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, coloursTextureID, 0);
         GLenum DrawBuffers[1] = {GL_COLOR_ATTACHMENT0};
         glDrawBuffers(1, DrawBuffers);

         if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
             return false;

     Thank you so much for any help :)
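
     One thing worth ruling out (a guess based on the setup code above, not a confirmed diagnosis): if the glViewport during the FBO pass doesn't match the texture size, or the FBO is still bound when the final quad is drawn, edge texels get resampled and can produce exactly this kind of fringe. A minimal sketch of the render loop, assuming the FBO texture is SCREEN_WIDTH x SCREEN_HEIGHT:

         // Pass 1: render the scene into the FBO at the texture's exact size
         glBindFramebuffer(GL_FRAMEBUFFER, frameBufferID);
         glViewport(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT);
         glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
         // ... draw scene ...

         // Pass 2: back to the default framebuffer before drawing the fullscreen quad
         glBindFramebuffer(GL_FRAMEBUFFER, 0);
         glViewport(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT);
         glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
         // ... draw quad sampling coloursTextureID ...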

    Read the article

  • Weird RAM Upgrade Experience

    - by Axel Isouard
     I have an HP EliteBook 8540p laptop that originally had 4GB RAM, and I've recently bought a Corsair Value Select SO-DIMM 16GB kit (2x 8GB, DDR3 1333 MHz). It fits the required RAM specifications perfectly, and once the modules are inserted the BIOS recognizes the memory correctly, but my Gentoo Linux running kernel 3.1.6 SMP x86_64 crashes immediately when I run an app that consumes a lot of memory. The laptop dies as if the battery were pulled once memory use reaches about 6000MB. Windows 7 won't run any more either: it shows a blue screen with the IRQL_NOT_LESS_OR_EQUAL error with 8GB installed, and it doesn't boot at all with 16GB. Is there something I can do to fix this, please?

    Read the article

  • 12.10 install overwrote my Windows partition

    - by Niall C
     Recently decided to switch back to Ubuntu. I have a 3TB drive which was running Win7 with three partitions: C: for Windows, plus D: and E: for data. Having installed Ubuntu before, I 'thought' I knew what I was doing. Using UNetbootin I installed from a USB stick. I didn't choose the default options, but I didn't choose the manual install either. I can't remember which option I took, but I figured at some stage it would tell me how it was going to partition the disk, and at that point I would see whether it had recognised the NTFS partitions and could abort if it hadn't. Unfortunately it didn't ask: it just went ahead, installed Ubuntu, and made up its own mind about how to partition the disk. Usual story: the two NTFS data partitions weren't backed up. Is there anything I can do to retrieve the NTFS data? I'm currently trying out TestDisk, and I know I can use PhotoRec to retrieve certain file types, but all the filenames will be lost and it's going to take a hell of a lot of time to rename them all. Any help or assistance would be more than gratefully accepted. Thanks in advance, Niall
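
     For anyone in the same spot, the TestDisk route mentioned above is driven from a curses menu; a sketch of the invocation, assuming the disk shows up as /dev/sda on the live system:

         sudo apt-get install testdisk
         sudo testdisk /dev/sda    # Analyse -> Quick Search / Deeper Search for the lost NTFS partitions

     If TestDisk can find the old partition boundaries it can often restore the partition table, which keeps filenames intact, unlike the PhotoRec carve-everything approach.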

    Read the article

  • How to configure Nginx to serve a variety of back-ends via multiple FCGI processes?

    - by Ben Horton
     I've seen a lot of tutorials showing how to set up PHP/Python/Perl/RoR on nginx via various FCGI processes, but none of the tutorials I found show how to serve multiple FCGI services off one server. How would one configure the stable nginx (nginx-0.7.64) to serve multiple FCGI processes, one for each of the above languages? Example addresses for each FCGI process are as follows:

         127.0.0.1:8080 - PHP
         127.0.0.1:8081 - Python
         127.0.0.1:8082 - Perl
         127.0.0.1:8083 - Ruby on Rails

     An example configuration file that shows how to implement multiple FCGIs off one server is really what I need. Perhaps others will benefit as well.
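
     A sketch of the kind of configuration being asked for, routing each location to its own backend (the server name, root, and the path/extension conventions are illustrative assumptions, not the only way to split traffic):

         server {
             listen       80;
             server_name  example.com;          # placeholder
             root         /var/www;             # placeholder

             location ~ \.php$ {                # PHP
                 include       fastcgi_params;
                 fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                 fastcgi_pass  127.0.0.1:8080;
             }
             location /py/ {                    # Python (e.g. Django via flup)
                 include       fastcgi_params;
                 fastcgi_pass  127.0.0.1:8081;
             }
             location ~ \.pl$ {                 # Perl
                 include       fastcgi_params;
                 fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                 fastcgi_pass  127.0.0.1:8082;
             }
             location /rails/ {                 # Ruby on Rails
                 include       fastcgi_params;
                 fastcgi_pass  127.0.0.1:8083;
             }
         }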

    Read the article

  • System in low graphics, deleted linux, grub rescue, can't access windows

    - by First timer
     So I'm pretty new to Ubuntu, but I managed to install it with no big problems on both my desktop and my netbook. When I installed it on my brother's netbook, everything went horribly wrong, and now I fear the system is close to beyond repair. The first problem was that it said it did not have any space left (which seemed ridiculous, since it had a lot). Then Ubuntu began booting into a 'System is running in low-graphics mode' error, which I tried to fix using all the tips I could find on here, but nothing helped. I think the graphics error and the lack of space might have been related, but I can't be sure. Finally I gave up repairing Ubuntu and went for a reinstall. I shouldn't have done that! I read that I should boot Ubuntu from a live USB and use GParted to delete the Linux partitions, so I did, and rebooted accordingly. Next I went to install Ubuntu, but now I am only given the option to wipe the whole disk for Ubuntu, not to install alongside Windows 7. In GParted I can still see the NTFS partitions that hold Windows 7 (there are two: one labeled RECOVERY and another labeled OS and boot), so why can't I access them? Btw, the OS partition has a little red mark with a warning that one cluster is referenced multiple times; I don't know what that means. If I boot without the live USB I am sent directly into a grub rescue black screen where the computer will follow no orders. Please, I know the easiest option might be to simply wipe the whole thing clean, but there are important files and programs on Windows 7. Is there a way to just access Windows? It is a Dell Inspiron 1018 mini netbook, so I have no CD drive and no Windows 7 installation CD.
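
     Before any reinstalling, the Windows files can usually be copied off from the live USB session itself; a sketch, assuming the OS partition turns out to be /dev/sda3 (check with fdisk first, the device name here is a guess):

         sudo fdisk -l                                        # identify the two NTFS partitions
         sudo mkdir -p /mnt/windows
         sudo mount -t ntfs-3g -o ro /dev/sda3 /mnt/windows   # mount read-only and copy files out

     The 'cluster referenced multiple times' warning usually means the NTFS volume needs a chkdsk from a Windows environment; mounting read-only avoids making that worse while you rescue the files.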

    Read the article

  • Backup / Disaster Recovery, should I store RAR-compressed files?

    - by moraleida
     I'm in the process of recovering files from an accidentally formatted Ext4 partition using PhotoRec. It had about 300GB of data, of which I've already recovered about 30GB. So far, the recovery of RAR-compressed files has been much more successful than the recovery of individual uncompressed files and ZIP-compressed files, in the sense that a lot of the recovered plain files and ZIPs were unreadable while pretty much all of the RAR files were intact. Is there such a relation? Are RAR-compressed files really less prone to corruption, and thus easier to recover?
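
     There is at least one concrete mechanism behind this impression: RAR archives can carry an optional recovery record, which the rar tool can later use to repair damage. A sketch with the command-line tool (the 5% figure and file names are arbitrary examples):

         rar a -rr5% backup.rar important-files/   # create an archive with a 5% recovery record
         rar r backup.rar                          # later: repair a damaged archive from that record

     If the recovered RARs were created without -rr, the difference may simply be that a RAR's contiguous structure carves more cleanly under PhotoRec than ZIP's per-file entries plus end-of-file central directory.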

    Read the article

  • Brightness going up to 100% on loading certain websites in Chrome

    - by picheto
     I'm using Google Chrome version 21.0.1180.89 on Ubuntu 12.04, and my laptop is a Sony VAIO VPCCW15FL (spec sheet). My video driver is the proprietary 'NVIDIA accelerated graphics driver (post-release updates) (version-current updates)'. After installing Ubuntu, I discovered that neither the hardware brightness buttons nor the software brightness slider worked, and found I could get the hardware buttons working by installing the nvidiabl.deb package and the oBacklight script. I'm using nvidiabl-dkms 0.77 and oBacklight 0.3.8. The slider in the Ubuntu Settings still does not work, but I don't care. The annoying thing happens when loading certain pages in Google Chrome: the brightness goes up to 100% when loading the page or when leaving it (closing the tab or typing a different URL in the omnibox). However, the default brightness notification remembers the level it was set to, so if I adjust the brightness with the hardware buttons, the level gets adjusted relative to the value it had before 'going 100%'. I disabled the Flash PPAPI plugin but left the NPAPI plugin enabled, and the problem went away for pages with Flash content. Still, the same thing happens when viewing HTML5 video, or when loading, for example, the Chrome Web Store, or when using the Scratchpad extension. I suppose it has to do with the rendering of certain elements using the GPU, but this is just a guess. This brightness thing does not happen in Firefox 15.0 or any other application I have used yet. Does anybody know why this may be happening, and what I could do to fix it without changing browser? Thanks a lot.

    Read the article

  • Is it worth moving from Microsoft tech to Linux, NodeJS & other open source frameworks to save money for a start-up?

    - by dormisher
     I am currently getting involved in a startup. I am the only developer involved at the moment, and the other guys are leaving all the tech decisions up to me. For my day job I work at a software house that uses Microsoft tech day to day: .NET, SQL Server, Windows Server, etc. However, I realise that as a startup we need to keep costs down, and after a brief look at the cost of Windows hosting I was shocked to see some of the prices for a dedicated server. The cheapest I found was £100 a month, and if the business needs to scale in the future and we end up needing multiple servers, we could end up shelling out tens of thousands of pounds a year in SQL Server / Windows Server licenses. I then had a quick look at the price of Linux hosting for a dedicated server and saw the price was way lower than Windows hosting; one place was offering a machine with 2 cores for less than £20 a month. This got me thinking that maybe the way to go is open source on Linux. As I write a lot of JavaScript at work (I'm working on a single-page Backbone app at the moment), I thought NodeJS and a web framework like Express would be cool to use, and instead of SQL, why not use an open-source NoSQL database like MongoDB, which has great support on NodeJS? My only concern is that some of the work the application is going to do is dynamically building images and various other image-related stuff, i.e. work that is quite CPU-heavy, so I'm thinking of writing anything CPU-heavy in C++ and consuming it as a module in Node. That's the background; basically, is Linux a good match for hosting a NodeJS/Express site, compiling C++ Node modules, and using a NoSQL DB like MongoDB? And is it a good idea to move to these unfamiliar technologies to save money?
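
     On the C++-addon point specifically, Linux is the path of least resistance: native Node modules build with node-gyp against the ordinary GCC toolchain. A sketch (the addon directory name is a placeholder; it would contain a binding.gyp and the C++ sources):

         sudo apt-get install build-essential python
         npm install -g node-gyp
         cd my-image-addon/
         node-gyp configure
         node-gyp build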

    Read the article

  • Neural network input preprocessing

    - by TND
     It's clear that the effectiveness of a neural network depends strongly on the format of the input you give it. You want to preprocess it into the most convenient form you can algorithmically get to, so that the network doesn't have to account for that itself. I'm working on a little project that (surprise!) is going to be using neural networks, and my future goal is to eventually use NEAT, which I'm really excited about. Anyway, one of my ideas involves entities moving in continuous 2D space, viewed from a top-down perspective (this would be a really cool game AI). Of course, unless these guys are blind, they're going to be able to see the world around them, and there are a lot of different ways this information could be fed into the network. One interesting but expensive way is to simply render a top-down 'view' of things, with the entities as dots on the picture, and feed that in. I was hoping for something much simpler to use (at least at first), such as a list of the k (maybe 7 or so) nearest entities and their positions in relative polar coordinates, orientation, health, etc., but I'm trying to think of the best way to do it. My first instinct was to order them by distance, which would inherently also train the network to consider the nearer ones more 'important'. However, I was thinking: what if two entities are at nearly the same distance? They could easily alternate indexes in that list, confusing the network. My question is, is there a better way of representing this? Essentially, the network needs a good way of keeping track of who's who while receiving relevant information about the list of entities it can see. Thanks!
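
     For what it's worth, the fixed-slot, distance-sorted encoding described above might look roughly like this (a minimal C++ sketch; the Entity fields, slot count, and lack of normalisation are illustrative assumptions, not from the post):

         #include <algorithm>
         #include <cmath>
         #include <vector>

         struct Entity { float x, y, heading, health; };

         // Encode the k nearest entities, sorted by distance, as a flat input vector:
         // [distance, relative angle, heading, health] per slot; unused slots stay zero.
         std::vector<float> encode(const Entity& self, std::vector<Entity> others, std::size_t k) {
             std::sort(others.begin(), others.end(), [&](const Entity& a, const Entity& b) {
                 return std::hypot(a.x - self.x, a.y - self.y) <
                        std::hypot(b.x - self.x, b.y - self.y);
             });
             std::vector<float> input(4 * k, 0.0f);
             for (std::size_t i = 0; i < k && i < others.size(); ++i) {
                 float dx = others[i].x - self.x, dy = others[i].y - self.y;
                 input[4 * i + 0] = std::hypot(dx, dy);
                 input[4 * i + 1] = std::atan2(dy, dx) - self.heading; // relative polar angle
                 input[4 * i + 2] = others[i].heading;
                 input[4 * i + 3] = others[i].health;
             }
             return input;
         }

     Binning entities by angular sector (one slot per sector) instead of by distance rank is one way to avoid the index flicker the post worries about, at the cost of a coarser view.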

    Read the article

  • What's a good secure Windows FTP server?

    - by Keith Nicholas
     What's a good FTP server? I have been running FileZilla Server, which seems OK-ish, but I've noticed that a lot of people try to hack FTP servers, and FileZilla has only very basic controls to prevent people from getting in. (So far no one has actually managed to break in, so that's good!) I was wondering if there are better options out there? I'm especially interested in recommendations from people who know they get targeted by hackers.

    Read the article

  • Google docs spreadsheet not loading

    - by Pythonista's Apprentice
     I have a Google spreadsheet with a lot of very important data and some scripts that were working well. At some point the browser crashed and I reloaded the page. Since then I can't access that (only that!) spreadsheet any more. I've tried from other Google accounts, but it doesn't work: all I get is the 'Loading...' message in the browser tab, and the loading process never completes. I also can't copy the file or download it. That means I could lose all the information in my spreadsheet and also all my scripts. (I never thought something like this could happen with a Google product.) How can I solve this problem? Thanks in advance for any help!

    Read the article

  • How to stop domain users from installing any software?

    - by Chris
     Hi everyone, I was wondering which policies, etc., I could set up to stop any software installations from occurring in a Server 2003 domain environment. I have Server 2003 R2 and XP Pro clients. I guess the quick, easy way is to make everyone guests, but that also blocks them from other things they might need to do or access. I've seen a lot of ideas, but they do not fully block everything. I know there probably isn't a fix-all, but I would like to get as close as possible. Thank you all,
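
     One concrete piece of this (a partial measure, since it only blocks Windows Installer packages, not arbitrary setup.exe files) is the 'Disable Windows Installer' policy under Computer Configuration > Administrative Templates > Windows Components > Windows Installer, which corresponds to this registry value:

         Windows Registry Editor Version 5.00

         [HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\Installer]
         "DisableMSI"=dword:00000002

     For everything else, Software Restriction Policies (available on Server 2003 and XP) are usually the mechanism that gets closest to blocking all installs.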

    Read the article

  • Skip CodedUI Tests, use Selenium for Web Automation

    - by Aligned
     Originally posted on: http://geekswithblogs.net/Aligned/archive/2013/10/31/skip-codedui-tests-use-selenium-for-web-automation.aspx I recently joined a team that was using Agile methodologies to create a new product. They have a working beta product after ten or so two-week sprints, and had already been through several iterations of their UI. As a result, the QA team was falling behind with automated tests, and I was tasked with helping them catch up and expand their coverage. The project is a website. I heard many complaints about how hard it is to work with CodedUI (writing our own code, not relying on the recorder, as we wanted reusable and more maintainable code), and then it took me 4+ hours to fix one issue. It was hard to traverse the UI map and debug the objects with breakpoints... I said out loud, "there has to be a better way, or a framework that uses jQuery to run through the tests." Plus it seemed really slow (wait... finding the object... wait... start putting in text...), and some tests would randomly fail on the test agents (using the test settings and an automated build, they are run on VMs using Microsoft test agents). Enough complaining. Selenium to the rescue (mostly). The lead QA guy decided to try it out and we haven't turned back. We are now running tests in Chrome and Firefox, and they run a lot faster. We had IE running too, but some of those tests, while running fine locally, were hanging on the test agents. I'll add some hints and lessons learned in a later post.
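
     For flavour, part of the appeal is that a Selenium test is just plain WebDriver calls; a minimal sketch using the Java bindings (the URL and element IDs are invented, and the .NET bindings a Microsoft shop would likely use read almost identically):

         import org.openqa.selenium.By;
         import org.openqa.selenium.WebDriver;
         import org.openqa.selenium.firefox.FirefoxDriver;

         public class LoginSmokeTest {
             public static void main(String[] args) {
                 WebDriver driver = new FirefoxDriver();          // or ChromeDriver
                 driver.get("http://example.com/login");
                 driver.findElement(By.id("username")).sendKeys("demo");
                 driver.findElement(By.id("password")).sendKeys("secret");
                 driver.findElement(By.id("loginButton")).click();
                 System.out.println(driver.getTitle());           // a real test would assert on this
                 driver.quit();
             }
         }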

    Read the article

  • Pros and cons of PHP vs C/C++ as the language in a programming interview

    - by DhruvPathak
     Hi all, though this is a matter of personal choice and comfort, I would like your views on a situation like this. Programmer A has been working in PHP for some years and had prior experience with C/C++ during algorithm courses at university. His current fluency in PHP is good, but the C/C++ could be brushed up. So, for interviews with major companies that put a lot of emphasis on algorithms and data structures (e.g. binary trees, linked lists, arrays, strings), what should Programmer A do: try to implement those things in PHP (which is generally more suited to web development than to programming contests/interviews), or brush up the C/C++ skills and keep them as the primary tool for tackling interview questions? What are the advantages and disadvantages of each language in an environment like a programming contest or an interview? Why would you recommend, or not recommend, that Programmer A participate in a contest like Google Code Jam or ACM ICPC using PHP instead of C++ (assuming PHP is allowed as a language there)?

    Read the article

  • Using R on your Oracle Data Warehouse

    - by jean-pierre.dijcks
     Since Predictive Analytics World is in our backyard (or are we in San Francisco's backyard...?), I figured it is well worth the time to dust off some old but important news. With big data (should we start calling it "any data analytics" instead?) being the buzzword and analytics the key operative goal, not moving data around is becoming more and more critical to business users. Why? Because instead of spending time moving data into your next analytics server, you should be running analytics on those CPUs. You could always do this with Oracle Data Mining within the Oracle Database, but a lot of folks want to leverage R as their main tool. Well, this article describes how you can do that, and has since 2010... As Casimir Saternos concludes in the article: "There is a growing awareness of the need to effectively analyze astronomical amounts of data, much of which is stored in Oracle databases. Statistics and modeling techniques are used to improve a wide variety of business functions. ODM accessed using the R language increases the value of your data by uncovering additional information. RODM is a powerful tool to enable your organization to make predictions, classify data, and create visualizations that maximize effectiveness and efficiencies." Happy analysis!

    Read the article

  • Ubuntu 12.04 won't shut down - stopping winbind daemon

    - by jan
     My Precise Pangolin sometimes won't shut down: the screen goes black with text on it, and mostly the last line says something like "stopping winbind daemon" (sometimes it's VirtualBox, which is listed above the winbind daemon; edit: sometimes the last line says "running unattended updates"), and it stays like this for about ten minutes. Then I usually hold the power button for 5s to force it off. It's very unpredictable: sometimes the computer shuts down without problems and sometimes it hangs. I've tried many ways to shut it down: the hardware button, the panel applet, sudo shutdown -h now, sudo poweroff, sudo halt, etc. Even sudo reboot, or restart from the panel applet, has this problem. Sometimes it works OK, but every method named has hung at least once on the same (damned) line. My specs: Fujitsu Siemens Lifebook E8310, Intel Core2 Duo T7300 @ 2.00GHz, 3GB RAM, GPU: Mobile Intel 965 Express Chipset Family, Ubuntu 12.04.2 32-bit, 3.5.0-41-generic kernel (but it did this on older kernels and 12.04.x releases too). Any ideas what I should try next? Thanks a lot! Jan
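
     If winbind isn't actually needed on this machine (it is only required for Samba/Active Directory name resolution), one hedged thing to try from a terminal:

         dpkg -l winbind                # is it even installed, and what version?
         sudo service winbind stop      # stop it for the current session
         sudo apt-get remove winbind    # or remove it entirely if nothing depends on it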

    Read the article

  • Job queueing in Toast Titanium 10?

    - by moonslug
     I have a bunch of .MP4 video files I'm burning to DVD-Video using Toast Titanium 10 on my MacBook Pro. Right now I'm doing them one at a time, and because my computer is several years old, encoding the video for a single DVD takes approximately six hours. I've discovered that I can apparently encode the video directly to a .toast format; however, I have yet to figure out whether I can burn these directly to DVD. Also, I have quite a bit of video left to burn, and even that method would require me to intervene manually to start a new encoding or burn job every six hours. Would it be possible to somehow queue up multiple DVD-Video encoding jobs at once and have the computer work through them automatically? The actual writing to DVD disc doesn't take nearly as long, and if all my video were encoded to begin with, the job would be a lot quicker. Maybe this can be accomplished with a different piece of software?

    Read the article

  • Domain migration - 301 redirect of all contents of a directory

    - by Trufa
     Hi, I would like to know if the following is possible, considering that I would like to migrate domains. I have, let's say:

         one.com/files/one.html
         one.com/files/two.php
         one.com/other/three.html
         one.com/other/four.doc
         one.com/other/subdirectory/five.doc

     I am migrating to two.com, so I would like to set up respective 301 redirects to the following:

         two.com/old/files/one.html
         two.com/old/files/two.php
         two.com/old/other/three.html
         two.com/old/other/four.doc
         two.com/old/other/subdirectory/five.doc

     I've tried with cPanel, and although I come "close" with the redirects option, I can't seem to make it happen. There aren't many folders (10-12), but there are a lot of files, so doing it manually is obviously impossible. How would you proceed? Can this, or should this, be done with a regex in .htaccess? Can you redirect all the elements of a subdirectory in the manner expressed above? I hope the question is clear enough; if not, please ask for any clarification needed! Thanks in advance!!
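
     For the .htaccess route, a single mod_alias rule can express all of these at once; a sketch, assuming everything on one.com should land at the same path under /old/ on two.com:

         # 301-redirect everything on one.com to the same path under two.com/old/
         RedirectMatch 301 ^/(.*)$ http://two.com/old/$1

         # or, limited to the listed directories only:
         # RedirectMatch 301 ^/(files|other)/(.*)$ http://two.com/old/$1/$2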

    Read the article

  • Wrong owner and group for files created under a samba shared directory

    - by agmao
     I am trying to make writing to a shared Samba directory work, and I've run into a very weird problem. The shared directory is now writable from a client machine, but the files created under it get the wrong owner and group names. I am writing to the shared directory as user mike on the client machine, but the files created always have the user and group name steve instead... Does anybody know why that would happen? Another thing I just noticed is that on the Samba server itself, the files have the owner and group name samba, an account I created for Samba clients. Thanks a lot
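
     The symptom described is exactly what the force user / force group share options produce, so that is the first thing to check in smb.conf on the server (the share name and path here are placeholders):

         [shared]
            path = /srv/share
            writable = yes
            ; these two lines would make every client write as steve, regardless of who connects:
            force user = steve
            force group = steve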

    Read the article

  • Using Shell32 to extract Mimes on Windows Server 2008

    - by Léon Pelletier
     In a desktop app I'm using Shell32 to extract MIME info and ID3 tags. I want to do the same in ASP.NET on the server side with HTTP-posted files, plus fetch some other info, but since this is Windows Server 2008, not many media applications are installed, so I wonder whether it will still retrieve much information from the files. Will it be possible to do this server-side? If yes, will it fetch the MIME info without media apps installed? If not, is there some MIME pack that provides file information without the apps being installed? [EDIT] I installed VLC and Windows Media Player on the server, which makes all the MP3/movie info available (duration, artist, album, width, height, etc.), but I don't know if this is good practice.

    Read the article

  • Extracting httpdocs from Plesk Panel 9.5.4 Webserver backup file

    - by Paddington
     Good day, I am having problems manually extracting domains from a Plesk 9.5 backup that was FTPed onto my backup server. I have followed the article http://kb.parallels.com/en/1757 using method 2. The problem is at this step: zcat DUMP_FILE.gz > DUMP_FILE. My backup file, CP_1204131759.tar, is a tar archive, so zcat does not work on it. I proceeded to run cat CP_1204131759.tar > CP_1204131759 instead, but when I try # cat CP_1204131759 | munpack I get an error that munpack did not find anything to read from standard input. I went on to extract the tar backup using the xvf flags and got about 20 files similar to these:

         CP_sapp-distrib.7686-0_1204131759.tgz
         CP_sapp-distrib.7686-35_1204131759.tgz
         CP_sapp-distrib.7686-6_1204131759.tgz

     How best can I extract the httpdocs of a domain from this server-wide Plesk 9.5.4 backup?
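
     Since the outer file is a plain tar rather than a gzipped MIME dump, the zcat/munpack steps from the KB article can be skipped; a sketch of unpacking it directly and then searching the per-subscription pieces for the web content (file names as in the listing above):

         tar xvf CP_1204131759.tar                       # unpack the outer backup
         tar tzf CP_sapp-distrib.7686-0_1204131759.tgz   # list one piece; grep the output for httpdocs
         tar xzf CP_sapp-distrib.7686-0_1204131759.tgz   # extract the piece that contains the domain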

    Read the article

  • Advanced command line editing for Windows?

    - by Ben Collins
     I'm a developer who was "born and bred" on Linux and BSD systems, and I've become accustomed to having advanced tools on the console (POSIX shells like bash, for example). My career has taken a twist that means I'm working in a Windows environment most of the time, where the console capabilities are really poor by comparison. The traditional Windows console environment is a complete joke, and even most of the third-party attempts at improving things aren't a lot better. PowerShell is a huge step in the right direction, but the console applications themselves are still way behind where Unix was 20 years ago. Does anyone know of a PowerShell console application that supports advanced command-line editing the way POSIX shells do? I'm particularly interested in emacs-mode editing, and I'd also like to be able to resize my window to an arbitrary size, unlike the native console app that comes with Windows.
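
     For the emacs-mode editing specifically, the PSReadLine module may cover a lot of this (a sketch; it ships with newer PowerShell versions and is installable separately on older ones, so availability depends on your setup):

         # Load the module if it isn't loaded automatically
         Import-Module PSReadLine

         # Switch the line editor to emacs keybindings (Ctrl-A, Ctrl-E, Alt-F, ...)
         Set-PSReadLineOption -EditMode Emacs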

    Read the article
