Search Results

Search found 18761 results on 751 pages for 'lot'.


  • Backup / Disaster Recovery, should I store RAR-compressed files?

    - by moraleida
    I'm in the process of recovering files from an accidentally formatted Ext4 partition using PhotoRec. It had about 300 GB of data, of which I've already recovered about 30 GB. So far, the recovery of RAR-compressed files has been much more successful than the recovery of individual uncompressed files and ZIP-compressed files - in the sense that a lot of the recovered plain files and ZIPs were unreadable, while pretty much all of the RAR files were intact. Is there really such a relationship? Are RAR-compressed files less prone to corruption and thus easier to recover?
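
    If you want to check how many of the recovered archives are actually intact, both unrar and unzip have test modes; a quick shell sketch (the recovery directory path is a placeholder) for batch-verifying a recovery directory:

        #!/bin/sh
        # List recovered RAR and ZIP archives that fail an integrity test
        RECUP_DIR=/mnt/recovery
        find "$RECUP_DIR" -iname '*.rar' | while read -r f; do
            unrar t -inul "$f" || echo "BROKEN: $f"
        done
        find "$RECUP_DIR" -iname '*.zip' | while read -r f; do
            unzip -tqq "$f" > /dev/null 2>&1 || echo "BROKEN: $f"
        done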

    Read the article

  • Will USMT 4.0 in MDT 2010 Move/Migrate the .NK2 File for Outlook?

    - by Mitch
    We're about to begin a refresh project for about 100 XP Pro laptops and have a concern with regard to the .NK2 file, which holds Outlook's cached email addresses. If possible we'd like to have USMT move/migrate this, but I can't find anything that confirms that this happens automatically or has been done before. I see lots of manual processes, but at this point I'm not sure we can use those. Has anyone done this or seen it done? Perhaps you can point me to a resource that gives an idea of how it's done? Any information would be appreciated. USMT seems to get a lot of the details, but missing this part seems odd. Thanks in advance for any responses.
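
    For what it's worth, USMT 4.0 accepts custom migration XML files passed to scanstate/loadstate with /i:, so one approach would be a small rule that includes the .NK2 files from the Outlook profile folder. A minimal sketch (the urlid and the exact path are assumptions to verify against your environment):

        <?xml version="1.0" encoding="UTF-8"?>
        <migration urlid="http://www.example.com/migration/outlook-nk2">
          <component type="Documents" context="User">
            <displayName>Outlook autocomplete cache (.NK2)</displayName>
            <role role="Data">
              <rules>
                <include>
                  <objectSet>
                    <!-- %CSIDL_APPDATA% expands to the user's roaming Application Data folder -->
                    <pattern type="File">%CSIDL_APPDATA%\Microsoft\Outlook\* [*.NK2]</pattern>
                  </objectSet>
                </include>
              </rules>
            </role>
          </component>
        </migration>

    It would then be referenced alongside the default files, e.g. scanstate ... /i:MigUser.xml /i:outlook-nk2.xml, with the same /i: switches on loadstate.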

    Read the article

  • Reformatting and version control

    - by l0b0
    Code formatting matters. Even indentation matters. And consistency is more important than minor improvements. But projects usually don't have a clear, complete, verifiable and enforced style guide from day 1, and major improvements may arrive any day. Maybe you find that

        SELECT id, name, address FROM persons JOIN addresses ON persons.id = addresses.person_id;

    could be better written as

        SELECT persons.id, persons.name, addresses.address FROM persons JOIN addresses ON persons.id = addresses.person_id;

    while working on adding more columns to the query. Maybe this is the most complex of all four queries in your code, or a trivial query among thousands. No matter how difficult the transition, you decide it's worth it. But how do you track code changes across major formatting changes? You could just give up and say "this is the point where we start again", or you could reformat all queries in the entire repository history. If you're using a distributed version control system like Git you can revert to the first commit ever and reformat your way from there to the current state. But it's a lot of work, and everyone else would have to pause work (or be prepared for the mother of all merges) while it's going on. Is there a better way to change history which gives the best of both: the same style in all commits, and minimal merge work? To clarify, this is not about best practices when starting the project, but rather what should be done when a large refactoring has been deemed a Good Thing™ but you still want a traceable history. Never rewriting history is great if it's the only way to ensure that your versions always work the same, but what about the developer benefits of a clean rewrite? Especially if you have ways (tests, syntax definitions or an identical binary after compilation) to ensure that the rewritten version works exactly the same way as the original.
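
    If rewriting history is acceptable in your situation, Git can apply a reformatter to every commit mechanically; a sketch using git filter-branch, where /usr/local/bin/reformat-sql is a hypothetical script that rewrites a .sql file in place:

        # WARNING: this rewrites every commit; collaborators must re-clone (or hard-reset) afterwards.
        git filter-branch --tree-filter \
            'find . -name "*.sql" -exec /usr/local/bin/reformat-sql {} \;' -- --all

    Whether the rewritten history still behaves identically can then be checked with the tests or binary comparison mentioned above.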

    Read the article

  • 12.10 install overwrote my Windows partition

    - by Niall C
    I recently decided to switch back to Ubuntu. I have a 3TB drive which was running Win7, with 3 partitions: C: for Windows, D: data, E: data. I have installed Ubuntu before, so I 'thought' I knew what I was doing. Using UNetbootin I installed from a USB stick. I didn't choose the default options, but I didn't choose the 'manual install' either. I can't remember which option I took, but I figured that at some stage it would tell me how it was going to partition the disk, and at that point I would see whether it had recognised the NTFS partitions and could abort if it hadn't. Unfortunately it didn't; it just went ahead, installed Ubuntu and made up its own mind about how to partition the disk. Usual story: the two NTFS data partitions weren't backed up. Is there anything I can do to retrieve the NTFS data? I'm currently trying out TestDisk, and I know I can use PhotoRec to retrieve certain file types, but all the filenames will be lost and it's going to take a hell of a lot of time to rename them all. Any help or assistance would be more than gratefully accepted. Thanks in advance, Niall
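
    Since the old partition boundaries may still be findable, TestDisk's partition search is usually worth trying before falling back to PhotoRec; a rough sketch of the invocation (the device name is an assumption):

        # Run TestDisk against the whole disk (replace /dev/sda with your 3TB drive)
        sudo testdisk /dev/sda
        # In the menus: Create a log -> select the disk -> partition table type (Intel or EFI GPT)
        # -> Analyse -> Quick Search (then Deeper Search if needed) -> if the old NTFS
        #    partitions show up, select them and Write the recovered partition table.

    This only helps if the installer merely rewrote the partition table rather than the data itself; anything the new install physically overwrote is gone.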

    Read the article

  • Skip CodedUI Tests, use Selenium for Web Automation

    - by Aligned
    Originally posted on: http://geekswithblogs.net/Aligned/archive/2013/10/31/skip-codedui-tests-use-selenium-for-web-automation.aspx

    I recently joined a team that was using Agile methodologies to create a new product. They had a working beta product after 10 or so two-week sprints, and the UI had already changed several times as they iterated on it. As a result, the QA team was falling behind with automated tests, and I was tasked to help them catch up and expand their tests. The project is a website. I heard many complaints about how hard it is to work with CodedUI (writing our own code, not relying on the recorder, as we wanted re-usable and more maintainable code), and then it took me 4+ hours to fix one issue. It was hard to traverse the control tree, and debugging the objects with breakpoints was painful… I said out loud "there has to be a better way, or a framework that uses jQuery to run through the tests." Plus it seemed really slow (wait… finding the object… wait… start putting in text…). Plus some tests would randomly fail on the test agents (using the test settings and an automated build, they are run on VMs using Microsoft test agents).

    Enough complaining. Selenium to the rescue (mostly). The lead QA guy decided to try it out and we haven't turned back. We are now running tests in Chrome and Firefox and they run a lot faster. We had IE running too, but some of those tests ran fine locally yet hung on the test agents. I'll add some hints and lessons learned in a later post.
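
    For comparison, a minimal Selenium WebDriver test in C# looks roughly like this (the URL and element IDs are placeholders, not from the original post):

        using System;
        using OpenQA.Selenium;
        using OpenQA.Selenium.Chrome;

        class LoginSmokeTest
        {
            static void Main()
            {
                IWebDriver driver = new ChromeDriver();
                try
                {
                    driver.Navigate().GoToUrl("http://localhost/login");      // placeholder URL
                    driver.FindElement(By.Id("username")).SendKeys("qa-user"); // placeholder IDs
                    driver.FindElement(By.Id("password")).SendKeys("secret");
                    driver.FindElement(By.Id("login-button")).Click();

                    // Crude success check; real tests would use an explicit wait and asserts.
                    if (!driver.Title.Contains("Dashboard"))
                        throw new Exception("Login did not navigate to the dashboard.");
                }
                finally
                {
                    driver.Quit();
                }
            }
        }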

    Read the article

  • Domain migration - 301 redirect of all contents of a directory

    - by Trufa
    Hi, I would like to know if the following is possible, considering that I would like to migrate domains. I have, let's say:

        one.com/files/one.html
        one.com/files/two.php
        one.com/other/three.html
        one.com/other/four.doc
        one.com/other/subdirectory/five.doc

    I am migrating to two.com, so I would like to make the RESPECTIVE 301 redirects to the following:

        two.com/old/files/one.html
        two.com/old/files/two.php
        two.com/old/other/three.html
        two.com/old/other/four.doc
        two.com/old/other/subdirectory/five.doc

    I've tried with cPanel, and although I come "close" with the redirects option, I can't seem to make it happen. The folders are not many (10-12), but the files are a lot, so doing it manually is obviously impossible. How would you proceed? Can/should this be done with a regex in .htaccess? Can you redirect all the contents of a directory in the manner shown above? I hope the question is clear enough; if not, please ask for any clarification needed! Thanks in advance!!
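
    Assuming Apache with mod_rewrite enabled, a single rule in the .htaccess of one.com can map every path into the /old/ prefix on the new domain; a sketch:

        # .htaccess on one.com (requires mod_rewrite)
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?one\.com$ [NC]
        RewriteRule ^(.*)$ http://two.com/old/$1 [R=301,L]

    If no pattern matching is needed, the same effect can come from a single mod_alias line, Redirect 301 / http://two.com/old/, since Redirect prefix-matches and appends the rest of the path.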

    Read the article

  • Is the cooling fan on a 2004 graphics card really necessary?

    - by Andrew
    I have an old GeForce 6600 card in my computer, circa 2004. Recently the fan has started playing up and making loud, irritating noises. I've tried oiling it with no luck. This is the second fan I've put on the card; the stock one broke ages ago. Is a card this old really likely to need a cooling fan, or can I remove it altogether? It has a decent heatsink on the chip, but there's not a lot of airflow in that part of the box. Edit: I should add that I seem to remember most mid-range graphics cards at the time I bought this one didn't have fans (pretty sure they had heatsinks only), which is why I'm wondering.

    Read the article

  • Job queueing in Toast Titanium 10?

    - by moonslug
    I have a bunch of .MP4 video files I'm burning to DVD-Video using Toast Titanium 10 on my MacBook Pro. Right now, I'm doing them one at a time. Because my computer is several years old, encoding video for a single DVD takes approximately six hours. I've discovered that I can apparently encode the video directly to a .toast format; however, I have yet to figure out whether I can burn these directly to DVD. Also, I have quite a bit of video left to burn, and even that method would require me to intervene manually to start a new encoding or burn job every six hours. Would it be possible to somehow queue up multiple DVD-Video encoding jobs at once and have the computer work through them automatically? The actual writing to DVD disc doesn't take nearly as long, and if all my video were already encoded the job would go a lot quicker. Maybe this can be accomplished with a different piece of software?

    Read the article

  • Ubuntu from console/command-line/shell

    - by Xolve
    Early Linux distros, though they required a lot of manual work, were quite good to use from the command line. If the X server didn't start, or you just wanted a shell to work in, they all supported it. The network was configured by init; sound was up and ready; newly inserted devices would be configured and their configuration placed in fstab. There were also small scripts I found on many distros which used windows under X but switched to ncurses on the console. But now all of this needs a GUI with a desktop environment (KDE, GNOME), because the new paradigms (NetworkManager, HAL, etc.) require a GUI :'-( So if you are on just the command line you have to be root (it looks like they believe only geeky admins need that) and edit config files or type long commands. Is there any way to make this easy in Ubuntu from the shell again?
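
    As one concrete example of the older, console-only approach being described, a static network configuration in /etc/network/interfaces (addresses below are placeholders) still works without NetworkManager:

        # /etc/network/interfaces - bring the interface up with "sudo ifup eth0"
        auto eth0
        iface eth0 inet static
            address 192.168.1.10
            netmask 255.255.255.0
            gateway 192.168.1.1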

    Read the article

  • Ubuntu 12.04 won't shut down - stopping winbind daemon

    - by jan
    My Precise Pangolin sometimes won't shut down: the screen goes black with text on it. Usually the last line says something like "stopping winbind daemon" (sometimes it is VirtualBox, which is listed above the winbind daemon; edit: sometimes the last line says "running unattended updates"), and it stays like this for about ten minutes. Then I usually hold the power button for 5 s to shut it down. It's very unpredictable: sometimes the computer shuts down without problems and sometimes it hangs. I've tried many ways to shut it down: the hardware button, the panel applet, sudo shutdown -h now, sudo poweroff, sudo halt, etc.; even sudo reboot or restarting from the panel applet has this problem. Sometimes it works OK, but every method named has hung at least once on the same (damned) line. My specs: FUJITSU SIEMENS LIFEBOOK E8310, Intel Core 2 Duo T7300 @ 2.00 GHz, 3 GB RAM, GPU: Mobile Intel 965 Express Chipset Family, Ubuntu 12.04.2 32-bit, 3.5.0-41-generic kernel (but it did this on older kernels and 12.04.x releases too). Any ideas what I should try next? Thanks a lot! Jan
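
    One way to narrow it down (a test, not a fix; the service names are the stock ones on 12.04) is to stop the suspect services by hand before shutting down and see whether the hang disappears:

        # Stop the services the shutdown screen hangs on, then power off
        sudo service winbind stop
        sudo service vboxdrv stop      # only if VirtualBox is installed
        sudo shutdown -h now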

    Read the article

  • Brightness going up to 100% on loading certain websites in Chrome

    - by picheto
    I'm using Google Chrome version 21.0.1180.89 on Ubuntu 12.04 and my laptop is a Sony VAIO VPCCW15FL (spec sheet). My video driver is the proprietary "NVIDIA accelerated graphics driver (post-release updates) (version-current updates)". After installing Ubuntu, I discovered that neither the hardware brightness buttons nor the software brightness slider worked, and found out I could get the hardware buttons to work by installing the nvidiabl.deb package and the oBacklight script. I'm using nvidiabl-dkms 0.77 and oBacklight 0.3.8. The slider in the Ubuntu "Settings" still does not work, but I don't care. There is an annoying thing happening when loading certain pages in Google Chrome: the brightness goes up to 100% when loading the page or when leaving it (closing the tab or typing a different URL in the omnibox). However, the "brightness tooltip" (that default brightness notification) remembers the position it was set to, so if I adjust the brightness with the hardware buttons, the level gets adjusted relative to the value it was set to before "going 100%". I disabled the Flash PPAPI plugin but left the NPAPI plugin enabled, and the problem went away for pages with Flash content. Still, the same thing happens when viewing HTML5 video, or when loading, for example, the Chrome Web Store or using the Scratchpad extension. I suppose it has to do with the rendering of certain elements using the GPU, but this is just a guess. This brightness thing does not happen when using Firefox 15.0 or any other application I have used yet. Does anybody know why this may be happening and what I could do to fix it without changing browsers? Thanks a lot.

    Read the article

  • What's a good secure Windows FTP server?

    - by Keith Nicholas
    What's a good FTP server? I have been running FileZilla, which seems OK-ish, but I've noticed that a lot of people try to hack FTP servers and FileZilla only has very basic controls to prevent people from hacking. (So far no one has actually managed to get in... so that's good!) I was wondering if there were better options out there? I'm especially interested in recommendations from people who know they get targeted by hackers.

    Read the article

  • How to stop domain users from installing any software?

    - by Chris
    Hi everyone, I was wondering which policies, etc. I could set up to stop any installations from occurring in a Server 2003 domain environment. I have Server 2003 R2 and XP Pro clients. I guess the quick easy way is to make everyone guests, but this also blocks them from other things that they might need to do or access. I've seen a lot of ideas, but they do not fully block everything. I know there probably isn't a fix-all, but I would like to get as close as possible. Thank you all,

    Read the article

  • How can I get SLI working with 295.40?

    - by Steve
    I've been doing a lot of googling these last few hours and I'm not having much luck; perhaps I don't know exactly what I am looking for. I just recently installed Ubuntu 12.04 LTS x86_64. Looks beautiful! I have two GTX 470s in SLI, and I am finally migrating my desktop over, given the hopeful gaming support as of late. My laptop has been enjoying multiple distros of Ubuntu for a couple of years now. However, new problems come with unexplored territory. At first, only one of my two monitors worked. With nvidia-xconfig I fixed that, but the only solution that actually worked was TwinView. Just recently I read here that TwinView is not compatible with SLI. Sweet. When I try to tell it "hey, use a separate X screen", configure it the way I want, click "Save to X Configuration File", enter my password and then sudo restart lightdm, it's broken: one screen blacks or whites out (couldn't tell you the specific conditions for each, I'm dubious at this point), and I get a huge error dialogue box upon login, something about incompatible resolutions if I remember right, though I am sure I set the resolutions for each screen correctly. Anyway, when I try to enable SLI (sudo nvidia-xconfig --sli=On), despite the fact that it hates TwinView, Unity breaks: the sidebar is there, but only one screen works, the mouse is trapped running along the left edge of it, and the background of the sidebar is solid blue. Anyway, this ended up being entirely too verbose, I'm sorry, but could anyone impart some wisdom please? It would be appreciated!
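
    For reference, the SLI setting ends up as a single option in the Screen section of /etc/X11/xorg.conf; a stripped-down sketch of a one-X-screen, SLI-enabled layout (the identifiers are whatever nvidia-xconfig generated on your system):

        Section "Screen"
            Identifier "Screen0"
            Device     "Device0"
            Monitor    "Monitor0"
            Option     "SLI" "On"
            # No TwinView options here: SLI and TwinView are mutually exclusive.
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection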

    Read the article

  • What's the canonical process for backing up a website?

    - by Walkerneo
    This is going to sound terrible, but bear with me. I currently have a cron job that does a mysql dump, a git add-all and commit, and a git push to Bitbucket. I set this up almost a year ago, when I didn't know much about git, backups, and general web development and administration. I haven't had the time to fix this and do it properly, but the repo has now grown quite big from accumulating large temporary files from my forum, so now I have to do something and I want to do it properly this time around. What processes do semi-large websites and personal site admins use for backing up server content? Based on what I've learned since I set this up, what I'm currently thinking of doing is:

    - Making changes on a development domain and committing the code frequently
    - Archiving the entire site after a successful deployment from the development domain
    - Having automatic daily database and user-content backups

    I still like the idea of backing up SQL dumps with git, though. I know git isn't a backup tool and that this is beyond its purpose, but the textual queries that are exported would be easily managed by git and would save a lot of space in archives.
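
    A minimal sketch of the third item, the nightly database and content backup, as a plain cron-driven shell script (the paths, database name and retention period are placeholders):

        #!/bin/sh
        # /usr/local/bin/site-backup.sh - run nightly from cron, e.g. "30 3 * * * /usr/local/bin/site-backup.sh"
        STAMP=$(date +%Y%m%d)
        DEST=/var/backups/site
        mkdir -p "$DEST"

        # Consistent dump of the forum database, compressed
        mysqldump --single-transaction -u backup_user -p'secret' forum_db \
            | gzip > "$DEST/forum_db-$STAMP.sql.gz"

        # Archive user-uploaded content (code itself lives in git, so it is not archived here)
        tar -czf "$DEST/uploads-$STAMP.tar.gz" /var/www/site/uploads

        # Keep 30 days of backups
        find "$DEST" -name '*.gz' -mtime +30 -delete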

    Read the article

  • Best way/format to convert a PDF to be able to read on Android?

    - by stirredo
    I have a PDF ebook, a technical book that contains a lot of code. Reading it as a PDF is slow, plus reflowing the text in the PDF reader removes all the formatting. I tried converting it to EPUB using calibre, but that too removes all the formatting. The only pleasant experience I have had reading a technical book on my phone is as CHM, but I can't find anything to convert PDF to CHM (I found a couple of programs, but they didn't do a good job). Has anyone had success reading a technical PDF ebook on an Android phone?
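
    For the record, calibre's conversion can also be driven from the command line, which makes it easy to experiment with options; a sketch (file names are placeholders):

        # Convert with calibre's CLI; try heuristic processing if the plain conversion mangles the code blocks
        ebook-convert book.pdf book.epub
        ebook-convert book.pdf book.epub --enable-heuristics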

    Read the article

  • xubuntu 12.04 screen regularly stops refreshing, refreshing resumes after un-/re-maximizing a window

    - by user68477
    My screen frequently stops refreshing completely. I can make it resume by un-maximizing and re-maximizing a window, or by switching workspaces (the un-/re-maximizing works every time; switching workspaces sometimes has to be done a couple of times). The immediate impression is that the system is frozen: there is apparently no reaction to anything I do, but interestingly the window title bar will still change if I switch applications (e.g. Alt+Tab) or browse through folders. I saw an identical issue in Ubuntu 10.04, though a lot less frequently, and I never saw it in Ubuntu 12.04 (which I had been using for the last 4-5 months). After switching to Xubuntu I'm seeing it again, and more frequently. The reason I'm not sure this is a bug: I installed gnome-control-center, which dragged in tons of packages, while trying to fix a dual-screen setup, and I believe the issue surfaced after this. I later meticulously removed (purged) every package from that batch in the hope that every setting would also be removed, but the issue has persisted. Another issue appeared at the same time; it may be totally unrelated, but it feels like the same basic problem: the screen resolution of the greeter became less than the expected 1680x1050, and often after login there is just a blank, totally unresponsive wallpaper without a panel, so I have to force a reboot. When the login is successful it's very clear that it works hard to determine the correct resolution, which is reached after a few blinks to a black screen. My questions:

    1) Is this a settings issue or a bug?
    2) How do I begin to research the issue? Could I perhaps somehow reset Xubuntu/Xfce to defaults?
    3) If this is a bug, where would be the most appropriate place to report it?

    System: ThinkPad T500, ATI Radeon HD 3650

        $ fglrxinfo
        display: :0.0  screen: 0
        OpenGL vendor string: Advanced Micro Devices, Inc.
        OpenGL renderer string: ATI Mobility Radeon HD 3650
        OpenGL version string: 3.3.11627 Compatibility Profile Context

        $ uname -a
        Linux srvname 3.2.0-32-generic #51-Ubuntu SMP Wed Sep 26 21:33:09 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux

    Xfce 4.10, Compiz 0.9.7.8
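
    On question 2), one low-risk way to rule out stale per-user settings is to move the Xfce configuration and session cache aside and log in again (a sketch; it resets panels, shortcuts, etc. to defaults, and the .bak copies can be restored if it makes no difference):

        # Log out first, then from a console (Ctrl+Alt+F1):
        mv ~/.config/xfce4    ~/.config/xfce4.bak
        mv ~/.cache/sessions  ~/.cache/sessions.bak
        # Log back in; Xfce recreates both directories with default settings.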

    Read the article

  • RESTful application logic and cross-resource operations

    - by Gaz_Edge
    I have a RESTful API that allows my users to receive enquiries about their business, e.g. 'I would like to book service x on date y. Is this available?'. The API saves this information as a resource at the following URI:

        users/{userId}/enquiries/{enquiryId}

    The information shown when this resource is retrieved is the standard sort of thing you'd expect from an enquiry: email, first_name, last_name, address, message. The API also allows customers to be created for a user. The customer has a login and password and also a profile. The following URIs expose these two resources:

        PUT users/{userId}/customers/{customerId}
        PUT users/{userId}/customers/{customerId}/profile

    The problem I am having is that I would like to allow users to create a customer from an enquiry. For example, the user is able to offer their service on the date requested and will then want to set up a customer with login details etc. to allow them to manage the rest of the process. The obvious answer would be to use a URI like users/{userId}/enquiries/{enquiryId}/convert-to-client. The problem with this is that it somewhat goes against a lot of what I've been reading about how to implement REST (specifically the book RESTful Web Services, which suggests that URIs should point to resources, not to operations on resources). The other option would be to get the client application (i.e. the code that calls the API) to handle some of this application logic. This doesn't quite feel right to me: in my design the client app is fairly dumb; it knows just enough to display the results from the API and does not contain any application logic. It would be great to hear what others' views are on the best way of setting this up. Am I wrong to have no application logic in the client app? How would I perform this operation purely in the REST API?
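
    One resource-oriented way to model this (a sketch, not from the original post; the IDs and field names are made up) is to treat the conversion as creating a new customer resource whose representation references the enquiry it came from, so the URI still names a resource while the server performs the conversion logic:

        POST /users/42/customers HTTP/1.1
        Content-Type: application/json

        {
          "source_enquiry": "/users/42/enquiries/17",
          "login": "new-customer",
          "password": "changeme"
        }

    The server would respond with 201 Created and a Location header for the new customer, having copied the name and address across from the referenced enquiry.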

    Read the article

  • Blogger Blog Takes Ages to Load after Custom Domain Redirection

    - by abhisek
    I recently bought a custom domain for a Blogger blog (technabled.com) that I have had for some time now. I followed the instructions in Blogger's documentation and added the A records and CNAME record with my DNS provider. But now some strange problems are cropping up. If I connect to my broadband network and then ping technabled.com, it times out. Then, if I visit the web page (which takes almost one and a half minutes to load) and ping technabled.com afterwards, it shows the expected result. This is not just me: I asked some of the regular readers, who reported the same issue. As a result of this, I am losing a lot of visits. What is stranger is that subsequent visits to the blog are faster. I have checked with a few online services to test the performance. WebPageTest seems to say the same thing: http://www.webpagetest.org/result/110117_1N_7PE/ (please see the First View / Repeat View times). Also, the PageSpeed score is not that bad, so I am ruling out other possibilities. I am at a loss as to what I should do to find a solution. Help is much appreciated. :)
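
    Since the first visit failing and later ones working points at DNS resolution rather than the blog itself, one quick check is to query the records directly and compare what your ISP's resolver returns with an independent one (a sketch; 8.8.8.8 is just one public resolver):

        # What the ISP's default resolver returns
        dig technabled.com A
        dig www.technabled.com CNAME

        # The same queries against a public resolver, for comparison
        dig @8.8.8.8 technabled.com A
        dig @8.8.8.8 www.technabled.com CNAME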

    Read the article

  • Using Shell32 to extract Mimes on Windows Server 2008

    - by Léon Pelletier
    In a desktop app I'm using Shell32 to extract MIME info and ID3 tags. I want to do the same in ASP.NET on the server side with HTTP-posted files, plus fetch some other information, but since it is a Windows Server 2008 machine there are not many media applications installed, so I wonder whether it will still retrieve much information from the files. Will it be possible to do this from the server side? If yes, will it fetch the metadata without media apps installed? If not, is there some MIME/codec pack that provides this file information without the full application being installed? [EDIT] I installed VLC player and Windows Media Player on the server, which provide all the MP3/movie info (duration, artist, album, width, height, etc.), but I don't know if this is good practice.
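
    For context, the desktop-style approach being described usually goes through the Shell32 COM automation interface; a rough C# sketch (the folder path and file name are placeholders, and the index-to-property mapping varies between Windows versions and installed property handlers):

        // Requires a COM reference to "Microsoft Shell Controls and Automation" (Shell32)
        using Shell32;

        class MediaInfoProbe
        {
            static void Main()
            {
                var shell = new Shell();
                Folder folder = shell.NameSpace(@"C:\uploads");     // placeholder path
                FolderItem item = folder.ParseName("example.mp3");  // placeholder file

                // GetDetailsOf(null, i) returns the column name, GetDetailsOf(item, i) the value;
                // which columns exist depends on the OS and on installed property handlers.
                for (int i = 0; i < 40; i++)
                {
                    string header = folder.GetDetailsOf(null, i);
                    string value = folder.GetDetailsOf(item, i);
                    if (!string.IsNullOrEmpty(value))
                        System.Console.WriteLine($"{i,2} {header}: {value}");
                }
            }
        }

    Note that Shell automation objects are really designed for interactive sessions, so behaviour under an IIS application-pool account is worth testing carefully before relying on it.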

    Read the article

  • Does it make sense to write build scripts in C++?

    - by Klaim
    I'm using CMake to generate my projects' IDE files/makefiles, but I still need to call custom "scripts" to manipulate my compiled files or even generate code. In previous projects I've been using Python and it was OK, but now I'm having serious trouble managing a lot of dependencies in two very big projects I'm working on, so I want to minimize the dependencies everywhere. Someone suggested that I use C++ to write my build scripts instead of adding a language dependency just for that. The projects themselves already use C++, so there are several advantages that I can see:

    - to build the whole project, only a C++ compiler and CMake would be necessary, nothing else (all the other dependencies are C or C++);
    - C++ type safety (when using modern C++) makes everything easier to get "correct";
    - it's also the language I know best, so I'm more at ease with it, even if I'm able to write some good Python code;
    - potential gain in execution speed (but I don't think it will really be perceptible).

    However, I think there might be some drawbacks, and I'm not sure of the real impact as I haven't tried yet:

    - it might take longer to write the code (that said, I'm not sure, because I'm efficient enough in C++ to write something that works quickly, so maybe for this system it wouldn't take so long to write; compilation time shouldn't be a problem in this case);
    - I must assume that all the text files I'll read as input are in UTF-8; I'm not sure that can be easily checked at runtime in C++, and the language will not check it for you;
    - libraries in C++ are harder to manage than in scripting languages;
    - I lack experience and foresight, so maybe I'm missing advantages and drawbacks.

    So the question is: does it make sense to use C++ for this? Do you have experiences to report, and do you see advantages and disadvantages that might be important?
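
    For what it's worth, CMake itself can drive a helper tool built from the same project, which keeps the dependency set at "a C++ compiler and CMake"; a sketch of the CMakeLists.txt wiring (the target and file names are made up):

        # Build the code generator from the project's own sources
        add_executable(codegen tools/codegen.cpp)

        # Run it as a build step to produce a generated source file
        add_custom_command(
            OUTPUT  ${CMAKE_CURRENT_BINARY_DIR}/tables.cpp
            COMMAND codegen ${CMAKE_CURRENT_SOURCE_DIR}/data/tables.txt
                            ${CMAKE_CURRENT_BINARY_DIR}/tables.cpp
            DEPENDS codegen ${CMAKE_CURRENT_SOURCE_DIR}/data/tables.txt
            COMMENT "Generating tables.cpp")

        # Compile the generated file into the real target
        add_executable(app src/main.cpp ${CMAKE_CURRENT_BINARY_DIR}/tables.cpp)

    Because CMake substitutes the location of the codegen target in COMMAND, the generator is rebuilt and re-run automatically whenever its sources or inputs change.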

    Read the article

  • Failing RAM, or something else?

    - by Thanatos
    I have an IBM ThinkPad T43, currently running Windows XP. Programs were crashing and XP was blue-screening (more than usual); it was basically unusable, but I couldn't get any informative error out of XP. I booted Ubuntu off a thumb drive, which made it to the desktop, but as soon as I started to try to do anything, X segfaulted, along with several other services, followed quickly by kernel warnings and a kernel panic. I'm currently running Memtest86+ on this machine, which is spitting out numerous errors (16k over 3 passes, and counting). The failing addresses are numerous and look something like this: 0001055da4 - XX.X MB, etc. The addresses that fail seem to cluster around 0-20 MB, 250 MB, and, more rarely, 750 MB, 1000 MB, and 1200 MB. However, a lot (but not all) of the failing addresses that I've seen end in XXXXXXX?da4, where the ? is a 1 or a 5. The machine has two sticks of RAM, one 512 MB and one 1024 MB, the 512 MB stick mapped to the lower addresses and the 1024 MB stick following. Is this indeed RAM failure, or should I consider other things before purchasing more RAM?

    Read the article

  • Advanced command line editing for Windows?

    - by Ben Collins
    I'm a developer who was "born and bred" on Linux and BSD systems, and I've become accustomed to having advanced tools for the console (POSIX shells like bash, for example). My career has taken a twist that means I'm working in a Windows environment most of the time, and the console capabilities are really poor by comparison. The traditional Windows console environment is a complete joke, and even most of the third-party attempts at improving things aren't a lot better. PowerShell is a huge step in the right direction, but the console applications themselves are still way behind where Unix has been for 20 years. Does anyone know of a PowerShell console application that supports advanced command-line editing like POSIX shells do? I'm particularly interested in emacs-mode editing, and I'd also like to be able to resize my window to an arbitrary size, unlike the native console app that comes with Windows.
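
    As one data point, newer PowerShell setups can get emacs-style line editing from the PSReadLine module (an assumption that the module is available or installable on the machine in question); for example, in the profile:

        # Enable emacs key bindings (Ctrl+A, Ctrl+E, Ctrl+K, etc.) at the PowerShell prompt
        Import-Module PSReadLine
        Set-PSReadLineOption -EditMode Emacs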

    Read the article

  • How to make an iOS plugin for Unity3D

    - by DannoEterno
    I've spent the last 2 days reading articles and books to understand how to make an iOS plugin for Unity. Basically I just need a demo to understand how it works. So far I've tried the following (with really poor luck). I started a new project in Unity and wrote a simple script:

        using UnityEngine;
        using System.Collections;
        using System;
        using System.Runtime.InteropServices;

        public class CallPlugin : MonoBehaviour
        {
            [DllImport ("__Internal")]
            private static extern int test();

            void Start ()
            {
                Debug.Log(test());
            }
        }

    Then I created a project in Xcode with this simple source file:

        extern "C" {
            int test()
            {
                int che = 5;
                return che;
            }
        }

    Then I tried:

    - putting the .mm and .h in Assets/Plugins/iOS = nothing
    - building the Unity project and then adding the .h and .mm to the Xcode project = nothing

    In Unity I always get an EntryPointNotFoundException, so Unity sees the file but is unable to reach the method. The problem is... how?! :) Maybe I'm missing something or have done something wrong? Thanks a lot for any help you can give me :)
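
    One detail worth noting (an observation, not from the original post): [DllImport("__Internal")] is only resolved when the game actually runs in an iOS device build, so pressing Play in the Unity editor will always throw EntryPointNotFoundException because nothing is statically linked there. A common guard, replacing the two declaration lines inside CallPlugin, looks roughly like this:

        #if UNITY_IPHONE && !UNITY_EDITOR
            [DllImport("__Internal")]
            private static extern int test();
        #else
            // Editor / non-iOS fallback so Play mode does not throw (the stub value is arbitrary)
            private static int test() { return 0; }
        #endif

    (On newer Unity versions the define is UNITY_IOS rather than UNITY_IPHONE.) With the .mm in Assets/Plugins/iOS, the call should then resolve when the generated Xcode project is built and run on a device.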

    Read the article

  • How to configure Nginx to serve a variety of back-ends via multiple FCGI processes?

    - by Ben Horton
    I've seen a lot of tutorials showing how to set up PHP/Python/Perl/RoR on nginx via various FCGI processes. None of the tutorials I found show how to serve multiple FCGI services from one server. How would one configure the stable nginx (nginx-0.7.64) to serve multiple FCGI processes (one for each of the above languages)? Example addresses for each FCGI process are as follows:

        127.0.0.1:8080 - PHP
        127.0.0.1:8081 - Python
        127.0.0.1:8082 - Perl
        127.0.0.1:8083 - Ruby on Rails

    An example configuration file that shows how to serve multiple FCGI back-ends from one server is really what I need. Perhaps others will benefit as well.
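
    A bare-bones sketch of the idea: route by file extension or by path prefix inside one server block, with a separate fastcgi_pass per back-end (the paths, server_name and routing scheme are assumptions; adjust to taste):

        server {
            listen       80;
            server_name  example.com;
            root         /var/www/example;

            # PHP by file extension
            location ~ \.php$ {
                include        fastcgi_params;
                fastcgi_param  SCRIPT_FILENAME $document_root$fastcgi_script_name;
                fastcgi_pass   127.0.0.1:8080;
            }

            # The other back-ends by path prefix
            location /python/ {
                include        fastcgi_params;
                fastcgi_pass   127.0.0.1:8081;
            }
            location /perl/ {
                include        fastcgi_params;
                fastcgi_pass   127.0.0.1:8082;
            }
            location /rails/ {
                include        fastcgi_params;
                fastcgi_pass   127.0.0.1:8083;
            }
        }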

    Read the article
