Search Results

Search found 3004 results on 121 pages for 'safety critical'.

Page 64 of 121

  • 64-bit Windows 7 + 32-bit Windows XP dual boot?

    - by Mick
    I have purchased an i7-based PC pre-installed with 64-bit Windows 7 (Home Premium). Unfortunately, some third-party 32-bit software that I need to use is not working properly (see stackoverflow.com for details). I am now torn between installing 32-bit Windows XP outright or making the machine dual boot. Which option do you think will give me the least problems? And if the answer is dual boot, can you point me to a good guide for how to do it, preferably a guide specifically for my two OSes installed in this order (i.e. 7 x64 first)? EDIT: the performance of my 32-bit programs is critical, so I am concerned about any kind of 32-bit XP "emulation".

    Read the article

  • Help with corrupt version of IE8 on WinXP SP3

    - by Anon
    I've upgraded from IE6 to 7 to 8 and back down and back up, but I still have critical issues in IE, such as:
      * cannot see any version info in "About Internet Explorer"
      * cannot run Windows Update
      * cannot load SharePoint pages (and other pages using ActiveX or IE-specific DHTML)
    I've also re-installed SP3, but still no luck. Also, I've changed the security settings to be most permissive. The next step is blowing it all away and starting over with Windows 7. Short of that, any suggestions are welcome. Thanks in advance.

    Read the article

  • Automatically make or update a copy in real time on another volume whenever files are saved to a particular folder

    - by mrblint
    Whenever I save or update a file in a designated folder on my C: drive, I would like to make or update a copy on my network-attached storage device, ideally saving the copy to the NAS as a version rather than overwriting the copy there, if possible. I have Windows 7 x64 Ultimate. Is there any built-in feature that can accomplish this? It has to be a real copy, not merely a pointer. I'm trying to achieve some redundancy for especially critical documents (in a variety of formats) that change frequently throughout the day. P.S. I am looking for folder-level granularity; I wouldn't want this to happen for every file on the C: volume.
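
    If nothing built-in fits, a small watcher script can approximate this. Below is a minimal sketch in Python, assuming the third-party watchdog package is installed; the source folder and NAS path (C:\CriticalDocs and Z:\Backups\CriticalDocs here) are hypothetical placeholders. Each save is copied to the NAS under a timestamped name, so earlier versions are kept rather than overwritten.

        # versioned_mirror.py - copy files from a watched folder to a NAS path,
        # keeping a timestamped copy instead of overwriting (sketch, not a
        # built-in Windows feature).
        import shutil
        import time
        from pathlib import Path

        from watchdog.observers import Observer
        from watchdog.events import FileSystemEventHandler

        WATCHED = Path(r"C:\CriticalDocs")       # hypothetical source folder
        NAS = Path(r"Z:\Backups\CriticalDocs")   # hypothetical mapped NAS share

        class VersionedCopy(FileSystemEventHandler):
            def on_modified(self, event):
                # A single save may fire several events; fine for a sketch.
                if event.is_directory:
                    return
                src = Path(event.src_path)
                stamp = time.strftime("%Y%m%d-%H%M%S")
                dest = NAS / f"{src.stem}.{stamp}{src.suffix}"
                NAS.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dest)   # real copy, not a link or pointer

        if __name__ == "__main__":
            observer = Observer()
            observer.schedule(VersionedCopy(), str(WATCHED), recursive=True)
            observer.start()
            try:
                while True:
                    time.sleep(1)
            except KeyboardInterrupt:
                observer.stop()
            observer.join()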

    Read the article

  • Map a path from a server in a workgroup to another server on a domain

    - by bzamfir
    I have the following situation, and I need some help to set it up properly. I have two VPSes (hosted with the same provider, maximumasp). Server A is 2008 R2, set up with WORKGROUP, and server B is 2012, joined to the domain maximumasp.local. On server A I have an old web app which uses a special folder, c:\MyUploads, to store uploaded files. The app gets this path from an appSetting. It will be kept running for a while for safety / compatibility reasons. I installed a new version of the application on server B, running under AppPoolIdentity. Both instances of the app (A and B) will connect to the same database, so they also need to share access to the upload folder c:\MyUploads. How should I set up the app on machine B to access the folder c:\MyUploads? My idea is to share the folder as \\A\MyUploads and then map it on server B. But the problem is, I don't know how to give read/write on c:\MyUploads on machine A to the IIS AppPools\ identity on machine B. As a test, I shared c:\MyUploads to Everyone with read/write. I was able to access it from machine B using \\A\MyUploads, but when the app on machine B tried to access a file, it got an error. Any idea how I can accomplish this? Some advice on best practices for such a situation would be great. Thank you.

    Read the article

  • Nagios: Is it possible to have multiple IPs for a host?

    - by Aknosis
    In our office we have a dual WAN setup: if our cable connection drops, we still get connectivity via our T1. The only issue is that our office network is then no longer available on the same IP, so all the Nagios checks go critical because they can't connect. What would be awesome is if I could have Nagios try IP 1 by default, but if for some reason it's failing on that IP, try IP 2. I doubt this is possible with a default install, but I'm wondering if there are any add-ons or some other magic that could make this work?
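
    One common workaround is a small wrapper plugin that tries the primary address and falls back to the secondary before going critical. Below is a minimal sketch in Python, assuming the standard Nagios plugin exit codes (0 = OK, 2 = CRITICAL); the addresses and TCP port are made-up placeholders, and this is not a stock Nagios feature.

        #!/usr/bin/env python
        # check_dual_ip.py - try the primary office address, fall back to the
        # secondary, and exit with Nagios status codes accordingly (sketch).
        import socket
        import sys

        ADDRESSES = ["203.0.113.10", "198.51.100.10"]  # hypothetical cable IP, T1 IP
        PORT = 22        # any TCP service reachable over both links
        TIMEOUT = 5      # seconds per attempt

        def reachable(ip):
            try:
                with socket.create_connection((ip, PORT), timeout=TIMEOUT):
                    return True
            except OSError:
                return False

        for ip in ADDRESSES:
            if reachable(ip):
                print("OK - office reachable via %s" % ip)
                sys.exit(0)   # Nagios OK

        print("CRITICAL - office unreachable on both addresses")
        sys.exit(2)           # Nagios CRITICAL

    Wrapped in a command definition, this lets the host stay green as long as either path is up.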

    Read the article

  • Looking for a Unix tool/script that, given an input path, will compress each ~100MB batch of uncompressed text files into a single gzip file

    - by newToFlume
    I have a dump of thousands of small text files (1-5MB each), each containing lines of text. I need to "batch" them up so that each batch is of a fixed size - say 100MB - and compress each batch. A batch could be either a single file that is just a 'cat' of the contents of the individual text files, or just the individual text files themselves. Caveats:
      * Unix split -b will not work here, as I need to keep lines of text intact. Using its lines option is a bit complicated, as there is a large variance in the number of bytes per line.
      * The batches need not be strictly a fixed size, as long as each is within 5% of the requested size.
      * The lines are critical and should not be lost: I need to confirm that the input made its way to the output without loss - what checksum (something like CRC32, but better/"stronger" in the face of collisions) should I use?
    A script should do nicely, but this seems like a task someone has done before, and it would be nice to see some code (preferably Python or Ruby) that does at least something similar.
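
    A minimal sketch of the batching approach in Python, since something similar is asked for: whole files (and therefore whole lines) are concatenated into numbered gzip batches of roughly the requested size, and an MD5 digest of the uncompressed stream is printed per batch so input and output can be compared after decompression. The directory layout, batch size and output names are placeholders.

        #!/usr/bin/env python
        # batch_gzip.py - concatenate small text files into ~100MB gzip batches,
        # keeping whole files intact, and record a checksum of the uncompressed
        # data per batch (sketch).
        import gzip
        import hashlib
        import os
        import sys

        SRC_DIR = sys.argv[1]               # input path holding the small text files
        BATCH_BYTES = 100 * 1024 * 1024     # target uncompressed batch size

        def batches(paths, limit):
            """Group whole files so each group is roughly `limit` bytes."""
            group, size = [], 0
            for p in paths:
                group.append(p)
                size += os.path.getsize(p)
                if size >= limit:
                    yield group
                    group, size = [], 0
            if group:
                yield group

        paths = sorted(
            os.path.join(SRC_DIR, name)
            for name in os.listdir(SRC_DIR)
            if os.path.isfile(os.path.join(SRC_DIR, name))
        )

        for n, group in enumerate(batches(paths, BATCH_BYTES)):
            digest = hashlib.md5()
            out_name = "batch_%04d.gz" % n
            with gzip.open(out_name, "wb") as out:
                for p in group:
                    with open(p, "rb") as f:
                        data = f.read()     # files are only 1-5MB, so read whole
                    digest.update(data)
                    out.write(data)
            print("%s  md5=%s  (%d files)" % (out_name, digest.hexdigest(), len(group)))

    Verifying is then a matter of gunzipping each batch and comparing the same digest over the decompressed stream with the recorded value.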

    Read the article

  • Configuring memcached for a particular scenario

    - by pradeepchhetri
    I have a web application which queries an OpenTSDB server (backed by an HBase cluster) for the datapoints of different metrics, and I plot those metrics using the dygraphs JavaScript graphing library. Since fetching a single metric's datapoints for the past day from OpenTSDB itself takes nearly 2 seconds, my application, which plots nearly 25 metrics, is becoming very slow. In order to reduce this latency, I am thinking of using the php5 memcached module to cache all the queries, but I have a few questions regarding memcached:
      * Is there any way to configure memcached to keep updating its cache in the background, by running some command-line queries at a particular interval?
      * Is there any way to configure memcached to always answer a query from the cache instead of updating the cache first? My application only plots datapoints for the past day, and missing some datapoints is not that critical.
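
    Memcached itself has no background-refresh option; the usual pattern is a separate job (e.g. run from cron) that re-queries OpenTSDB and overwrites the cached values, while the pages always read from the cache. A minimal sketch, assuming the python-memcached client and OpenTSDB 2.x's HTTP /api/query endpoint; the host names, metric names and refresh interval are placeholders.

        #!/usr/bin/env python
        # warm_tsdb_cache.py - run periodically (e.g. every 5 minutes from cron) so
        # the web app can always read pre-fetched datapoints from memcached instead
        # of querying OpenTSDB on every page load (sketch).
        import urllib.request

        import memcache   # python-memcached client

        TSDB = "http://opentsdb.example.com:4242"        # hypothetical OpenTSDB host
        METRICS = ["sys.cpu.user", "proc.loadavg.1min"]  # hypothetical metric names

        mc = memcache.Client(["127.0.0.1:11211"])

        for metric in METRICS:
            # Assumes the OpenTSDB 2.x HTTP query API; older installs expose /q instead.
            url = "%s/api/query?start=1d-ago&m=sum:%s" % (TSDB, metric)
            with urllib.request.urlopen(url, timeout=30) as resp:
                datapoints = resp.read().decode("utf-8")
            # Overwrite the key with no expiry, so readers never hit a cache miss.
            mc.set("tsdb:%s:1d" % metric, datapoints, time=0)
            print("cached %s (%d bytes)" % (metric, len(datapoints)))

    The PHP side then only ever reads those keys with Memcached::get(), never talking to OpenTSDB directly.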

    Read the article

  • How to get the Host value inside ~/.ssh/config

    - by iconoclast
    Within a ~/.ssh/config or ssh_config file, %h will give you the HostName value, but how do you get the Host ("alias") value? Why would I want to do that? Well, here's an example:

        Host some_host_alias
            HostName 1.2.3.4
            User my_user_name
            PasswordAuthentication no
            IdentityFile ~/.ssh/some_host_alias.rsa.id
            LocalCommand some_script.sh %h   # <---- this is the critical line

    If I pass %h to the script, then it uses 1.2.3.4, which fails to give it all the options it needs to connect to that machine. I need to pass some_host_alias, but I can't find the % variable for that. (And: yes! I'm aware of the risk of recursion. That's solved inside the script.) UPDATE: Kenster pointed out that I could just hard-code the Host value as an argument to the script. Of course this will work in the example I gave, but it won't work if I'm using pattern matching for the Host.

    Read the article

  • RAID10 Without BBU, With UPS

    - by Richard
    My datacenter says that each rack has primary and backup power. I assume this means there is a UPS for each server. Given that, do I have any need for a BBU with the following setup?
      * Intel Cherry 520 SSD x 4
      * RAID 10
      * LSI-9260 with writeback cache enabled
    I have heard that without a BBU the data in the cache could be lost. Since my needs aren't mission-critical, I can afford to lose some data. But would the rest of the data on the drives be corrupted?

    Read the article

  • Hyper-V R2 as a VM on another virtual OS

    - by Tim
    I am trying to perform Microsoft platform testing for a vendor application. The problem I have is that it requires the test be done on Windows Server 2008 as a VM on Hyper-V R2. Currently, I have access to a virtual server with just Hyper-V, and I also have access to an ESXi server. The crazy idea (which may just show how little I know about how virtualization technology works) is to install Hyper-V R2 as a VM onto one of these other servers, then create a VM for Windows Server 2008 on this Hyper-V R2 VM. I cannot just upgrade the current Hyper-V server, as the VMs currently running on it would need to be taken offline and are system critical (and I don't have rights to perform this upgrade). Has anyone tried this? Will this even work?

    Read the article

  • How to protect folder privacy against unethical network administrators? [closed]

    - by Trevor Trovalds
    I just need a technical solution for the safety of my group's shared passwords, projects, work, etc. Our network has Active Directory with public/groups/users shares and NTFS permissions, under a Windows Server 2003 which will soon migrate to Windows Server 2008 R2. Our IT crowd is small, consisting of 2 DBAs, 4 designers, 6 developers (including me), 2 netadmins and (a lot of) tech support staff; everyone has local admin rights. Those 2 network admins weren't the ones who set the network up; they just took over recently when the previous ones quit. We usually find them laughing at private content from users stored in the group shares and sabotaging documents that don't match their personal tastes, and, finally, this week we found out they stole a project we (developers and DBAs) were finishing and, long before, had presented it to the CEO as theirs without us knowing. I'm a systems analyst, and initially my group decided to store critical content, like shared passwords, inside encrypted .zip files. Unfortunately we couldn't do the same for the hundreds of other folders and files, which included the stolen project, because the zipping process would take too long for every update. We also tried an encrypted Subversion repository under SSL, but there are many dummies (~38 at the moment) involved in the projects who have trouble using TortoiseSVN when contributing, and very often we had to fix messed-up updates. Well, I think these two attempts give an idea of what we've been trying to achieve. So, is there a practical "individual" protection for our extensive data, or can my hope already be euthanized? P.S.: Seriously, where I live and work, political corruption has gone wild, so reporting them is likely impracticable. Both netadmins have strong "political bonds" with the CEO and the President, hence their lousy behavior and our failed attempts to denounce them.

    Read the article

  • Missing files when Windows 7 returns from hibernate w/ dual boot

    - by Arthur N
    I have a dual-boot setup with Ubuntu (Lucid) and Windows 7, with the Windows file system shared to Ubuntu through Samba. Occasionally I am working in Windows and my machine goes into hibernation (e.g. when the battery level is critical). By default, my GRUB settings boot me into Ubuntu, so when I get back to my PC, sometimes I just hop into Ubuntu instead of going back to Windows. However, if I write any files to the Windows file system during that Ubuntu session, the next time I do go back to Windows (which resumes from hibernation), those files are missing. Obviously, the state of the actual file system and the hibernation snapshot get out of sync, and Windows chooses the hibernation snapshot, overriding any changes I may have made through Ubuntu. For now, I've disabled the hibernate option in the Windows power settings, but is there any utility I can use to get back some of those missing files?

    Read the article

  • Which browser is the most secure? (research- and practice-based)

    - by wag2639
    I was wondering which browser is the most secure today - Firefox, Internet Explorer, Chrome, or Safari - on a Windows machine with the user running as a Power User/Administrator account. This is not a question about which browser is the best because it's the most usable; it's a question of which browser is the most secure given an everyday user's experience (JavaScript, Flash, ads, etc.). Also, would the choice of most secure change if the user were running as a restricted user? To clarify, I'm looking for an answer that's based on research into potential and common exploits and how long it takes for critical problems to be patched.

    Read the article

  • Can software operation damage an SD card?

    - by Borek
    My SD card has a broken boot sector, and the tools I've tried say that it's not repairable (I've tried TestDisk, DriveRestore Pro and EaseUS Partition Recovery). The card was in my Android phone, and at one point the phone simply shut down and I had to reboot it. After I rebooted, the SD card was not recognized, and since then I've been trying to recover it (I don't want to format the card, as it contains some data I'd rather not lose, although nothing critical). My question is: can a software error in Android, or a sudden crash of the system, damage the SD card? Or was it the other way around: the card died first and brought the system down?

    Read the article

  • Dependency diagramming / mapping tool [closed]

    - by Lars
    I am looking for a tool that allows me to easily create and maintain dependency maps of our mission-critical servers, apps, processes, etc. It needs to be intuitive and easy to work with, and able to generate diagrams that clearly show the dependencies graphically. What would be some good tools for this? I have looked at videos for AssetGen Sysmap and BluePrint from Pathwaysystems.com, and they both seem to fit my needs, but there have got to be more good systems like them that I can look at. I want to make sure I pick the best system for our needs (and limited budget).
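
    As a stopgap while evaluating tools, the dependency map can also be kept as plain data and rendered with Graphviz. A minimal sketch in Python, assuming Graphviz's dot command is available for rendering; the server and application names are made up.

        #!/usr/bin/env python
        # dependency_dot.py - emit a Graphviz DOT file from a simple dependency map;
        # render it afterwards with:  dot -Tpng dependencies.dot -o dependencies.png
        # (sketch with made-up names).

        # Each item maps to the things it depends on.
        DEPENDENCIES = {
            "Payroll app": ["SQL-01", "AD-01"],
            "Intranet": ["WEB-01", "SQL-01"],
            "WEB-01": ["SAN-01"],
            "SQL-01": ["SAN-01"],
        }

        lines = ["digraph dependencies {", "  rankdir=LR;"]
        for item, needs in DEPENDENCIES.items():
            for dep in needs:
                lines.append('  "%s" -> "%s";' % (item, dep))
        lines.append("}")

        with open("dependencies.dot", "w") as f:
            f.write("\n".join(lines) + "\n")
        print("wrote dependencies.dot")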

    Read the article

  • What are some good asp.net shared hosting pre-sales questions?

    - by P a u l
    I'm not asking for any host recommendations; those are covered in other questions. What are some good pre-sales questions for ASP.NET shared hosting? Hosts never seem to answer all the questions in their feature lists. So far I have a few:
      * Dedicated application pool?
      * Is SQL Server Management Studio supported? Is tunneling required?
      * Can I reset my application pool in the control panel?
      * Are PHP and Perl fully supported as well?
      * Are subdomains supported, and will I need a routing script in the root, or are they routed automatically?
    And so on. Developers have a critical need for good hosting to stage applications. I think this is absolutely developer-related and don't want the question moved to Server Fault.

    Read the article

  • Where can someone store >100GB of pictures online? [closed]

    - by sbi
    A person who is not very computer-savvy needs to store 130GB of photos. The key parameters are:
      * a non-negligible probability that the company selling the storage will still exist, and the data still be accessible, for at least five years
      * data should be considered safe once uploaded
      * reasonable terms of service: Google Drive reserving the right to literally do anything they want with their users' data is not acceptable; the possibility that the CIA might look at those pictures is not considered a threat
      * easy to use from Windows, preferably as a drive
      * no nerve-wracking limitations ("cannot upload 10GB/day" or "files 500MB" etc.) that serve no purpose other than pushing the user to the next-higher price plan
      * some upgrade plan: there are currently 10-30GB of new photos per year, with a tendency to increase, which might bust a 150GB limit next January
      * the ability to somehow sort the pictures: currently they are sorted into folders, but something like tags would be just as good, if easy enough to apply
      * of course, pricing is important (although there's a reason this is the last bullet; reasonable data safety is considered more important)
    Nice to have, but not necessary, would be:
      * additional photo-related features (thumbnail generation, album sharing, etc.)
      * access from the web and from platforms other than Windows (smartphones)
    Let me stress this again: the person who needs this is able to copy pictures from the camera to the computer, can copy files in Explorer, and uses a web email service. That's about it; there's almost no understanding of what happens under the hood.

    Read the article

  • Will learning to use Fedora also teach me my way around Redhat (CentOS)?

    - by Matt Untsaiyi
    I want to dive into the open source world and start using a Linux distro while learning to program. I've looked over the options and it pretty much boils down to Fedora or CentOS. The reasoning behind it is I'm hoping to kill two birds with one stone... Redhat seems to be "the choice" for servers, so I figure as I learn to program, I can also learn my way around Linux... or Redhat more specifically... and get that under my belt too. I want to use Fedora, and be on the frontier of new software (since I'm not doing anything critical), but if it's completely different than Redhat I'd rather just use CentOS. So is it? Or can I use one and know the other?

    Read the article

  • No Cure for a Slow Computer?

    - by Marv
    I have a laptop with the following specs: 2.2GHz dual-core processor, 4GB of DDR2 RAM, 180GB of HDD space. I have tried everything. I have reinstalled the OS. I have installed Ubuntu with the Lubuntu, LXDE, GNOME Classic, and Unity 2D desktops. I have even tried downgrading to XP with all non-critical processes and services turned off. Even with the most stripped-down version of Ubuntu it heats up and the fan starts churning. I'm out of ideas. If you have any tips, please help. :'(

    Read the article

  • Minimal backup for Windows 7 system recovery [migrated]

    - by JIm
    There might not be an answer to this, but for a home Win7 system, what files/directories must be backed up to recover after a Windows crash? I can reinstall software, and I keep data files elsewhere. When I use Acronis home backup software to back up my "critical" files, it seems to choose the entire partition, and the updates are mostly browser cache files and the like. Or, after a crash, should I just reinstall Windows? I dread the hours of Windows updates that would require. Thanks.

    Read the article

  • Best Wiki Software For Product Support

    - by Zapnologica
    Good day. I am looking for a wiki system which we are going to use at work for a form of product support. We manufacture multiple devices, and I want to make a wiki containing all sorts of relevant and helpful information on each product, which users can look at before trying to contact us for support. The first option that immediately comes to mind is MediaWiki, but I don't just want to jump on the bandwagon, so I thought I would ask around first. It should preferably be free, but obviously if something is really worth paying for, that's not the end of the world. The uploading of content and media is not of much importance, as the end users will simply be reading the information the company has published. Another nice-to-have, though not critical, would be for it to run on ASP.NET, as we have Microsoft servers running anyway.

    Read the article

  • Is it possible to put an 8000 series socket-F Opteron into a dual-socket motherboard?

    - by René Kåbis
    Exactly what it says on the tin. I have a dual-socket, Socket F motherboard into which I am looking to put two high-end quad-core Opterons, but the 2000 series are nearly double the price of the 8000 series on eBay. Can I just drop in a pair of 8000-series processors and be done with it, or are there processor-critical motherboard features that would be present on a quad(+)-socket motherboard but don't exist on a dual-socket motherboard? Please elaborate or link to resources that explain this further, as I am not averse to research (and I am interested in the technical issues involved). I am probably using the wrong search terms, as Google failed to return anything within the first few dozen results.

    Read the article

  • Windows VPS/Cloud Host Recommendations?

    - by user18937
    As Hosting.com are no longer offering Windows VPS accounts, we are looking for a new US-based provider. We are looking for something that offers standard Windows IIS hosting for a DotNetNuke portal site with SQL Server Express or Web edition. Basic managed services for OS updates as well as regular backups are required. A good level of support and solid uptime are critical. The budget is in the $150-$200/month range, but flexible depending on quality and services offered. Does anyone have any suggestions and good feedback they can share? We are currently looking at Jodohost as an option, but would like some other possibilities, as we have found their support can be suspect at times. A cloud solution in the same range would also be an option.

    Read the article

  • Can an SSD notify the hosting OS that its wear level is getting high?

    - by Tony_Henrich
    I read a lot about SSDs and I am interested in them for server use. My biggest concern is their reliability: a lot of writes shorten their lifespan. I could mitigate this problem if I could run some kind of diagnostics on the SSD on a regular basis, or if the SSD could automatically warn the OS that its reliability is reaching a critical level. Think of this as S.M.A.R.T. or software like SpinRite for SSDs. Does anything I mentioned exist now? Which kind/brand of SSD does this? I don't mind swapping out a tired SSD for a newer one once in a while. I am pretty sure that SSD life is measured in years, not a few months? For me, the improved performance will pay for the SSD over and over. I am planning to use plenty of RAM as well.
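
    For what it's worth, the SMART route does exist: most SSDs expose a normalized wear/endurance attribute that smartmontools can read, although the attribute name and ID differ by vendor. Below is a minimal sketch in Python that shells out to smartctl and flags the drive once the normalized value drops below a threshold; the device path and the attribute names are assumptions that need checking against the actual drive's smartctl -A output.

        #!/usr/bin/env python
        # check_ssd_wear.py - read SMART attributes with smartctl and warn when the
        # normalized wear indicator gets low (sketch; attribute names vary by vendor).
        import subprocess
        import sys

        DEVICE = "/dev/sda"      # hypothetical device path
        THRESHOLD = 20           # warn when the normalized value falls below this
        # Vendor-specific attribute names (assumptions - check `smartctl -A` output).
        WEAR_ATTRS = ("Media_Wearout_Indicator", "Wear_Leveling_Count", "SSD_Life_Left")

        out = subprocess.run(
            ["smartctl", "-A", DEVICE], capture_output=True, text=True, check=False
        ).stdout

        for line in out.splitlines():
            fields = line.split()
            if len(fields) > 3 and fields[1] in WEAR_ATTRS:
                value = int(fields[3])   # normalized VALUE column, 100 = like new
                if value < THRESHOLD:
                    print("WARNING: %s %s=%d" % (DEVICE, fields[1], value))
                    sys.exit(1)
                print("OK: %s %s=%d" % (DEVICE, fields[1], value))
                sys.exit(0)

        print("UNKNOWN: no wear attribute found on %s" % DEVICE)
        sys.exit(3)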

    Read the article
