Search Results

Search found 2049 results on 82 pages for 'nw tech'.


  • Tripwire help Required

    - by ramaperumal
    I have created the Tripwire policy file and added the rules below:

    /opt/jboss/server/gis/conf -> $(SEC_CONFIG) +aipm +c+g+a+i+s+t+u+l+M;
    /usr/local/gtech/eseries/ -> $(SEC_CONFIG) +a+c+g+i+s+t+u+l+M;

    After running the integrity check, the output should cover a (access timestamp), c (inode timestamp, create/modify), g (file owner's group ID), i (inode number), s (file size), t (timestamp), u (file owner's user ID), l (file is increasing in size, a "growing file") and M (MD5 hash value). The output I am getting is below:

    [root@xxsi1242 tripwire]# tripwire --check
    Parsing policy file: /etc/tripwire/tw.pol
    *** Processing Unix File System ***
    Performing integrity check...
    Wrote report file: /var/lib/tripwire/report/xxsi1242.gtk.gtech.com-20131106-053812.twr

    Open Source Tripwire(R) 2.4.1 Integrity Check Report

    Report generated by:       root
    Report created on:         Wed 06 Nov 2013 05:38:12 AM EST
    Database last updated on:  Wed 06 Nov 2013 05:31:17 AM EST

    ===============================================================================
    Report Summary:
    ===============================================================================
    Host name:                 xxsi1242.gtk.gtech.com
    Host IP address:           156.24.65.171
    Host ID:                   None
    Policy file used:          /etc/tripwire/tw.pol
    Configuration file used:   /etc/tripwire/tw.cfg
    Database file used:        /var/lib/tripwire/xxsi1242.gtk.gtech.com.twd
    Command line used:         tripwire --check

    ===============================================================================
    Rule Summary:
    ===============================================================================
    Section: Unix File System

    Rule Name                        Severity Level   Added   Removed   Modified
    ---------                        --------------   -----   -------   --------
    Invariant Directories            66               0       0         0
    Temporary directories            33               0       0         0
    * Tripwire Data Files            100              0       0         1
    Tech Stack                       100              0       0         0
    User binaries                    66               0       0         0
    Tripwire Binaries                100              0       0         0
    * CLPS bins                      100              0       0         2
    CLPS Configuration files         100              0       0         0
    ESCommon                         100              0       0         0
    Shell Binaries                   100              0       0         0
    OS executables and libraries     100              0       0         0
    Security Control                 100              0       0         0
    ESCommon Configuration           100              0       0         0
    (/etc/gtech/escommon)

    Total objects scanned:  12358
    Total violations found: 3

    ===============================================================================
    Object Summary:
    ===============================================================================
    Section: Unix File System

    Rule Name: Tripwire Data Files (/etc/tripwire/tw.pol)
    Severity Level: 100
    Modified:
    "/etc/tripwire/tw.pol"

    Rule Name: CLPS bins (/opt/jboss/server)
    Severity Level: 100
    Modified:
    "/opt/jboss/server/esapps1/data/hypersonic/localDB.lck"
    "/opt/jboss/server/gis/data/hypersonic/localDB.lck"

    ===============================================================================
    Error Report:
    ===============================================================================
    No Errors

    *** End of report ***

    Note: In the output I am only getting the names of the files that were modified. I need the detailed output for these, but unfortunately I am not getting what I expected. Please help me to proceed further.
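    A minimal sketch of pulling a more detailed report back out of the saved .twr file with twprint, assuming the stock Open Source Tripwire 2.4 tools; the report path is taken from the output above and the report level (0-4) controls how much per-object detail is printed:

        # print the saved report at the most verbose level (0 = one-line summary, 4 = full object detail)
        twprint -m r -t 4 -r /var/lib/tripwire/report/xxsi1242.gtk.gtech.com-20131106-053812.twr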

    Read the article

  • Windows 8 Doesn't Shut Down Properly With Fast Start-Up Enabled

    - by Patrick
    With Fast Start-up enabled, when I shut the computer down it idles for about 5 minutes after logging out and the screen turning off, and only then powers off. On returning to Windows I receive an error message saying Windows didn't shut down properly. Hibernate works fine, and I am told this shouldn't be the case - if one doesn't work, neither should the other. Hibernate works whether Fast Start-up is enabled or disabled, as do restart and sleep. Windows is installed under UEFI. The UEFI ultra fast boot option for my motherboard cannot be enabled as my GPU doesn't support some UEFI GOP tech. As far as I know this is not related to Windows Fast Start-up, but I thought it was worth mentioning. To clarify: if this http://www.eightforums.com/tutorials/6320-fast-startup-turn-off-windows-8-a.html is enabled, the computer does not shut down properly.

    EDIT: Some more information on the matter:
    - Formatting didn't fix the issue.
    - It still fails regardless of which drivers are installed.
    - Hardware was purchased ~6 months ago. Running a good SSD.
    - Event viewer: always these two messages in close succession:
      Error (event ID 6008): The previous system shutdown at 7:45:21 PM on 27/10/2012 was unexpected.
      Critical (Kernel-Power, event ID 41): The system has rebooted without cleanly shutting down first. This error could be caused if the system stopped responding, crashed, or lost power unexpectedly.
    - Upon installing WPT as suggested below to figure out what was happening during shutdown, and running
      xbootmgr -trace shutdown -noPrepReboot -traceFlags BASE+CSWITCH+DRIVERS+POWER -resultPath C:\TEMP
      Windows Fast Start-up is now working consistently. It still works after uninstalling WPT. This is the only change that has occurred on the computer: nothing else has been installed/uninstalled, no Windows Updates, nothing. Fast Start-up did not work prior to installing WPT and running that command (I made sure I tested).
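    For reference, a minimal sketch of commands (run from an elevated prompt) that can confirm whether Fast Start-up and hibernation are actually available and pull the recent Kernel-Power events; the event count and output-format switches are just one reasonable choice:

        :: list the sleep states Windows thinks are available (Fast Startup should be listed here on Windows 8)
        powercfg /a
        :: re-enable hibernation, which Fast Startup depends on
        powercfg /h on
        :: dump the five most recent Kernel-Power (ID 41) events as text
        wevtutil qe System /q:"*[System[(EventID=41)]]" /c:5 /rd:true /f:text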

    Read the article

  • Dell Dimension 8400 Startup error

    - by Michael
    Hello all, thanks first for taking the time to read this and possibly help me. I am a pretty decent computer tech, but not good enough, and I am having an issue with my computer, a Dell Dimension 8400 running Windows XP. As soon as I power the system up the fan goes into hyperdrive (spins like crazy and is very loud), then the Dell startup screen comes up and the loading bar gets stuck on "BIOS Revision A00" and never loads beyond that. I have read a lot about it and think the main problem is that it cannot locate the BIOS file (which does have an updated version - I think it is A09). I cannot enter safe mode, the BIOS setup or anything. I do have the file on my other computer and I was wondering if there is a way I can use a USB flash drive (as I have read on other sites) to create a bootable MS-DOS disk, but I am failing at creating one. Is this possible, or is there anything else I can do? I tried removing the battery from the system for about 10 minutes while it was completely unplugged and then rebooting into the BIOS menu, but the same thing keeps happening. Can anyone help me? :-(

    Read the article

  • Does a mini PCIe SSD fit into an Acer Aspire One?

    - by Narcolapser
    Question: What, if any, mini PCIe SSDs fit into the mini PCIe slot of the Acer Aspire One AOD250?

    Info: I have an Aspire One and I've been considering loading it with an SSD. The mini PCIe drives fascinate me, so I want to try that approach. They also tend to be cheaper and not much slower (at least not on read time, which matters more for a netbook). But I've heard that sometimes computers don't support certain mini PCIe cards, and I was wondering if anyone knew about the Aspire One. I tried asking Acer tech support, but they didn't know jack and spent the whole time informing me that I would have to support my Ubuntu install on my own, which I was already doing. Anyway, rant aside, I'm looking at this drive: http://www.newegg.com/Product/Product.aspx?Item=N82E16820183252 It states it is exclusively for the Eee PC. Now, does that mean it was designed for the Eee PC but will work in my netbook, or is something going to go wrong? (Right now my main concern is it physically not fitting.) Any information would be appreciated. o7

    Read the article

  • Recommendation on remote access setup for accessing customer systems

    - by gregmac
    I'm looking for a product recommendation (open or commercial) that will allow remote access to customer sites for tech support purposes. We need to be able to gain access to help troubleshoot problems on servers. Currently we end up using anything from RDP on a public IP, to whatever VPNs the clients happen to have, to WebEx-type sessions that require lots of interaction from both sides to get things working. This often means a problem that could take 10 minutes to solve takes an extra 30+ minutes of messing around trying to get a connection up. There are multiple customer sites, which should NOT have access to each other. At each site there is anywhere from 1 to 8 servers (Windows 2003 or 2008) that need to be accessed. Requirements:
    - Support connecting to machines even if they're behind a firewall/router with no public IP.
    - Be able to selectively allow/deny access from the customer site.
    - A customer site should not be able to connect outbound to anywhere else (our systems, or other customer sites).
    - Support multiple users from our end.
    - If not a VPN connection (where RDP could be used over top), it should support remote desktop access (including copy/paste) and file transfers.
    - Preferably some way to list all remote systems, showing online/offline.
    Anyone have any suggestions?

    Read the article

  • Massive Memory Leaks?

    - by Mads
    Hi, I seem to have huge memory leaks, which are confusing me. I'm running Fusion 3.1 / Windows 7 on Snow Leopard. It's a clean install with all upgrades applied. I've given Fusion 8 GB on a 14 GB machine, and I've installed VS2008 and Eclipse in Windows 7 - nothing unusual. Inside Task Manager in Windows 7, my memory footprint stays reasonable, at under 2 GB. But in OS X, Activity Monitor shows the footprint of vmware-vmx to be much larger. It starts at 2 GB, which seems fine, but whenever I'm actually doing anything in Windows, vmware-vmx's footprint grows at a few MB per second. After 20 minutes or so it's using ~10 GB and everything grinds to a halt. Throughout this, Task Manager still says I'm only using 2 GB, and whatever I do in Windows seems to increase vmware-vmx's memory footprint - even closing down an application seems to make it go up. So is this par for the course in Fusion? I was previously using Parallels 3 / Vista under Leopard, and it worked fine. I'd assumed my new Fusion config would work better, but this makes it completely unusable. (And apparently I can't even ask tech support unless I buy a support package...) Any advice much appreciated. Thanks

    Read the article

  • Dell XPS 15 L502X hard drive Partition

    - by Mohan Gajula
    I have a situation here. I got my new Dell XPS 15 laptop. The hard drive is configured as below:
    Volume 1 (OEM partition): 133 MB
    Volume 2 (OS, C:): 685.25 GB
    Volume 3 (Recovery): 13.25 GB
    Now I am trying to re-partition my C: drive into a C: drive of 100 GB and a new drive of 585 GB. Earlier I tried using Windows 7 Disk Management to shrink and extend the volume; that led to the OS and hard drive not working. Dell tech support tried to fix the issue, but they were not able to fix it online. Later a Dell technician arrived at my place and replaced the hard drive with a new one. Please help me re-partition the C: drive into 100 GB and a new D: drive of 585 GB. I don't want to lose my Recovery partition.

    SOLUTION: As suggested by KCotreau below, I did exactly that. I resized the C: drive to 100 GB and then applied the changes. Windows restarted, and the partitioning took place on the boot screen; it took around 30 minutes (approx.). After the restart I can see my C: drive is 100 GB. I then opened EASEUS again and created a new partition from the free space (585 GB), which took 10 seconds. Here goes the screenshot after partitioning. Thanks to KCotreau - you are amazing.

    Read the article

  • Why would my wireless cut in and out every minute or so?

    - by Strilanc
    I've been having problems with my wireless. I moved to a new apartment, and the wireless seems incredibly unreliable. Sometimes it will be stable for hours until, all of a sudden, it starts cutting in and out. I'll get 30-90 seconds of normal behavior, then 5-30 seconds of nothing, then repeat. Sometimes the connection stops working entirely until I power-cycle the router. It is extremely, extremely annoying. Surfing the web isn't too bad, assuming you can stand the random 5-30 second waits, but some connections are sensitive enough to time out, and it certainly makes multiplayer games unplayable. Facts:
    - I confirmed the problem using ping google.com -t. I get normal traffic, interspersed with bursts of "Request timed out."
    - I've never had this problem before with this laptop.
    - I didn't bring my own router or modem to the apartment; I'm using what the old tenant had.
    - Hooking directly to the modem via an Ethernet cable results in a stable connection.
    - Temporarily cutting power to the router sometimes fixes the problem. Sometimes it doesn't.
    - I reset the router, but the problem remained.
    - Apparently the previous tenant had issues with the internet, but I don't know what they were specifically.
    - The router is a D-Link DIR-615, and their tech support is useless.

    Read the article

  • EMC VNX iSCSI setup - unsure about SP/port assignment

    - by pauska
    We have a new VNX5300 waiting to be configured, and I need to plan out the network infrastructure before the EMC tech arrives. It has 4x 1 Gbit iSCSI ports per SP (8 ports in total), and I'd like to get the most performance out of it until we jump over to 10 Gig iSCSI. From what I can read in the docs, the recommendation is to use only two ports per SP, with 1 active and 1 passive. Why is this? It seems kind of pointless to have quad-port I/O modules and then recommend not using more than two of them. Also, I'm a bit unsure about the zoning. The best practices guide states that you should separate each port on each SP from the others on different logical networks. Does this mean that I have to create 4 logical networks to be able to use all 8 ports? It also gives the following example: Does this mean that A0 and B0 should sit on the same physical switch as well? Won't this make all traffic go through one switch (if both A1 and B1 are passive)?

    Edit: Another brain puzzle I don't get: each host (as in server) should not have more iSCSI bandwidth available than the storage processor. What on earth does this matter? If server A has 1 Gbit and server B has 100 Mbit, then the resulting bandwidth between them is 100 Mbit. How can this result in some kind of oversubscription?

    Edit4: Wait, what - active and passive ports? The VNX runs in an ALUA configuration with asymmetric active/active; there shouldn't be any passive ports, only preferred ones.

    Read the article

  • iSCSI performance questions

    - by RyanLambert
    Hi everyone, apologies for the long-winded post in advance... I'm attempting to troubleshoot some iSCSI sluggishness on a brand new vSphere deployment (still in test). The layout is as such: 3 vSphere hosts, each with 2x 10 Gb NICs plugged into a pair of Nexus 5020s with a 10 Gig back-to-back between them. The NICs are port-channeled in an active/active redundant fashion (using vPC mac-pinning, for those of you familiar with N1KV). Both NICs carry service console, vMotion, iSCSI, and guest traffic. iSCSI is on a single subnet/single VLAN that is not routed through our IP network (strictly layer 2). Had this been a 1 Gig deployment, we probably would have split the iSCSI traffic off onto separate NICs, but the price per port gets rather ridiculous when you start throwing 4+ NICs into a server in a 10 Gigabit infrastructure, and I'm not really convinced it's necessary - open to dialogue/tech facts re: this, though. At this point even a single VM guest will boot slowly from iSCSI storage (an EMC CX4 on the same Nexus 5020 10 Gig switches), and restores of VMs from iSCSI take about twice as long as we'd expect them to. Our server folks mentioned that if we split the iSCSI off onto its own NIC, performance seems significantly better. From a network perspective, I've run through the variables I can think of (port configuration errors, MTU problems, congestion, etc.) and I'm coming up dry. There really is no other traffic on these hosts other than the very specific test being performed at the time. The important thing to note is that guest traffic works just fine; it seems storage is the only thing affected by whatever gremlin exists. Concluding that we're not 'overutilizing' the network infrastructure since we're doing hardly anything, I'm just looking for some helpful tips/ideas we can use to resolve this... preferably without hurling in extra 10 Gig NICs that are going to sit around 10% utilization while we've got 70+% left on the others.

    Read the article

  • Setup Firefox to save .pages as .zip automatically

    - by Mike Dtrick
    What do I want to do? I would like Firefox to save files with the .pages extension as .zip files automatically.

    Scenario: You are browsing through your emails and you notice your friend just sent you an email with a file attached (a .pages file in this example). Unfortunately, you have a laptop that runs Windows. Your friend continues to send tons of emails with .pages files attached and you are tired of manually saving the files as .zip files. Ultimately, you would like Firefox to be set up so that the download/file manager recognizes the .pages extension and automatically converts it to a .zip file.

    What have I done? I have saved files manually by selecting Save As "All Files" and setting the extension to .zip. I've gone through Firefox and their documentation and have not found anything on how to complete this task.

    Why am I doing this? To save time (only a few seconds, not the main reason), to set up a simple solution that "converts" a file automatically without having to recall the steps for doing it manually (for clients who aren't exactly tech savvy), and so that clients with Windows can access the files. IMPORTANT NOTE: I am not trying to save the web page, rather an Apple document equivalent to a Microsoft Word file.

    UPDATE: The really easy method would be to save one file, right-click it, choose Properties, and open all .pages files with WinRAR (or any other program that extracts files from a compressed folder). For the sake of learning, I am going to "neglect" this method and continue to do some research on Firefox add-ons. I would still like Firefox or the download manager to do the bulk of the work in converting the file.
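    Not the Firefox-side automation being asked about, but as a sketch of the manual step it replaces: a .pages bundle is just a ZIP archive, so a one-line batch command can rename everything already downloaded (the Downloads path is an assumption):

        :: rename every downloaded .pages file to .zip so Windows can open it as an archive
        ren "%USERPROFILE%\Downloads\*.pages" *.zip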

    Read the article

  • Issues using gmail with google apps and external domain

    - by Jonathan Kelly
    I have recently tried to use Gmail through Google Apps as my main email client, but I'm experiencing a few different problems. I am managing the domain (conjunktiondesign.co.uk) through 123reg.co.uk but it is hosted through fasthosts.co.uk. I transferred the domain to 123reg as fasthosts did not allow me to change the MX records myself. I followed the setup instructions step by step on Google Apps and changed the MX records as instructed. My email was then working perfectly, but my website was down and I was getting the following error: The dnsserver returned: No DNS records

    I have a friend who is using the same setup as me (i.e. an externally hosted domain and Google Apps mail) and I changed my 123reg details to match his (as his was working perfectly - both email and website). I changed my name servers to point to fasthosts rather than 123reg, added an A record called '@' pointing to the fasthosts IP address, and created another A record called 'www' pointing to the fasthosts IP address. After I did this my website worked almost immediately, but I have since realised that my email is now down: I have not received anything since Saturday. I am a web designer and would consider myself fairly tech savvy, but I have no idea about A records, CNAMEs and all the things I have been messing about with! What I ultimately need is someone to help me get my email and website working at the same time, rather than one being down while the other is OK; I seem only able to get one or the other working. I have now changed the name servers back to 123reg in an attempt to get my email back, as it is more important than my website at this stage. Any help is much appreciated. Thanks.
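    A minimal sketch of how to verify that both record sets are in place once the changes propagate, using dig from any machine; the expected answers are assumptions based on the setup described (Google's mail servers for MX, the fasthosts web server IP for the A records):

        # mail should resolve to Google's servers (ASPMX.L.GOOGLE.COM and friends)
        dig +short MX conjunktiondesign.co.uk
        # the bare domain and www should both point at the fasthosts web server IP
        dig +short A conjunktiondesign.co.uk
        dig +short A www.conjunktiondesign.co.uk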

    Read the article

  • Mac mini simple customized, Mac mini server or other?

    - by microspino
    I'm facing a big IT choice for my little office and I need some advice. We have 5 users, 1 super user, 1 HP 500 DesignJet plotter, 4 other laser printers, and 1 HP fax/print/scan/copy machine. All the clients are XP SP3 boxes. We would like to:
    - centralize and share 90 GB of files using a Dropbox (this way we will have LAN sync of local working directories + internet backup + access to our files wherever we are)
    - centralize our plotter, printers and fax machine
    - back up all the workstations
    - share Outlook calendars and tasks
    - run 24x7 while saving some energy
    Of course this setup is just the first step towards a more serious and creative network management of our office, so we are open to new ideas. The budget varies from 400€ to 900€; we are not tech gurus, but at least one of us is a power user close to becoming a geek. I've read some articles on macminicolo about a Mac mini, either standard or with Snow Leopard Server. I've also heard about Windows Home Server on the Lifehacker website, but I'm in a sort of analysis paralysis. Can you help me?

    Read the article

  • Microsoft Access: computer freezes when user tries to update record

    - by CarlF
    A colleague and I have developed an Access 2003 database which is used throughout our department. Currently about four dozen people do data entry using one of two very similar forms. If 47 of us use them, they work perfectly. If Mr. 48 clicks the "Save" button, Windows XP freezes and a hard reset is needed. The problem has to be on his specific computer (a Dell Latitude D630) and not in the code, because it only affects him. Complicating the matter: I don't work for IS, and this project is not supported by IS. If I'm going to get our tech support to fix the problem I had better be able to explain exactly what to do and how to do it, because they aren't going to invest any resources. I don't even have admin rights on the computer (and neither does its regular user). I've asked him to bring his laptop the next time he visits my building. (Just to make matters worse, he doesn't usually work in the same location as me or the other developer.) Any suggestions on debugging the problem? My first try will be to uninstall and reinstall Office, which I can do using corporate utilities without being admin. Note: yes, those are old versions of Office and Windows. We expect to upgrade later this year.

    Read the article

  • Sharing a folder with Nautilus and NTFS external drive gets errors

    - by TheLQ
    I am trying to share a folder in Lubuntu over a network; the folder is on an external NTFS drive. Due to the system that I have (rotating backup disks) this is probably only the second time the drive has been mounted. It's mounted manually with a simple command, for example: mount /dev/sdb1 /media/BACKUP
    On an internal NTFS disk I have successfully set up a network share and can access it. However, I can't access the share on the external disk from any other Windows computer. When setting up the share, Nautilus said that it needed to change the folder's permissions to allow other users to write, but afterwards the permissions are still blank; changing them to Read and Write just flips back to blank. Chowning the entire /media folder recursively and trying again didn't work. Running PCManFM as root and changing the permissions didn't work. Adding "public=yes" to smb.conf and restarting didn't work. I'm out of ideas on what to do. What's weird is that it worked just fine on an internal NTFS disk, so why not the external one? Any solution needs to be manageable inside a GUI (preferably Nautilus), as the person managing the machine isn't as tech savvy. Thanks
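    One thing worth noting: chown/chmod generally don't stick on a plain NTFS mount (ntfs-3g without a user mapping), so the permission change Nautilus tries to make silently fails; ownership and mode have to be supplied as mount options instead. A minimal sketch, with the uid/gid/umask values being assumptions for a single-user machine:

        # mount the backup disk so every local user (and Samba) gets read/write access
        sudo mount -t ntfs-3g -o uid=1000,gid=1000,umask=000 /dev/sdb1 /media/BACKUP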

    Read the article

  • "Network Error - 53" while trying to mount NFS share in Windows Server 2008 client

    - by Mike B
    CentOS | Windows 2008. I've got a CentOS 5.5 server running nfsd. On the Windows side, I'm running Windows Server 2008 R2 Enterprise with the "File Services" server role enabled and both Client for NFS and Server for NFS turned on. I'm able to successfully connect/mount to the CentOS NFS share from other Linux systems but am experiencing errors connecting to it from Windows. When I try to connect, I get the following:

    C:\Users\fooadmin>mount -o anon 10.10.10.10:/share/ z:
    Network Error - 53
    Type 'NET HELPMSG 53' for more information.

    (IP and share name have been changed to protect the innocent :-) )

    Additional information:
    - I've verified low-level network connectivity between the Windows client and the NFS server with telnet (to NFS on TCP/2049), so I know the port is open.
    - I've further confirmed that inbound and outbound firewall rules are present and enabled.
    - I came across a Microsoft tech note that suggested changing the "Provider Order" so "NFS Network" is above other items like Microsoft Windows Network. I changed this and restarted the NFS client - no luck.
    - I've confirmed that the share folder on the NFS server is readable/writable by all (777).
    - I've tried other variations of the mount command like:
      mount 10.10.10.10:/share/ z:
      mount 10.10.10.10:/share z:
      mount -o anon mtype=hard \\10.10.10.10:/share *
      No luck.
    - As per the command output, I tried typing NET HELPMSG 53, but that doesn't tell me much - just "The network path was not found".
    I'm lost on how to proceed with troubleshooting. Any ideas?
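    A minimal sketch of checks worth running on the CentOS side before digging further into the Windows client: an NFSv3 mount also needs the portmapper and mountd, not just TCP/2049, so it's worth confirming those are reachable and that the export is actually published (the share path here is the placeholder one from above):

        # confirm the export is published and see how it is flagged
        showmount -e localhost
        exportfs -v
        # list the portmapper, mountd and nfs ports the client has to be able to reach
        rpcinfo -p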

    Read the article

  • Why does clicking on Windows 7 Printer Properties Result In Driver Not Installed?

    - by octopusgrabbus
    The question I need to ask is: has anyone heard of getting a "driver not installed" error when clicking on a printer's properties in Windows 7, and is there a workaround? Here are the details of the problem. One of our users has a Windows 7 desktop and an HP LaserJet 4050 T connected via a parallel-to-USB converter. The PCL 5 universal driver was installed for series 4050 printers. I needed to install the PCL 6 driver, which completed successfully. The user is an administrator of the system, and I was prompted to and accepted running as Administrator to install the driver. After the install, I went to look at the 4050's properties and was prompted that the PCL 6 driver was not installed. I believe the PCL 6 driver was installed, because with the PCL 5 driver, printing two copies of a one-page email produced an official HP error page as the second page indicating the printer was "not set up for collating"; this problem did not occur with the PCL 6 driver. Oddly enough, setting back to PCL 5 produced the same error about the PCL 5 driver not being installed. I ignored/dismissed the error box (did not re-install the driver) and reproduced the problem, with the second page again being the HP "not set up for collating" error page. Any thoughts on what is causing this and how to clear it would be appreciated. The closest fix I could find was on a Microsoft tech page, which had me clear Winsock from an Administrator command prompt, followed by a reboot. That did not fix the problem. I have also found this http://social.technet.microsoft.com/Forums/windowsserver/en-US/5101195b-3aca-4699-9a06-db4578614e2d/changing-driver-results-in-printer-driver-is-not-installed-error-on-server-2008?forum=winserverprint and will look into trying some of these suggestions, which appear to me to be a "shotgun" approach to fixing the problem.
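    For what it's worth, a minimal sketch of the commands involved, both run from an elevated prompt; netsh winsock reset is presumably what the Microsoft page meant by clearing Winsock, and printui /s /t2 just opens Print Server Properties on the Drivers tab so you can see which PCL 5/PCL 6 driver packages Windows believes are installed:

        :: reset the Winsock catalog (needs a reboot afterwards)
        netsh winsock reset
        :: open Print Server Properties on the Drivers tab to inspect the installed printer drivers
        printui /s /t2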

    Read the article

  • Small store infrastructure - where to begin?

    - by KevinM1
    It looks like my older brother is about to change jobs - from lawyer to shooting range proprietor - and since I'm the family 'computer guy' I have the task of coming up with and setting up the in-store equipment. Only problem: I don't know how to start or where to look. I'm a web programmer, not an IT specialist. To that end, I figured I should ask the pros.
    Users: 3 (myself, my brother, and his business partner)
    Equipment: 1 Windows (likely 7) desktop for POS software, 1 Windows desktop/laptop for backroom use (bookkeeping, etc.)
    Other: ??
    I'm looking for a reliable and, well, idiot-proof way to handle backups. Neither my brother nor his business partner is tech savvy (a web browser, email, MS Word and Excel are about the extent of their knowledge), so I need something they can handle. On-site would be preferable to off-site, given my brother's hesitance to have sensitive business data handled by an outside party. I'm also looking for a small on-site server; I estimate that, at most, only 2-3 users will need access. A Linux solution would keep costs down, but I'm concerned about Windows-Linux interoperability. Would the store security cameras' storage be handled by the security company, or would we have to stream that data to our own server? I know from my own experience with home security that the company gives/loans a recording device to the homeowner, but I'm not sure about business security. I know this sounds like a shopping list, and it's pretty vague. I wish I could give more detail, but between my own ignorance and things not being 100% nailed down on the business end, I'm a bit stuck. At the very least I'd like a nudge - links on where to start, what to look for, things I need to think about, etc. - for this endeavor. Thanks.

    Read the article

  • Aging SBS needs updates / Thoughts for one-off, off-line complete backup?

    - by tcv
    Hey guys, so we checked out the status of an SBS 2003 at one of our more recent, spend-averse clients and found it to be woefully out of date. Scary out of date. I think it's running IE2. OK, maybe not that far back. Anyway, I was thinking that I could use some kind of disk-imaging software to image the four IDE drives within and, in the event the server gets some kind of Update Induced Indigestion, I could completely restore. Usually my go-to software for this is Acronis, but my client will likely balk at a $500 price tag for a one-off backup with their server product. I had thought we could use the boot media from, say, Backup & Recovery 10 to take an off-line image of all the drives. According to their chat tech support, however, it will not work. I pressed for the technical reasons and they said they'd email me. They haven't emailed me. They still might. This server is running SBS 2003, pre-SP2. It's got four IDE disks: one is a basic disk, which contains the OS, and the others are bound as a dynamic disk. You might ask: "Don't they already have backup software?" They do! Backup Exec, a very low-end version that won't even do VSS. I don't know much about BE, but it seems to me that if the worst were to happen, it would mean building a new server OS, installing BE (if the media is available), then restoring. Would it even work? I can take the system down for hours to do a backup, and my goal here is a pretty dead-simple restore if the worst happens. Any and all suggestions are exciting. m
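    One possible route for the one-off, off-line image, sketched under the assumption that the box can boot a Linux live CD and that a large enough external USB drive is attached (all device names here are examples only). Since dd copies whole disks at the sector level, the basic/dynamic disk distinction doesn't matter for the backup itself:

        # from a Linux live CD: image each of the four IDE disks onto the external drive
        mkdir -p /mnt/usb && mount /dev/sde1 /mnt/usb
        for d in sda sdb sdc sdd; do
            dd if=/dev/$d of=/mnt/usb/$d.img bs=64K conv=noerror,sync
        done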

    Read the article

  • Downmix surround to Dolby Pro-Logic at the OS/driver level in Windows 7?

    - by davr
    First off, I'm talking about Dolby Pro Logic, a really old technology for encoding 4 audio channels (L/R/C/SR) into two analog outputs and then extracting them again. It was used in surround sound systems in the last century. I have a modern PC that can output 5.1 analog audio (three outputs on the back carry six channels of audio). But I have a really old surround sound receiver that only has a two-channel L/R input, from which it extracts 4 channels of audio and outputs to 5.1 speakers. What I want is some way for the OS, Windows 7, to act as if I really had 5.1 audio channels available, so applications produce surround audio, but before outputting it out of the back of my PC, apply Dolby Pro Logic matrix encoding so that it outputs over only two channels. These two channels would then get sent to my receiver via an RCA cable, which would decode them again and drive the surround speakers. Is anything like this possible? I'm pretty sure I could do it at an application/codec level, but I'm looking for something that I just have to set once.
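    For background, the Dolby Surround encode that Pro Logic later decodes is, from memory (so treat the exact coefficients as an assumption), roughly:

        Lt = L + 0.707*C + 0.707*(S phase-shifted +90 deg)
        Rt = R + 0.707*C + 0.707*(S phase-shifted -90 deg)

    That is, the centre goes equally and in phase into both output channels, while the band-limited surround goes in with opposite phase between them, which is what the decoder in the receiver uses to pull the four channels back apart.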

    Read the article

  • How to get data out of a Maxtor Shared Storage II that fails to boot?

    - by Jonik
    I've got a Maxtor Shared Storage II (RAID 1 mode) which has apparently developed a hardware failure: it fails to boot properly and is unreachable via the network. When powering it on, it keeps making clunking/chirping disk noises and then sort of resets itself (with a flash of orange light in the usually-green LEDs); it then repeats this as if stuck in a loop. In fact, even the power button does nothing now - the only way I can affect the device at all is to plug in or pull out the power cord! (To be clear, I've come to regard this piece of garbage, which cost about 460 €, as my worst tech purchase ever. Even before this failure I had encountered many annoyances with the drive: 1) the software to manage it is rather crappy; 2) it is way noisier than this type of device should be; 3) when your Mac comes out of sleep, Maxtor's "EasyManage" cannot re-mount the drive automatically.) Anyway, the question at hand is how to get my data out of it. As a very concrete first step, is there a way to open this thing without breaking the plastic casing into pieces? It is far from obvious to me how to get beyond this stage; it opens a little from one end but not from the other. If I somehow got the disks out, I could try mounting the disk(s) on one of the Macs or Linux boxes I have available (although I don't know yet if I'd need some adapters for that). (NB: for the purposes of this question, never mind any warranty or replacement issues - that's secondary to recovering the data.)
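    If the disks do come out, a rough sketch of what the recovery attempt could look like on a Linux box, assuming the Maxtor uses an ordinary Linux md RAID 1 with a filesystem the kernel can read - which is a guess about this particular NAS, so adjust to whatever fdisk and mdadm actually report:

        # identify the partitions and any RAID metadata on the pulled disk
        fdisk -l /dev/sdb
        mdadm --examine /dev/sdb1
        # assemble the mirror from the single member and mount it read-only
        mdadm --assemble --run /dev/md0 /dev/sdb1
        mkdir -p /mnt/maxtor && mount -o ro /dev/md0 /mnt/maxtor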

    Read the article

  • How do you gracefully upgrade mission critical systems to wildly disparate systems?

    - by Ernie
    In the span of the 12+ years of my career, I have yet to overcome this hurdle, and I suspect the answer simply isn't easy or even possible, so I ask everyone here for their experience. Say that you're running into egregious problems that can only be fixed by moving from one platform to another - either because the platform chosen years ago turned out to be a mistake, or because you've simply grown beyond what the system was originally designed for. You know for certain that the cruft that has built up over time will make it nearly impossible to test for all the things that will certainly lead to tech-support hell - which we all know leads to the loss of customers. Not that customers aren't already complaining about the egregious problems that already exist! The best possible way I've discovered so far is to devise a plan for the changeover, test it on a few clients, test it on a dozen clients, test it on a hundred clients, then finally finish the changeover for everyone and pray that you've worked out all the bugs with those first hundred and twenty, and that the animal by-products will not hit the ventilation system in the most spectacular fashion possible. However, that doesn't mean that it won't anyway. So say that you're moving from Exchange to Exim (or even just Sendmail to Exim). How do you handle it?

    Read the article

  • CMD/ADB - Autorun script to search, copy, and paste a file from android system to flash drive

    - by Outride
    I've looked around and can't find anything that answers my question. This is my first question, so any tips or thoughts are welcome, as well as an answer :p

    As explained in the title, I want to create a script that launches, finds a file on an Android phone, copies it, and pastes it to a flash drive. As of right now it's a mix of multiple tutorials and trial and error, and I'm at the point of giving up. I have a flash drive loaded with three scripts, as follows (bold = name of file):

    file.bat

    @echo off
    :: variables
    /min
    SET odrive=%odrive:~0,2%
    set backupcmd=xcopy /s /c /d /e /h /i /r /y
    echo off
    %backupcmd% "C:\Users\Outride\Desktop\kikDatabase.db" "%drive%\all"
    @echo off
    cls

    invisible.vbs

    CreateObject("Wscript.Shell").Run """" & WScript.Arguments(0) & """", 0, False

    launch.bat

    wscript.exe \invisible.vbs file.bat

    So far, I had to use Android Commander, manually go through the directory, find /data/data/kik.android/databases and copy kikDatabase.db to my desktop, then run this script. Yes, I'm trying to pull the database to copy all my email contacts. I use launch.bat, which then makes file.bat invisible via the invisible.vbs script. What would I need to do now to have the file searched for and copied to the flash drive? Thanks in advance; I'll be glad to answer any questions if there are any :p Just remember that I'm not exactly a tech expert, haha.

    EDIT: Cleared junk of prior edits. New - I now have a .bat script to recognize what drive the USB is on, and launch py_cmd (adb shell). This is the current script:

    pull.bat

    @echo off
    :: variables
    SET odrive=%odrive:~0,2%
    set launching=start "%drive%\Minimal ADB and Fastboot\py_cmd"
    echo off
    %launching%

    So how could I make the .bat (or a new script) type the following into the adb terminal: "adb pull /data/data/kik.android/databases/ %drive%\All\Database"? Please help! I've been racking my brain over this all night :3
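    A minimal sketch of what pull.bat could do instead of opening an interactive shell window, assuming adb.exe lives in the Minimal ADB and Fastboot folder on the stick and that the phone is rooted or the database is otherwise readable (adb pull normally can't read /data/data on a stock, unrooted phone):

        @echo off
        :: %~d0 is the drive letter the batch file itself is running from
        SET odrive=%~d0
        :: make sure the destination folder exists, then pull the database straight onto the flash drive
        if not exist "%odrive%\All\Database" mkdir "%odrive%\All\Database"
        "%odrive%\Minimal ADB and Fastboot\adb.exe" pull /data/data/kik.android/databases/kikDatabase.db "%odrive%\All\Database\kikDatabase.db"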

    Read the article

  • ATI firepro will not detect a second DVI-D monitor

    - by John
    OK, so, weird issue here. I had previously been running 6 screens off of 3 of the older ATI FirePro graphics cards, but they had a problem with the heat sink getting too hot and warping the PCB, resulting in total failure of the card. To replace my three dead cards I purchased a new-model ATI FirePro with the newer heat sink design. I'm only using one at the moment, to make sure they've fixed the problem before I waste more money on 2 more cards, but this is where things start to get weird. These FirePros only have one port on them; they connect to two monitors via a splitter cable going from the one port to two DVI connectors for the screens. When I plug two identical monitors in via their DVI inputs, no matter what I do Windows and Catalyst will only detect one screen. However, if I use the VGA input on one of the screens with a VGA-to-DVI adapter to plug it into the card, it works fine. This confuses me greatly. I'm currently using the ATI FirePro 2270 graphics card with identical Dell U2311H screens. I can post the rest of the system spec as well if needed, but I wouldn't have thought it would make much difference, as it had no problem handling 6 screens before the graphics cards failed. Naturally both Catalyst and the ATI drivers are the most current version. ATI tech support has been absolutely zero help; they seemed to get stumped as soon as I verified that both screens were plugged in and connected properly. Anyone have any ideas?

    Read the article

  • Which project management software for technophobes who've never worked with something like that?

    - by Ernst
    Hi, our director has asked me to get something to manage projects. Note that so far we haven't used anything of the sort. I did not get very clear instructions yet, probably because she doesn't know exactly what we need either. My guess is we'll only find out while using something. I've looked at a few: OpenWorkbench, GanttProject, and Microsoft Project. The latter has the advantage of easy importing of users from Exchange; are there others that do that (even if not directly, at least easily)? I don't think it's a critical requirement, though. We're using some other custom software where I have to add users manually anyway, and we're small enough that it's maybe once a month that I have to add or remove a user. In any case, I'm not in favour of buying anything, since I'm skeptical about us actually succeeding in putting it to good use, and even if we do, we will only discover what we really need during usage. We're also not a tech shop; most people vary from not very technically adept to technophobic. This means we need something very user-friendly. I prefer to stay away from online solutions, since we deal with sensitive information and I prefer to keep that in house, but I guess it would be acceptable for a trial period. An intranet site is an option, though, preferably something that is easy to set up with IIS. XPlanner Plus and Redmine I found too hard to set up for this experiment. Some other options I haven't yet tried to install, but which look too complex for our technophobes: Endeavour Software Project Management, Project-Open, Trac. Any suggestions? Thanks, Ernst

    Read the article
