Search Results

Search found 18976 results on 760 pages for 'ash machine'.

  • Where do you earn more money (Autonomous Systems vs Distributed Systems)? [closed]

    - by Puckl
    I am interested in both topics and I can choose between them for my computer science master. I think the distributed systems master focuses more on software technologies and the autonomous systems master is focused on robotics and machine learning. Do you get good jobs in the field of machine learning without a Ph.D.? I guess there are more jobs available in the software-tech world, is this right? Where do you earn more money? (It is not the only criterion, but it matters)

    Read the article

  • A computer passes the Turing test for the first time by posing as a 13-year-old boy

    A computer passes the Turing test for the first time by posing as a 13-year-old boy. Thanks to a computer program, a computer has for the first time managed to convince researchers that it was a 13-year-old child, thereby becoming the first machine to pass the Turing test. The feat achieved by this machine marks a date that will probably be written into the annals of computing, and more specifically of artificial intelligence. The Turing test was established...

    Read the article

  • I can't install Truecrypt on Ubuntu 12.04

    - by Duanek
    I have a Windows 7 machine with Truecrypt. I installed Ubuntu 12.04 with the intention of converting to an Ubuntu-only machine, but I can't install Truecrypt on Ubuntu. I have been looking through the forums and have followed all advice to the letter, and still Truecrypt doesn't work; I have only a non-functional icon in the Dash. Should I uninstall Ubuntu and start over (i.e. reinstall)? I love Ubuntu and greatly appreciate your efforts on this forum.

    Read the article

  • Keeping remote files synced with local files?

    - by Kelp
    Hello. When developing web applications, how does one keep local files and remote files synced? There is the obvious way: whenever you edit a file on your local machine, upload that file to the remote machine. Is there a more efficient way? I ask because I have been using Subversion for version control, and it is so easy to keep files synced on a remote server. All I have to do is commit and it will find the files which need to be replaced.
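    For comparison, a minimal sketch of the rsync approach many people use for this, assuming rsync is installed and the remote host is reachable over SSH (the host, user and paths below are hypothetical placeholders):

        import subprocess

        # Hypothetical placeholders -- substitute your own paths and remote host.
        LOCAL_DIR = "./site/"                         # trailing slash: sync the directory's contents
        REMOTE = "deploy@example.com:/var/www/site/"

        def push_changes():
            """Upload only the files that changed, using rsync over SSH."""
            # -a preserves permissions/timestamps, -z compresses, --delete mirrors removals
            subprocess.run(
                ["rsync", "-az", "--delete", "-e", "ssh", LOCAL_DIR, REMOTE],
                check=True,
            )

        if __name__ == "__main__":
            push_changes()

    Because rsync compares timestamps and sizes (or checksums), only modified files cross the wire, which is usually faster than uploading by hand and does not require a commit.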

    Read the article

  • Why does Ubuntu 12.04 dual boot fail?

    - by Tranas
    A fresh install of XP followed by a fresh install of Ubuntu 12.04 results in the error "error: unknown filesystem. grub rescue" and the machine will not boot. Prior to the 12.04 install, XP worked fine. During the 12.04 install, all partitions and free space were visible, and the install seemed to complete without issues until the error message. Although I can fix the MBR via the recovery console in XP and let the machine boot to Windows, why is GRUB/Ubuntu trashing the boot sequence?

    Read the article

  • CNC Information - Data Storage and Transfer

    A CNC machine is worth trying when there is a need to improve speed and accuracy. The machine performs better at repetitive tasks and gets large jobs done more quickly. Woodworking shops or industria... [Author: Scheygen Smith - Computers and Internet - March 21, 2010]

    Read the article

  • Oracle Exadata Version 2 [Japanese-language post; original text lost to character encoding]

    - by mamoru.kobayashi
    The original Japanese text of this post did not survive extraction. The recoverable references are to Oracle Exadata Version 2 (also called the "Sun Oracle Database Machine"), Oracle Business Intelligence Enterprise Edition, and Oracle Enterprise Manager.

    Read the article

  • Ubuntu Lucid: Erratic screen behaviour after boot

    - by fgysin
    In short: about 50% of the time I have a screwed-up monitor setup after reboot. About 50% of the time it is totally correct.
    Now the longer version: I updated my machine from 9.04 to 10.04 (via 9.10). At first I ran into some monitor problems (I have a 3-monitor setup) because of the known bug in the new X server driver for Xinerama. This messes up behaviour if the mouse goes either left of or above screen number 0, i.e. I had to make my left-most monitor screen 0. Everything finally worked out fine; I got my 3-monitor setup back with Xinerama enabled to get one big desktop stretched over 3 screens.
    Now the fun part: every time I start up my machine, only one of the 3 monitors gets a signal and is woken up: it only recognizes the left-most monitor (screen 0) and crams all the desktop stuff into this one screen. If I go into the nvidia settings I only see one physical device, although all 3 are connected and have power. When I look into xorg.conf I can still see my old setup with 3 devices, 3 screens, Xinerama active, etc., but I was totally unable to get 3 monitors to work. (I tried unplugging monitors, reconfiguring the whole nvidia setup, ...)
    But it gets even better: when I restart my machine (i.e. choose the restart option from the Ubuntu menu) it shuts down and tries to restart. The restart then gets stuck after showing the Ubuntu splash screen with the 'loading bar' (the moving dots thingy) and I am forced to kill the machine by cutting power. But after the power cut the machine boots up normally and suddenly I get my 3-monitor setup back working. That is until the next time I shut down and start up, where it all starts over again and I only have one monitor... (see above)
    I really have a hard time seeing where the error is. It must be that the restart boot somehow differs from the 'normal' boot. But the fact that it gets stuck and I need to cut power, which then basically triggers a 'normal' boot, does not really support this theory...
    My setup (please tell me if you need further info):
    - 3 monitors as 3 screens as one desktop (with Xinerama)
    - 2 nvidia cards, where screens 0 and 1 are on card 0 and screen 2 is on card 1
    - Ubuntu 10.04 Lucid Lynx (updated from 9.10, 9.04, ...)
    I would appreciate every idea on the subject; at the moment I really don't have any clue what to do...

    Read the article

  • DVD Drive Failing on Windows 7

    - by Seth Spearman
    Hello, I have x64 Windows 7 running on an ASUS M50VM. The DVD drive works completely unreliably, if at all. But the story is not that simple, so bear with me... here are the gory details.
    When I first got the machine it came with Windows XP, and I upgraded it to Windows Vista x64 and the DVD worked fine. When Windows 7 RC2 came out I tried it on a virtual machine and I liked it so much that I upgraded the machine to Win7 RC1. The DVD worked fine. Of course, RC1 was going to start spontaneously rebooting, so when Windows 7 was released I did a CLEAN INSTALL of Windows 7. Just to clarify: by clean install I mean I did a FORMAT of the HARD DRIVE and INSTALLED it from scratch. Ever since then the DVD mostly doesn't work. I can sometimes read from a disc, but that will often hang. (Please see my description below of HANG for details.) CD or DVD writes ALWAYS fail with a HANG (I have done a successful write only one time).
    Here is what I mean by HANG:
    * The Explorer window is unresponsive.
    * Any software accessing the DVD drive is unresponsive.
    * The DVD tray will not eject.
    * Using a paper clip will eject, but the disc is usually still spinning hard.
    * Attempting to shut down Windows will fail. I have waited as long as ten minutes, but the whole OS seems to hang. I do a hard shutdown.
    * Sometimes accessing the DVD (when it does not cause a HANG) will still fail, and the device will actually seem to disappear from the system until I reboot.
    A couple of other things: it is NOT a hardware failure. It is the Windows OS. I know this because I swapped my DVD drive with a friend's drive of the same model... his machine is fine (he is still running Vista x64) and my machine still fails. For what it is worth, I swapped out my primary disk for an Intel 160GB SSD.
    EDIT Here is what System Information shows about my DVD drive:
    Drive D:
    Description: CD-ROM Drive
    Media Loaded: No
    Media Type: DVD Writer
    Name: HL-DT-ST DVDRAM GSA-T50N ATA Device
    Manufacturer: (Standard CD-ROM drives)
    Status: OK
    Transfer Rate: -1.00 kbytes/sec
    SCSI Target ID: 0
    PNP Device ID: IDE\CDROMHL-DT-ST_DVDRAM_GSA-T50N________________RR04____\5&2B5B7F1D&0&1.0.0
    Driver: c:\windows\system32\drivers\cdrom.sys (6.1.7600.16385, 144.00 KB (147,456 bytes), 7/13/2009 7:19 PM)
    Any ideas? HELP! Seth B Spearman

    Read the article

  • Vagrant - Failed login, ssh not set up

    - by motleydev
    This question is twofold, because somewhere in my attempts to solve the problem I created a new one.
    First: I was trying to vagrant up using a Vagrantfile based on the standard hashicorp/precise32 box. Everything worked up until "default: SSH auth method: private key", where it would eventually time out. Enabling the gui option in the Vagrantfile showed that the machine never actually logs in. I can use the standard user/pass and log in from that point, but the vagrant up process still remains at that prior status. Here's where my understanding might be a little dim: I've tried setting the auth method from the insecure_pass_key to my root ~/.ssh/id_rsa or whichever one I wanted to use. I'm not entirely sure where to put a copy of the public key or my authorized keys file. I've got a .vagrant.d folder in my user dir (I'm on OS X) which seems to contain the box images. I've got a .vagrant folder in the directory with the Vagrantfile which seems to contain the specific machine I am building off of. I've tried poring over the docs and forums but I seem to be missing a key concept here.
    And now: after a host of tips/tricks such as rolling back my VirtualBox install and uninstalling/reinstalling Vagrant and VirtualBox several times, when I try to run vagrant ssh-config with a Vagrantfile based on the same hashicorp/precise32 it says that the box is not enabled for SSH. Specifically, this error: "The provider for this Vagrant-managed machine is reporting that it is not yet ready for SSH. Depending on your provider this can carry different meanings. Make sure your machine is created and running and try again. Additionally, check the output of `vagrant status` to verify that the machine is in the state that you expect. If you continue to get this error message, please view the documentation for the provider you're using."
    So now I am slightly up a creek. Any help would be appreciated, if not just clarifying a concept. Some pertinent info:
    - I'm on OS X Mavericks.
    - Because I'm running a dual-HD system with system files on one HD and user files on another, my permissions are a little wonky and VBoxManage will only let me run commands via sudo - not sure if it's pertinent, but maybe.
    - I have no idea what I'm doing. That part is perhaps more important.

    Read the article

  • Samba share not accessible from Win 7 - tried advice on superuser

    - by Roy Grubb
    I have an old Red Hat Linux box that I use, amongst other things, to run Samba. My Vista and remaining Win XP PCs can access the p/w-protected Samba shares. I just set up a new Windows 7 64-bit Pro PC. Attempts to access the Samba shares by clicking on the Linux box's icon in 'Network' from this machine gave a "Logon failure: unknown user name or bad password." message when I gave the correct credentials. So I followed the suggestions in "Windows 7, connecting to Samba shares" (also checked here but found LmCompatibilityLevel was already 1). This got me a little further. If I click on the Linux box's icon in 'Network' from this machine I now see icons for the shared directories. But when I click on one of these, I get "\\LX\share is not accessible. You might not have permission..." etc. I tried making the Win 7 password the same as my Samba p/w (the user name was already the same). Same result. The Linux box does part of what I need for ecommerce - the in-house part; it's not accessible to the Internet. As my Linux fu is weak, I have to avoid changes to the Linux box, so I'm hoping someone can tell me what to do to Win 7 to make it behave like XP and Vista when accessing this share. Help please!? Thanks
    Update: Thanks for replying @Randolph. I had set 'Network security: LAN Manager authentication level' to "Send LM & NTLM - use NTLMv2 session security if negotiated" based on the advice in "Windows 7, connecting to Samba shares" and had restarted the machine, but that didn't work for me. I'll try playing with other Network security values. I have now tried the following:
    - Network security: Allow Local System to use computer identity for NTLM: changed from Not Defined to "Enabled". Restarted machine. Still says "\\LX\share is not accessible. You might not have permission..." etc.
    - Network security: Restrict NTLM: Add remote server exceptions for NTLM Authentication (added LX). Restarted machine. Still says "\\LX\share is not accessible. You might not have permission..." etc.
    I can't see any other Network security settings that might affect this. Any other ideas please? Thanks Roy

    Read the article

  • Why is the server performance so poor? What can be done to improve the speed of the server?

    - by fslsyed
    Very slow processing using Windows Server 2008 R2 Standard with Service Pack 1.
    Situation: the application reads a text file and uses the data to populate a series of MS SQL tables. The converted data is used to generate monthly PDF invoice files; the PDF files are saved directly to the hard drive. The application is multi-threaded, with one thread used for the text conversion and three threads for PDF invoice generation. The text conversion occurs concurrently with the invoice generation.
    Application software: C# using Microsoft Visual Studio 2010 Ultimate; Crystal Report Writer 2011 with runtime 13_0_3, 64-bit version (targeted platform is x64; also tested as x86 and Any CPU with similar results); Microsoft .NET Framework 4.0; Microsoft SQL 2008.
    Issue: the software is running very slowly. The conversion of the text file is approximately six hundred fifty records per second and generation of the PDF files is approximately twelve invoices per minute. The text file to be converted is 600 MB, with seven thousand invoices to be generated. The software was installed on three different machines from the same distribution files. The same text file was converted on each machine. The user executing the application was an administrator on each machine. The only variances were the machine and operating system. The configurations are as follows:
    Server: Windows Server 2008 R2 Standard 64-bit (6.1, Build 7601) SP1; IBM System x3550 M3 [7944AC1]; default system BIOS; Intel Xeon CPU E5620 @ 2.4GHz (16 CPUs); 16384MB memory.
    Notebook: Windows 7 Home Premium 64-bit (6.1, Build 7601); Hewlett-Packard HP Pavilion dv7 Notebook PC; default system BIOS; AMD Phenom II N640 Dual-Core Processor 2.9GHz (2 CPUs); 6144MB memory.
    Desktop: Windows 7 Professional 64-bit (6.1, Build 7601) SP1; Dell Inc. OptiPlex 960; Phoenix ROM BIOS PLUS Version 1.10 A11; Intel Core 2 Quad CPU Q9650 @ 3.00GHz (4 CPUs); 16384MB memory.
    Processing results per machine (the application was executed seven times; averages shown):
    - Server (1): 650 text records converted per minute, 12 invoices generated per minute
    - Notebook: 980 text records converted per minute, 17 invoices generated per minute
    - Desktop: 2,100 text records converted per minute, 45 invoices generated per minute
    (1) The server is dedicated to execution of this application; no additional applications are being executed.
    Question: why is the server performance so poor? What can be done to improve the speed of the server?
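    Since the PDF invoices are written straight to disk and the dedicated server is the slowest of the three machines, one cheap check is to compare raw sequential disk write throughput across them before profiling the application itself. A rough sketch; the test path and size are hypothetical placeholders:

        import os
        import time

        # Hypothetical placeholders -- point this at the same volume the PDFs are written to.
        TEST_FILE = r"C:\temp\io_test.bin"
        CHUNK = b"\0" * (1024 * 1024)   # 1 MB per write
        TOTAL_MB = 512

        def sequential_write_mb_per_s():
            """Write TOTAL_MB of data, forcing it to disk, and return MB/s."""
            start = time.perf_counter()
            with open(TEST_FILE, "wb") as f:
                for _ in range(TOTAL_MB):
                    f.write(CHUNK)
                f.flush()
                os.fsync(f.fileno())        # make sure the data actually reaches the disk
            elapsed = time.perf_counter() - start
            os.remove(TEST_FILE)
            return TOTAL_MB / elapsed

        if __name__ == "__main__":
            print(f"Sequential write: {sequential_write_mb_per_s():.1f} MB/s")

    If the server's number comes out far below the desktop's, the storage path (for example the RAID controller's write-cache setting) is a more likely culprit than the C# code.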

    Read the article

  • Recommendations or advice for shared computer control

    - by Telemachus
    Basic scenario: we are a school (overwhelmingly Mac, some Windows machines via BootCamp), and we are considering using DeepFreeze to guard the state of our shared machines. We have roughly 250 machines that are either shared laptops (which move around quite a bit) or common desktops in public spaces. Obviously, we spend a lot of time maintaining the machines and trying to reverse the inevitable drift as people make changes to the computers. We would like to control the integrity of the build we initially put onto the machines without handcuffing users and especially without using Mac's Parental Control software. (We've had nothing but bad experiences with it.) We've been testing DeepFreeze, and so far it's very impressive. But I'm curious to hear if people who have used DeepFreeze or any similar software have any advice or tips. To get things started, I will post my own pros and cons. Pros: The state of the machine is frozen in our chosen state. All changes made to the machine after that disappear upon restart. (This frozen state really appears to cover everything. I have yet to do something to a test machine that isn't instantly healed.) Tons of trivial but time-consuming maintenance is gone in an instant. Also, lots of not-so-trivial breakage should be avoided. There are good options, however, that allow you to create storage spaces either globally or per user. (Otherwise, stored files disappear upon reboot. For some machines, this is a good option itself. Simply warn people: save externally or else; this machine is a kiosk, not your storage space.) Cons: Anytime we actually need to make a change (upgrade basic software, add a printer or an airport permanently, add new software), the process is a bit more complex. Reboot into a special mode (thaw state), make changes, reboot back into frozen mode. If (when?) we forget this, we will end up making changes that disappear after the next reboot. Users will forget to save files correctly (in the right place or externally), and we will have loud, unpleasant conversations explaining that we can't recover the document they worked on all afternoon yesterday. The machine rebooted. The file is gone. These are my initial thoughts, but I would love to hear from other people who have experience with DeepFreeze or any similar software. What should we be careful about? Do the pros outweigh the cons? What gains or problems am I not seeing? Thanks.

    Read the article

  • 2nd Year College - Learning - Microsoft Server Products

    - by Ryan
    As the title says, I just finished my first year of college (majoring in Software Engineering). Fortunately my school likes Microsoft enough that I can get pretty much anything Microsoft sells, and I can get IBM WebSphere and the like for free as well. Earlier this year, I set up an oldish computer (2.6 GHz Pentium D, x64) to run Ubuntu Server headless. I'm predominantly a Java developer, so Apache, Maven, Nexus, Sonar, SVN, etc. made it onto the machine. It worked really well for personal and school projects, especially team projects (quick ramp-up). Anyway, I started to pick up C# to complement my Java knowledge (don't judge me :P), and am interested in working with some of the associated Microsoft equivalents. The machine currently has the Ubuntu install, as well as Windows 7 Ultimate. I do all of my actual development work off my laptop, also running Windows 7 Ultimate. I was wondering what software you would recommend putting on the machine. I'm not actually serving anything off the machine itself, but in Ubuntu I had it doing integration tests with Hudson on every commit, profiling my applications, etc. The machine would be running headless and I would remote into it.
    Here is what I am currently leaning towards / wondering about:
    - Windows 7 Ultimate vs Windows Server 2008 (R2) (no one is really clear why I should go with one over the other)
    - Windows Team Foundation
    - SharePoint (never used it before, kind of meh about it)
    - IBM WebSphere or GlassFish (some Java EE web server)
    - SQL Server 2008
    - A DVCS
    In order to better control product conflicts and limit resource use, I'm wondering if I should install things into virtual machines (I can get VMware or Microsoft virtualization products). I also plan on installing everything I had running under Linux (it's almost entirely Java-based development software, so it'll run on both; the only reason I went with Ubuntu during the year was because the Apache build seemed better). I'm primarily looking to become familiar with enterprise software development tools, as well as get something functional that will help my development process. (i.e., I'll still use Project and assign tasks even though I might be the only one to assign tasks to, just to practice doing so.) Is there any other software / configuration details I should explore? Opinions on my current list? I primarily use C#, Java, and PHP. I'm familiar with Ruby and Python as well. Thanks!

    Read the article

  • Routing with VPN and asymmetric communication

    - by Louis
    I'm stumbling on a problem that requires your advice. Keywords: networking, route, OpenVPN.
    Problem: I have a local network with several physical servers and VMs. These machines have IPs in the range 10.10.x.x. I can access these machines from the Internet with the help of OpenVPN. These machines can:
    - access each other within the local 10.10.x.x subnet
    - access the Internet via the VPN
    - themselves be accessed (via SSH) from the Internet via the VPN
    There is one machine, however, that behaves strangely and I don't know why. I can SSH into this machine from anywhere and I can also ping it from anywhere (including the Internet). However, from this machine (i.e. when logged into it) I cannot access the Internet or ping machines outside the local network. In other words, it will not go beyond the VPN. My question is why? Here are some technical details.
    The machine's network config (running Debian 6.0.3):
    allow-hotplug eth0
    iface eth0 inet static
        address 10.10.10.200
        netmask 255.255.0.0
        network 10.10.10.0
        broadcast 10.10.10.255
        gateway 10.10.10.200
    The machine's routing:
    Destination   Gateway        Genmask          Flags  MSS  Window  irtt  Iface
    127.0.0.1     0.0.0.0        255.255.255.255  UH     0    0       0     lo
    10.10.0.0     10.10.10.250   255.255.0.0      UG     0    0       0     eth0
    10.10.0.0     0.0.0.0        255.255.0.0      U      0    0       0     eth0
    0.0.0.0       10.10.10.250   0.0.0.0          UG     0    0       0     eth0
    0.0.0.0       10.10.10.200   0.0.0.0          UG     0    0       0     eth0
    The VPN server's network config (running Debian 6.0.3):
    # This is the local network interface
    auto eth1
    allow-hotplug eth1
    iface eth1 inet static
        address 10.10.10.250
        netmask 255.255.0.0
        broadcast 10.10.10.255
        gateway 10.10.10.250
    The VPN server's routing table:
    Destination   Gateway        Genmask          Flags  MSS  Window  irtt  Iface
    10.10.0.0     0.0.0.0        255.255.255.0    U      0    0       0     tun0
    private       0.0.0.0        255.255.255.0    U      0    0       0     eth0
    10.10.0.0     0.0.0.0        255.255.0.0      U      0    0       0     eth1
    0.0.0.0       10.10.10.250   0.0.0.0          UG     0    0       0     eth1
    0.0.0.0       private        0.0.0.0          UG     0    0       0     eth0
    net.ipv4.ip_forward = 1 on both machines. There are no iptables rules set anywhere. Thanks in advance for any feedback.

    Read the article

  • Recovering from backup without original install media

    - by KGendron
    A machine from my old job had a complete hard drive failure. I have backups, but I'm running into severe problems restoring from them. The only install media was a secondary restore partition on the system's hard drive. I hate whoever came up with that idea more than I can possibly express with words. I spent several days trying to recover the disk - it is pretty well shot and none of my best tricks could even get it to show up in the BIOS. The machine that broke is an HP with XP Media Center Edition on it (I don't know why either). The backups were created using the default Windows backup tool - I have a .bkf file on an external hard drive that I am trying to restore from. I've replaced the hard drive. My home machine is running Windows 7 64-bit and I'm trying to use it as a platform to restore to the other disk.
    I downloaded the Windows NT Backup Restore Utility for Windows 7; however, no matter what I do it restores to my C drive rather than the specified drive. Fortunately Win7 security settings prevented it from being a complete disaster - but still not a happy thing. I tried firing up the XP virtual machine. I can browse to the backups, but it says they are invalid and refuses to let me view/continue with the restore. I tried installing XP to an extra hard drive on my machine; however, it bluescreens on me during the install process and I cry. I tried installing XP Pro to the new drive and attempted to restore over it; it of course blackscreened on me, as that was a stupid idea. I made two partitions on the new hard drive (apparently the BIOS on this accursed piece of junk doesn't allow HD partitions larger than 200G anyway, and it thus fails 40 minutes into the install with an ever-descriptive "Disk Read Error"). Guess how I spent last weekend?
    My last idea was to install XP Pro to the second partition and then use it to restore from backup to the first. After the first restart it gives me the error "Windows could not start because of a computer disk hardware configuration problem. Could not read from the selected boot disk. Check boot path and disk hardware". My brain made one of those bad hard drive clicky noises. I've tried several boot disks but they don't seem to work. If anyone has a link to a good one it would be greatly appreciated. Anyone have any more ideas? I really hate asking on what seems like such a simple issue but I am quite literally at my wit's end. Thanks - and sorry for the really long post.

    Read the article

  • How to troubleshoot port forwarding on Windows 7 (64 Bit) with ICS enabled?

    - by LearnCocos2D
    I want to forward some ports (1666 for Perforce, 8081 for Hudson) on my Internet gateway machine. This machine is running Windows 7 (64-bit, legal, user account) and is connected to the Internet via cable modem (it's not a router). The Windows machine is sharing its Internet connection via ICS, and that works fine on all connected computers. I can access the services via the gateway's public IP (95.x.x.x) on the given ports if they are running on the gateway machine itself.
    I've added the ports and destination IP address (192.168.0.18) in the Internet network adapter's Advanced Settings dialog (Sharing tab). That's the same dialog where you have a list of preconfigured services like HTTP, FTP and other incoming services. When I do that, I can't connect to the services anymore. For some reason port forwarding isn't working. I have uninstalled Bitdefender because I wanted to check whether its firewall interferes. I've also disabled the Windows Firewall and Defender, to no avail. I tried a freeware tool that helps to set up port forwarding, but that didn't work either.
    The target machine is a Mac OS X computer whose firewall is disabled. Its IP is static. I can successfully connect to the services using the local IP address (192.168.0.18) from two different machines, including the gateway computer. So internally and externally it seems to me that the ports are open and not blocked, and the issue is with port forwarding itself. From what I understand, it should be enough to add an entry to the Advanced Settings dialog to enable port forwarding when there are no firewalls interfering.
    How can I troubleshoot why port forwarding isn't working for me? What steps should I follow to alleviate the issue? PS: I gladly accept command line solutions.
    Other things I've tried:
    - adding an Inbound Rule to Windows Firewall for the 1666 and 8081 ports
    - trying with Windows Firewall enabled and disabled
    - disabling/enabling the network adapter
    - double-checking that the IP addresses are correct
    - mapping a different incoming port to the service's actual port
    - following or checking the misc tips in this article
    What I haven't dared trying yet (let me know if it's worth a shot):
    - disable/enable ICS
    - remove all network adapters (via Control Panel), then re-install and re-configure them
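    One way to narrow this down is to probe the ports from a machine outside the network and from inside the LAN, so "service not listening" can be told apart from "forwarding not working". A minimal sketch; the public IP below is a hypothetical placeholder for the 95.x.x.x address, while the internal IP is the one from the question:

        import socket

        PUBLIC_IP = "203.0.113.10"     # hypothetical placeholder for the gateway's public 95.x.x.x address
        INTERNAL_IP = "192.168.0.18"   # the Mac that should receive the forwarded traffic
        PORTS = [1666, 8081]           # Perforce and Hudson

        def probe(host, port, timeout=5):
            """Return True if a TCP connection to host:port succeeds within the timeout."""
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    return True
            except OSError:
                return False

        if __name__ == "__main__":
            for host in (PUBLIC_IP, INTERNAL_IP):
                for port in PORTS:
                    state = "open" if probe(host, port) else "closed/filtered"
                    print(f"{host}:{port} -> {state}")

    Run it from inside the LAN against the internal IP (which should succeed, as it already does) and from an outside host against the public IP; if only the latter fails, the problem is confined to the ICS forwarding rule rather than the Mac or its services.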

    Read the article

  • FTP through HAProxy

    - by Menda
    I have a machine which is the host and has HAProxy installed and working. Then I have a guest KVM virtual machine running inside the host with the IP 192.168.122.152. I installed an FTP server in the guest machine with vsftpd. From the host, the command $ ftp -p 192.168.122.152 works perfectly and I can connect to the guest FTP. I should note that this FTP server is configured as passive, but both passive and active connections work from the host. This is an extract of /etc/vsftpd.conf in the guest:
    # Passive mode
    connect_from_port_20=NO
    tcp_wrappers=YES
    listen_address=192.168.122.152
    pasv_enable=YES
    pasv_promiscuous=NO
    port_enable=YES
    port_promiscuous=NO
    pasv_max_port=10000
    pasv_min_port=10250
    Now it's time to make it accessible from outside, so I configured /etc/haproxy/haproxy.cfg like this:
    listen FTP_Default *:21
        server ftp01 192.168.122.152 check port 21 inter 10s rise 1 fall 2
    listen FTP_Range *:10000-10250
        server ftp01 192.168.122.152 check port 21 inter 10s rise 1 fall 2
    But if I try to connect from another machine on the Internet with $ ftp -p $PUBLICIP, it only responds Connected to <PUBLICIP>, and it never asks for the login and password. Something in the HAProxy config must be wrong, because it's the only point where it fails. By the way, I tried to adapt my configuration to the one in this blog. Thanks.
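    If it helps, a small passive-mode probe with Python's ftplib (the public IP and credentials below are hypothetical placeholders) makes it easier to see which part stalls: the control connection on port 21 or the data connection into the 10000-10250 passive range:

        from ftplib import FTP

        # Hypothetical placeholders -- use your public IP and a real FTP account.
        HOST = "203.0.113.10"
        USER = "ftpuser"
        PASSWORD = "secret"

        def passive_listing():
            """Log in over the control connection, then force a passive data transfer."""
            ftp = FTP()
            print(ftp.connect(HOST, 21, timeout=15))   # control connection through HAProxy port 21
            print(ftp.login(USER, PASSWORD))           # stalls here if the banner or login never arrives
            ftp.set_pasv(True)
            lines = []
            ftp.retrlines("LIST", lines.append)        # opens a data connection to a passive port
            ftp.quit()
            return lines

        if __name__ == "__main__":
            for line in passive_listing():
                print(line)

    If connect() never returns a banner, the issue is on port 21 itself; if login works but LIST stalls, the passive data ports are what is not being passed through, and the passive range advertised by vsftpd has to match the range HAProxy listens on.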

    Read the article

  • James - mail server configuration help needed

    - by Chaitanya
    Hi, I am trying to set up the James mail server on a Linux machine. The Linux machine has a public static IP address assigned. I installed James and, in config.xml, added the servername as mydomain.com. In the DNS for mydomain.com, I have created an A record, say mx.mydomain.com, which corresponds to the IP address of the above mail server machine. I then added mx.mydomain.com as the MX record for mydomain.com. In James, I have created a new user, test. Then from Gmail, I sent a mail to test@mydomain.com. The mail is never received and it didn't even bounce back. The Linux machine is behind a firewall with only ports 22, 80 and 8080 open for the external network. My question here is: do I need to open any other ports on the firewall so that the mail I send from Gmail arrives at James? If it's not a port problem, any views on solving this issue? I don't want to send mail from this server; it's only for receiving mail.
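    SMTP is delivered to port 25, which is not in the list of open ports above, so that is the first thing to check. A quick reachability probe from an outside machine with Python's smtplib, using the mx.mydomain.com name from the question as a stand-in, would confirm it:

        import smtplib

        MX_HOST = "mx.mydomain.com"   # the A record your MX record points to

        def check_smtp_reachable():
            """Try to open an SMTP session on port 25 and issue EHLO."""
            try:
                with smtplib.SMTP(MX_HOST, 25, timeout=10) as smtp:
                    code, banner = smtp.ehlo()
                    return code, banner.decode(errors="replace")
            except (OSError, smtplib.SMTPException) as exc:
                return None, f"connection failed: {exc!r}"

        if __name__ == "__main__":
            code, detail = check_smtp_reachable()
            print(code, detail)

    If this times out from the Internet but succeeds when run on the server itself, the firewall needs TCP 25 opened (and James must be listening on it) before any mail from Gmail can arrive.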

    Read the article

  • Sending microphone input over Remote Desktop 7.0

    - by Taylor Price
    I am using Remote Desktop 7 (the new version that came out with Windows 7) to control a Windows XP Pro machine. I have selected "Record from this computer" in the Remote Audio settings. When I connect to the machine, go to the Control Panel, open the Sound panel, and go to the Audio tab, I find that the default sound playback device is "Microsoft RDP Audio Driver". However, there is no default sound recording device. As expected, my IP phone thinks there is no recording device. If I am sitting in front of the computer with a mic plugged in, it works just fine. Has anybody else been able to get this to work appropriately? Is there anything that I have to set up on the XP machine to get this working? Thanks in advance.
    Edit: As John T pointed out below, you have to be connecting to a Windows 7 Enterprise or Ultimate machine for this to work. I've also found out that multi-monitor support has the same requirement.

    Read the article

  • Disable local delivery in Sendmail

    - by Luke P M
    I am using Sendmail on a CentOS server to send email for PHP scripts, but the problem is that mail is delivered to a local mailbox on the machine rather than to the host specified in the MX records for the domain, which actually point to another machine I use for email. I would like Sendmail not to attempt local delivery for the domain the machine is set up for. Is there a simple way to disable local delivery? The domain is not in the local-host-names file. I've already done lots of googling and I have looked at:
    http://serverfault.com/questions/26934/sendmail-configuration-to-not-deliver-mail-to-local-machine
    http://serverfault.com/questions/65365/disable-local-delivery-in-sendmail
    but either there is no answer or it is not suitable. I don't want to relay to another server; I just want it to send mail regardless of domain. To provide an example: I have two servers, the mail server at mail.example.com and a web server at example.com. When I use the SMTP service on the web server it currently routes mail to a local mailbox on example.com, but it should be going to mailboxes on mail.example.com. The output of sendmail -bt is:
    ADDRESS TEST MODE (ruleset 3 NOT automatically invoked)
    Enter <ruleset> <address>
    > 3,0 info@example.com
    canonify           input: info @ example . com
    Canonify2          input: info
    Canonify2        returns: info
    canonify         returns: info
    parse              input: info
    Parse0             input: info
    Parse0           returns: info
    ParseLocal         input: info
    ParseLocal       returns: info
    Parse1             input: info
    Parse1           returns: $# local $: info
    parse            returns: $# local $: info
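    As a sanity check of where the mail should be going, the MX lookup that Sendmail is supposed to follow can be reproduced with the third-party dnspython package; the domain below mirrors the example.com placeholder from the question:

        import dns.resolver  # third-party: pip install dnspython (2.x API shown)

        DOMAIN = "example.com"   # placeholder domain from the question

        def mail_exchangers(domain):
            """Return (preference, host) pairs for the domain's MX records, lowest preference first."""
            answers = dns.resolver.resolve(domain, "MX")
            return sorted((r.preference, str(r.exchange).rstrip(".")) for r in answers)

        if __name__ == "__main__":
            for preference, host in mail_exchangers(DOMAIN):
                print(preference, host)

    If mail.example.com shows up here but messages still land in a local mailbox, Sendmail is treating example.com as a local domain anyway; besides local-host-names, the machine's own hostname is added to that local class automatically, which is a common reason the $# local resolution above appears.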

    Read the article

  • Appcrash and possible malware

    - by Chris Lively
    First off, I'm running MS Intune Endpoint Protection. It is completely up to date. On 10/25 @ 11:53PM I came across a site that caused Intune to freak out: Microsoft Antimalware has detected malware or other potentially unwanted software. For more information please see the following: http://go.microsoft.com/fwlink/?linkid=37020&name=Trojan:Win64/Sirefef.B&threatid=2147646729 Name: Trojan:Win64/Sirefef.B ID: 2147646729 Severity: Severe Category: Trojan Path: file:_C:\Windows\System32\consrv.dll Detection Origin: Local machine Detection Type: Concrete Detection Source: Real-Time Protection User: NT AUTHORITY\SYSTEM Process Name: C:\Windows\explorer.exe Signature Version: AV: 1.115.526.0, AS: 1.115.526.0, NIS: 10.7.0.0 Engine Version: AM: 1.1.7801.0, NIS: 2.0.7707.0 I, of course, elected to simply delete the file. Since then my machine has been randomly giving an error about "Host Process for Windows Services" stopped working. There are generally two different pieces of info: Description Faulting Application Path: C:\Windows\System32\svchost.exe Problem signature Problem Event Name: BEX64 Application Name: svchost.exe Application Version: 6.1.7600.16385 Application Timestamp: 4a5bc3c1 Fault Module Name: StackHash_52d4 Fault Module Version: 0.0.0.0 Fault Module Timestamp: 00000000 Exception Offset: 000062bdabe00000 Exception Code: c0000005 Exception Data: 0000000000000008 OS Version: 6.1.7601.2.1.0.256.27 Locale ID: 1033 Additional Information 1: 52d4 Additional Information 2: 52d47b8b925663f9d6437d7892cdf21b Additional Information 3: ed24 Additional Information 4: ed24528f3b69e8539b5c5c2158896d3e and Description Faulting Application Path: C:\Windows\System32\svchost.exe Problem signature Problem Event Name: APPCRASH Application Name: svchost.exe Application Version: 6.1.7600.16385 Application Timestamp: 4a5bc3c1 Fault Module Name: mshtml.dll Fault Module Version: 9.0.8112.16437 Fault Module Timestamp: 4e5f1784 Exception Code: c0000005 Exception Offset: 00000000002ed3c2 OS Version: 6.1.7601.2.1.0.256.27 Locale ID: 1033 Additional Information 1: 3e9e Additional Information 2: 3e9e8b83f6a5f2a25451516023078a83 Additional Information 3: 432a Additional Information 4: 432a0284c502cce3bbb92a3bd555fe65 Intune claims the machine is clean. I've also tried some of the online scanners like trendmicro, all of which claimed the system is clean. Finally, I tried the "sfc /scannow" and it said all was good. I left my machine on after I left last night and there were about 50 of those messages. Ideas on how to proceed?

    Read the article
