  • CGI error from PHP when running exec() on IIS

    - by Patrick
    Windows Server 2003 x64, PHP 5.2, IIS 6.0. The program Ink2Png.exe is set with Everyone Read & Execute permissions, as is its dependency (microsoft.ink.dll). PHP safe mode is off. exec() is passed [the full exe path], a space, then [full path to another file]; that second file also has full read permissions, and the output directory has full write permissions. As soon as exec() is hit, the connection dies (the browser does not even receive a full set of HTTP headers) and IIS reports a CGI error. Examining the output, it appears the program was never run. Any ideas? How can I figure out what exactly is happening and get it running again? EDIT: Also, it is a .NET application, if that is significant in any way.
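
    One way to narrow this down (a sketch; the paths and filenames below are hypothetical) is to run the same command from a console on the server, capturing the program's output and exit code directly and taking IIS out of the picture:

        REM Run in cmd.exe on the server; substitute the real paths
        "C:\tools\Ink2Png.exe" "C:\data\input.isf" > "C:\data\ink2png.log" 2>&1
        echo Exit code: %ERRORLEVEL%

    If the program runs from the console but not from PHP, the difference is usually the identity and environment of the IIS worker process rather than the command itself.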

  • Netgear genie says my Internet is off, while Windows claims I am connected

    - by Manu
    I'm connected via Wi-Fi to my ISP's router/modem. While Windows says I'm always connected, I keep getting messages from Netgear Genie that I've lost contact with the Internet, and I cannot access web pages until it comes back. There are two other computers in the house, one connected to the router via Ethernet, the other via Wi-Fi; neither seems to have this problem. I've wondered whether Netgear Genie itself was the problem, but I am regularly disconnected even if I uninstall it, and I'd rather keep it since it accurately tells me whether I'm connected or not. Why does Windows say I'm online if I can't access any online game or website? Is Netgear Genie the problem? I've removed the connection and recreated it; I've even copied the settings to a USB key from the computer whose Wi-Fi works.
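
    When a drop happens, a continuous ping from the affected laptop (a minimal check; the targets below are just examples) helps show whether connectivity or name resolution is what actually fails:

        ping -t 8.8.8.8
        nslookup superuser.com

    If the ping keeps answering while pages fail to load, the problem is more likely DNS than the Wi-Fi link itself.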

  • Database backup regardless of backup made through a control panel?

    - by developer
    I know that all CMS or CMC platforms have some sort of walkthrough for their users to fully back up and restore the database or the whole website. While we can all perform such backups (there are even plugins which automate the whole procedure) and restore them when necessary (such as when we migrate to a new server), one can also back up the whole website by means of a control panel such as DirectAdmin or cPanel. Now, I just want to know whether it is necessary for us to do database or website backups the way a specific CMC or CMS developer describes, even after we perform a whole-website backup in a control panel such as DA. An example of such a CMS platform is Moodle; the Moodle Docs describe how to back up and restore Moodle. So do we, out of necessity, have to make the backup the way described, or can we simply do it the control-panel way? Thanks
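
    Independently of either method, a plain scheduled dump of the database costs little and restores anywhere (a sketch; the database name and user are placeholders):

        mysqldump --single-transaction -u moodleuser -p moodle > moodle-$(date +%F).sql

    The control-panel backup covers files and database together, but a standalone dump like this is the form the CMS-specific restore instructions usually expect.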

  • Unable to edit/delete/move /etc/my.cnf - Permission denied

    - by FlourishDNA
    I am trying to edit /etc/my.cnf as the root user via SSH, and I get an error while trying to save it. I am making changes to my.cnf because I want to tweak some values to meet Magento's requirements, such as raising key_buffer_size to 128M. I assigned the value 128M to key_buffer_size, tried to save, and got: "Error writing /etc/my.cnf: Permission denied". I can't even restart MySQL successfully:

        [root@flourish ~]# service mysqld restart
        Stopping mysqld:                     [  OK  ]
        MySQL Daemon failed to start.
        Starting mysqld:                     [FAILED]

    Nor can I delete the file or replace it with a fresh one. I tried uninstalling MySQL and re-installing, but nothing worked. Permissions are -rw-r--r-- and owner/group is root/root. I hope there is some answer to this problem.
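
    A file that root cannot write, delete, or replace despite ordinary rw permissions is the classic symptom of the ext immutable attribute, which some hosts set on my.cnf. Checking for it is harmless (a sketch):

        lsattr /etc/my.cnf
        # an 'i' in the flag column means immutable; clear it with:
        chattr -i /etc/my.cnf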

  • Maximum memory allocation for 32bit linux kernel

    - by LedZeppelin
    I was reading this article, which discusses how the maximum amount of RAM dedicated to kernel usage in 32-bit Windows is 2 GB even when the total amount of RAM is 4 GB: http://www.brianmadden.com/blogs/brianmadden/archive/2004/02/19/the-4gb-windows-memory-limit-what-does-it-really-mean.aspx Is this the same for 32-bit Linux environments such as 32-bit Ubuntu 10.04? That is, is the maximum kernel allocation 2 GB of RAM even if the total main memory is 4 GB? If you increase the total amount of memory to 64 GB of RAM by recompiling the kernel with the PAE option enabled, what is the maximum amount of RAM you can dedicate to kernel usage? Is it still 2 GB, or can you increase it?

  • Triple (3) Monitors under Linux

    - by widgisoft
    I have a 3-monitor setup (each 1680x1050) via an Nvidia NVS440 (2 GPUs, 2 outputs per GPU, totalling 4 outputs); this works fine under Windows XP and 7 but caused considerable headaches under Linux (Ubuntu 9.04). I had previously used an XFX 9600GT plus the onboard XFX 9300GS to produce the same result, but that card was noisy and power-hungry, and I was hoping there was some magical switch in the NVS440 that got rid of this annoying problem. It turns out the NVS440 is just two cards on one physical PCB :-p (I searched the net high and low for people using this card under Linux but found nothing; if anything the card uses less power and is fanless, so I stood to benefit from it either way.) Anyway, with either setup there were five solutions available:
      1. Three separate X instances, all unjoined
      2. Three separate X instances, joined by Xinerama
      3. Two separate X instances, one using TwinView, joined by Xinerama
      4. Two separate X instances, one using TwinView, without Xinerama
      5. A single TwinView setup, leaving the 3rd screen unplugged :-p
    The 4th option, two separate X instances with TwinView but no Xinerama, was the best balance of performance and usability but caused a few really annoying issues:
      - You couldn't control (without altering the shortcuts) which screen an application opened onto, and once it was open you couldn't move it to another screen without opening a terminal and forcing it to move.
      - Nvidia's overriding or falsifying of Xinerama breaks things: the two screens joined by TwinView behave like a single huge screen, so popups open in the middle of both screens and maximising a window stretches it across the width of the first two screens.
      - Firefox can only run one instance per user, so having multiple Firefox windows requires at least two users.
    The second option "feels" like the right one, but OpenGL is essentially disabled: playing any sort of game, or even running anything graphical, causes a huge performance drop and instability; even trying to run a basic GBA or Genesis emulator makes the system fall over. It works just well enough to stare at your desktop and do nothing, but as soon as you start doing some work (opening windows, dragging things around, running multiple copies of Firefox) it just feels slow. The last option, going dual-screen only, works perfectly, with full GPU acceleration and two logical screen spaces. Perfect; just make it work across GPUs like Windows does! :-p Anyway, I know RandR was supposed to pick up the slack when it introduced GPU objects of sorts, allowing multiple GPUs to be stitched together into one huge desktop at a much deeper layer than Xinerama. I was wondering whether this has now been fixed (I noticed X server 1.7 is out) and whether anyone has got it running successfully? A Xinerama layout sketch follows below for reference.
    Again, my requirements are: one huge desktop to drag any window across; maximising of windows to each screen (as XP does); running fullscreen apps on the primary screen with the mouse blocked from moving onto the others, or stretched across all three. Finally, as a side note: I am aware of the Matrox triple (and dual) head splitter, but even the price they go for on eBay is more than I can afford at the moment. My argument: I shouldn't have to buy extra hardware to get something working on Linux when it has existed in the Windows world for a long time (can you tell I don't get on with X? :-p). If I had the cash I'd have bought the latest version of that box already (the new version finally supports large resolutions, as the displays I have are 1680x1050 each).
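
    For reference, the "three X screens joined by Xinerama" layout (option 2) looks roughly like this in xorg.conf (a sketch; the Screen identifiers are assumptions and must match your own Screen sections):

        Section "ServerLayout"
            Identifier "Triple"
            Screen     0 "Screen0" 0 0
            Screen     1 "Screen1" RightOf "Screen0"
            Screen     2 "Screen2" RightOf "Screen1"
            Option     "Xinerama" "on"
        EndSection

    As the question notes, enabling Xinerama this way has historically disabled direct rendering on the NVIDIA drivers of that era, which is exactly the performance problem described above.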

  • CentOS - Yum doesn't update anymore?

    - by Xanathos
    I've been trying to use yum, but for some reason not even search works anymore. I even tried putting packages I had already downloaded in the search criteria, and it's the same:

        [root@AMDFX03 Downloads]# yum search glibc
        Loaded plugins: fastestmirror, refresh-packagekit, security
        Loading mirror speeds from cached hostfile
        epel/metalink                    |  22 kB  00:00
         * base: centos.secrel.com.br
         * epel: archive.linux.duke.edu
         * extras: centos.secrel.com.br
         * rpmforge: apt.sw.be
         * updates: centos.secrel.com.br
        adobe-linux-x86_64/primary       | 1.2 kB  00:00
        http://linuxdownload.adobe.com/linux/x86_64/repodata/primary.xml.gz: [Errno -1] Metadata file does not match checksum
        Trying other mirror.
        Error: failure: repodata/primary.xml.gz from adobe-linux-x86_64: [Errno 256] No more mirrors to try.

    This error always appears no matter what I do. Please, can you tell me how to fix this, or at least how to reset yum's configuration?
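
    A checksum mismatch like this usually means stale cached metadata, either locally or on the repository's side. Flushing yum's caches and retrying with the failing repo disabled (a sketch) narrows it down:

        yum clean metadata
        yum clean all
        yum --disablerepo=adobe-linux-x86_64 search glibc

    If the search works with the Adobe repo disabled, the local install is fine and that repository's metadata is the problem.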

  • Unrecognized file format .mdb in Microsoft Access -- repair doesn't fix it

    - by user1282159
    What I have is a file from a staff computer that I believe is an Access file because it is named with an .mdb extension; however, it does not open! I even tried to follow the repair steps (create a new file and use the "Compact and Repair" tool), and all I keep getting is "unrecognized file format *.mdb" (replace the * with the filename). I am not even sure it is an Access file. I have tried using Office 2007 and Office 2010, but neither works. Is there a way to fix this that is not on the Microsoft website? Or to determine whether this is actually an Access file and not some other file with the extension renamed? Any help would be appreciated, thanks.
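
    To check whether the file is really a Jet/Access database, you can look at its first bytes; genuine .mdb files carry the text "Standard Jet DB" near the start (newer .accdb files carry "Standard ACE DB"). A PowerShell one-liner sketch (the filename is a placeholder):

        [System.Text.Encoding]::ASCII.GetString((Get-Content .\mystery.mdb -Encoding Byte -TotalCount 32))

    If neither signature appears, the file is not an Access database, and no amount of Compact and Repair will open it.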

  • Can you set up a gaming LAN using OpenVPN installed in a VMware guest OS and be playing the game on the host OS?

    - by Coder
    I would like to set up a gaming VPN, i.e. I have some games that work over LAN and would like to play them with people who are not on my LAN. I know I can do this with OpenVPN. My ultimate goal would be to run OpenVPN portably on my host OS and not need any virtualization at all. As such I don't want to install it on my host, but I'm fine with running it portably. I'm even fine with temporarily adding registry keys and then running a .reg file to remove those entries once I'm done. To this end I installed OpenVPN on a virtual machine and diffed the registry. I then manually (using a .reg file) added all the keys that seemed important to my host OS and copied the OpenVPN installation folder onto my host machine. When I try to run OpenVPN GUI 1.0.3 as a test, it says "Error opening registy for reading (HKLM\SOFTWARE\OpenVPN). OpenVPN is probably not installed". I verified that that key is indeed in the registry with all its subkeys, and it looks correct. I have tried running the GUI as an administrator and in compatibility mode with no success. I am running Windows 7. If this fails, I would be happy with installing OpenVPN on a virtual machine in VMware, the key point being that I will be running the game installed on my host machine. The first question for this option is whether it is even possible. The second is that I can't get the VM to have internet access if I use bridging, but I can if I use NAT. Summary of questions:
      - Is it possible to run OpenVPN portably, and if so, what did I miss above?
      - If it's not possible to run it portably, can I set up a gaming LAN by installing OpenVPN in a guest OS with NAT, and how?
      - If that is not possible, can I install OpenVPN in a guest using bridging, and if so, how can I set this up with a Windows 7 host and a Windows XP guest? Currently the guest cannot access the internet in bridging mode, though it works in NAT mode.
      - In general, is there any good documentation on setting up a gaming LAN with OpenVPN (I am using 2.1.4)? I have never set up a VPN of any sort before, so any help would be much appreciated. Thanks!
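
    One registry detail worth ruling out (a hedged guess; it only applies if the Windows 7 host is 64-bit and the GUI is a 32-bit binary): on 64-bit Windows, a 32-bit program opening HKLM\SOFTWARE is redirected to the WOW64 view, so keys imported under HKLM\SOFTWARE\OpenVPN would need to exist under HKLM\SOFTWARE\Wow6432Node\OpenVPN to be visible to it. You can compare both views from a command prompt:

        reg query "HKLM\SOFTWARE\OpenVPN" /reg:64
        reg query "HKLM\SOFTWARE\Wow6432Node\OpenVPN"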

  • DU Meter log file

    - by Jack
    Where can I find the DU Meter log file? I tried searching C:\ProgramData\Hagel Technologies\DU Meter, but the folder is empty. I also tried C:\Users\Username\AppData\Roaming, Local, and LocalLow, but none of them even has a DU Meter or Hagel Technologies folder. I even tried searching the temp folder, but still nothing. I have a NetMeter.csv log file that I want to try to drop in over the DU Meter log file, because I can't seem to find any other way to import data into DU Meter.
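
    A brute-force way to find wherever the program keeps its data (a sketch; run from a command prompt, and expect it to take a while on a large drive):

        cd /d C:\
        dir /s /b | findstr /i "dumeter hagel"

    Running this while DU Meter is open can also be paired with a tool like Process Monitor to watch which files the process actually touches.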

  • WSUS Showing Incorrect Version & Client Update Failure but they can check-in

    - by user132199
    One of the issues we are having is that clients will not download updates from our WSUS server. They check in as they are supposed to and find applicable updates, but they are unable to actually download and install them. The GPO is set correctly. We decided to install the patch KB2720211 to see if it would help alleviate the issue, but it did not. In fact, even stranger: if I check the version installed on WSUS, it reads 3.2.7600.226, but as far as I know it should read 3.2.7600.251. If I check Add/Remove Programs to see which Windows updates have been installed, it even lists KB2720211 for WSUS as installed, at version 3.2.7600.251. To install the update I followed the directions. Question: has anyone seen this issue, where the patch is installed yet the correct version is not shown? What can I try to get my clients to update?
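
    On an affected client, forcing a clean re-detection against the WSUS server is a cheap first test (a sketch; run from an elevated prompt):

        wuauclt /resetauthorization /detectnow

    The client's C:\Windows\WindowsUpdate.log then shows whether the download itself fails and against which URL, which helps separate a server-side version mismatch from a client-side download problem.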

  • Problem testing PHP mail() on localhost

    - by Samir Ghobril
    Hey guys, I recently installed msmtp on Linux, and I even sent a mail from the terminal and it worked:

        echo -e "Subject: Test Mail\r\n\r\nThis is a test mail" | msmtp --debug --from=default -t [email protected]

    But in PHP, after editing the php.ini file to have this:

        sendmail_path = '/usr/bin/msmtp -t'

    and using this piece of code:

        <?php
        if ( mail( '[email protected]', 'Test mail from localhost', 'Working Fine.' ) ) {
            echo 'Mail sent';
        } else {
            echo 'Error. Please check error log.';
        }
        ?>

    I get the "Mail sent" message but don't receive a message in my inbox, not even in the spam folder. Anything wrong I'm doing?
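
    mail() returning true only means PHP handed the message to the sendmail_path command successfully; it says nothing about delivery. One way to see what msmtp actually receives (a sketch; the wrapper path is hypothetical) is to point sendmail_path at a small logging wrapper:

        #!/bin/sh
        # /usr/local/bin/msmtp-debug: log each call and the message, then deliver
        echo "$(date): argv: $*" >> /tmp/msmtp-debug.log
        tee -a /tmp/msmtp-debug.log | /usr/bin/msmtp "$@"

    Make it executable (chmod +x) and set sendmail_path = '/usr/local/bin/msmtp-debug -t' in php.ini. Note that the terminal test used --from=default; if the PHP-generated message lacks a usable envelope sender, adding --from=default to sendmail_path may be the actual fix.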

  • Windows XP failing to set theme correctly on auto-login

    - by Alois Mahdal
    On several of our testing machines, when automatic login (as Administrator) is activated, Windows fails to set the theme (Display Properties - Themes) correctly. In particular, even if the theme is set to "Windows Classic", it's visually obvious that "Windows XP" has been applied (the one with blue title bars and red "X" buttons). I have only seen this happen when auto-login is set; we always use Administrator on XP. If I log out and back in manually, the theme is set correctly. Apart from logging out and in, it's also possible to reset the theme in Display Properties. It does not happen in 100% of cases, but it's well over 50%, definitely often enough to be annoying. I believe this is a bug in Windows XP; I have never encountered it on other Windows versions. Does anybody know how to avoid this issue once and for all? (Or can anybody provide an explanation, relevant links, etc.?)
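
    This looks like a race between the auto-logon and the Themes service coming up. One low-risk thing to verify (a sketch; no guarantee it addresses this particular race) is that the Themes service is running and set to start automatically:

        sc query Themes
        sc config Themes start= auto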

  • Chef cookbooks marked as executed

    - by Gonzalo Alvarez
    I have set up a Chef server on our network that I use to manage several nodes. These nodes have chef-client installed, executing as a daemon every X minutes. The problem is that every time the client runs, it executes the recipes for all the cookbooks, even those previously executed, so it consumes resources and sometimes even breaks things (with service restarts, for example). I know I can avoid executing a piece of code or a recipe I create, as detailed here: Prevent chef recipe from executing previously executed action? But would this mean I should modify every cookbook I download from the Opscode repository? In other words, is it possible to make the Chef server (or the clients) mark cookbooks as 'executed' as soon as they have been executed the first time?

  • Windows Server 2008 constantly spamming external IP's on outbound TCP port 445

    - by RSXAdmin
    Hi Server Fault, I have a Windows Server 2008 box running as a domain controller. I have noticed in my Cisco ASA firewall logs that this box is continuously sending out requests on TCP port 445 to external hosts (on the order of a thousand requests a second). I have made an effort to deny this outbound traffic from reaching the internet (using the ASA), but I would like these requests to stop occurring at all. I have tried disabling NetBIOS over TCP/IP. I have even turned on Windows Advanced Firewall on the box itself to block outbound 445, but the ASA still detects this traffic hitting it. I have other DCs and similar boxes which are not behaving this way. Is this normal? Is there a way to stop this spamming? Have I been infected? Thank you, universe.
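
    Finding which process owns those connections is the first step (a sketch; run from an elevated prompt on the DC, and substitute the PID that netstat reports for the hypothetical 1234):

        netstat -ano -p tcp | findstr ":445"
        tasklist /fi "PID eq 1234"

    A legitimate Windows service making the connections points to misconfiguration (DNS resolving internal names to external addresses, for example), while an unfamiliar executable points to malware.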

  • All terminal commands (like ls, cd, edit, open) are returning errors on my Mac

    - by park
    From what I can tell from reading other questions and answers, my .bash_profile file may be corrupt. If I type echo $PATH in Terminal, the result is: /usr/local/git/bin From what I've read, that's not what the result is supposed to be. But I also can't get any of the commands (like edit, or subl for Sublime Text 2) to open the .bash_profile file so I can edit it. I was able to open the file in TextEdit using cmd-shift-., and here's what's in the file:

        [[ -s "$HOME/.rvm/scripts/rvm" ]] && source "$HOME/.rvm/scripts/rvm"
        PATH=$PATH:~/bin
        export PATH
        export PATH=/usr/local/git/bin

    But the file is LOCKED, so I can't edit it there either. I'm very new to programming and in the middle of trying to install everything on my Mac to go through a Ruby on Rails tutorial. I can't even check my version of Ruby, since even ruby -v returns -bash: ruby: command not found. Any help would be greatly appreciated. Thanks.
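
    The last line shown replaces the entire PATH with just /usr/local/git/bin, which explains both symptoms. As a sketch of a recovery (assuming a stock OS X layout), you can restore a sane PATH for the current session with absolute paths, then clear the "Locked" flag that TextEdit is honoring:

        export PATH=/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin
        ls -lO ~/.bash_profile        # a 'uchg' flag here means the file is locked
        chflags nouchg ~/.bash_profile

    After that, editing the file so the git line reads export PATH=$PATH:/usr/local/git/bin (appending instead of replacing) should bring the commands back in new shells.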

  • ext4: error loading journal

    - by cloudyOutside
    I have an external hard drive with two partitions: a small FAT32 partition, which is mostly empty and works fine, and a large ext4 partition with tons of data, most of which isn't backed up. The ext4 partition is visible but can't be mounted; I get an "error loading journal" message. The drive is a Western Digital Caviar Blue 500GB; roughly 30GB of it is FAT32 and the rest is ext4. The light on the enclosure (which was made by Cavalry) turns red when reading from the bad partition. There wasn't any warning, but coincidentally I've been thinking lately that I should get two large-capacity drives for real backups. Is there anything that can be done? I'm not even sure I have enough storage to back everything up, even if it is recoverable.
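
    Given the unbacked-up data, imaging the partition before any repair attempt is the safe order of operations, and ext4 can often be mounted read-only while ignoring the damaged journal (a sketch; device and mount points are assumptions):

        sudo ddrescue /dev/sdb2 /mnt/backup/ext4.img /mnt/backup/ext4.map
        sudo mkdir -p /mnt/rescue
        sudo mount -o ro,noload /dev/sdb2 /mnt/rescue

    The noload option skips journal replay, which is frequently enough to copy files off; a red activity light suggests hardware trouble, so e2fsck is better run against the image than against the raw disk.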

  • No internet on some devices, still on WiFi

    - by Joost
    OK, here's the situation. I live with three friends, and we often sit in the living room with our laptops out. There's a Wi-Fi router in the kitchen, and we have quite a stable connection. The issue, however, is this: sometimes, all of a sudden, one of my friends loses his internet connection, and even disconnecting and reconnecting to the Wi-Fi network does not help. When we restart the router, though, it works like a charm again. The odd thing is that it happens only for this one Windows 7 laptop. We've switched routers in the meantime (for a different reason), but the issue remained, which makes me suspect the laptop even more. I realise I've given few details, but I don't really know what to try or do. Any suggestions as to what it might be?

  • Redundant Router and Load Balancing vs. DDoS attack

    - by colgatta
    With a small server farm at a hoster with great support and conditions, I worry about the increasing number of DDoS attacks against this hoster (not against my web project, but against other clients at the same location). I have booked a redundant router and load balancer as a managed service with this hoster to share the load among all the dedicated servers. However, I lost service again today because another customer's project was attacked with DDoS for hours :-( Each hour means hundreds of dollars lost whenever my ad server and tracking are unreachable; even timed-out advertising has to be paid for by me, but cannot be resold to my clients while the servers are unavailable. All the while, the servers, the load, and the traffic are OK and healthy, but there is no chance of keeping things stable and online if the hoster is vulnerable. Does anyone have ideas or suggestions on how to protect against this, even against DDoS?

  • SAN/NAS with high availability?

    - by netvope
    I have two servers that I plan to use for storage; each of them has a few SATA disks directly attached. I want the storage to be available even if one of the storage servers is down (preferably the clients wouldn't even notice the failover, although I'm not sure whether that is possible). The clients may access the storage via NFS and Samba, but this is not a must; I could use something else if needed. I found this guide, Installing and Configuring Openfiler with DRBD and Heartbeat, which apparently does what I want. It relies on three components, Openfiler, DRBD, and Heartbeat, and all three need to be configured separately. I'm wondering: are there simpler solutions? Is using DRBD + Heartbeat the best practice for a situation like mine? I'm also interested to know whether there are alternatives that don't depend on DRBD.
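
    For a sense of the moving parts, the DRBD half is a single resource definition shared by both machines (a sketch; hostnames, devices, and addresses are assumptions):

        resource r0 {
            protocol C;
            on store1 {
                device    /dev/drbd0;
                disk      /dev/sdb1;
                address   192.168.1.11:7788;
                meta-disk internal;
            }
            on store2 {
                device    /dev/drbd0;
                disk      /dev/sdb1;
                address   192.168.1.12:7788;
                meta-disk internal;
            }
        }

    Protocol C makes writes synchronous across both nodes, which is what lets the surviving node take over with current data; Heartbeat's job is then only to decide which node mounts /dev/drbd0 and owns the service IP.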

  • Windows 8 to 8.1 Pro Upgrade SecureBoot Error

    - by Alexandru
    I upgraded from Windows 8 to Windows 8.1 on an Alienware Aurora R4 with the latest BIOS firmware version, A09. Ever since the upgrade, I get a watermark on my desktop saying "SecureBoot isn't configured correctly". I would like to get rid of this watermark the correct way (not by hacking system DLLs). My BIOS shows me booting in UEFI mode, and I see that Secure Boot is actually disabled there. I cannot enable Secure Boot in either UEFI mode or Legacy Boot mode; note that I can't even get Legacy Boot mode working without reformatting my system, which I really don't plan on doing. So my question is this: what has changed in the way Windows handles Secure Boot? As far as I can tell, I do not have Secure Boot enabled, and Windows is telling me it isn't configured correctly. Why does it even care to check if my BIOS doesn't have it on anyway?! It's so frustrating!

  • Ubuntu Karmic 9.10 live image on USB not working

    - by Vivek Sharma
    This is my configuration: a 4GB HP pendrive, the ubuntu-9.10-desktop-i386 image file for the live USB install, the pendrivelinux utility (u910p) and UNetbootin (unetbootin.sourceforge.net), and a T61 machine. I have installed Ubuntu live images with the two utilities mentioned above numerous times before, but on a 2GB Kingston flash drive. Today I am trying to install the live image on the 4GB HP flash drive. Both utilities report success: I can see the files on the drive, and even the wubi installer works; it says press "Reboot" to boot into live Ubuntu. But when I press "Reboot", it does not reboot my Win7 machine. Now, when I reboot and select USB boot in the BIOS, it says "no boot record". I am making my USB bootable using the utilities, and even then nothing works; I have done this a few times. Is the 4GB USB stick the problem? Does anyone know how to partition the stick into two 2GB partitions, install the live image onto one of them, and then use it? Is that possible?
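
    Splitting the stick into two partitions is straightforward from a Linux machine (a sketch; /dev/sdX is a placeholder for the pendrive, and these commands erase everything on it):

        sudo parted /dev/sdX mklabel msdos
        sudo parted /dev/sdX mkpart primary fat32 1MiB 2GiB
        sudo parted /dev/sdX mkpart primary fat32 2GiB 100%
        sudo mkfs.vfat -F 32 /dev/sdX1
        sudo mkfs.vfat -F 32 /dev/sdX2

    UNetbootin can then be pointed at the first partition. Many BIOSes only boot from the first partition of a USB stick, which is why the live image goes on /dev/sdX1 here.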

  • Virtual Windows Servers and Pagefile location [closed]

    - by Luke Puplett
    Considering that Windows makes heavy use of the pagefile even with huge amounts of RAM available, is it not best to have this pagefile on the fastest disk possible, as close to the virtual systems as possible? I'm thinking of a RAM disk. Where I work, storage for VMs is out on a NAS/SAN, and I'm worried that so much memory access has to go across the network! As an aside, I think it's about time MS got rid of paging and told us to buy more DIMMs. UPDATE: So this question has been downvoted?! Accessing a local spindle is circa 40,000 times slower than a DIMM, so going over the network will be even slower for hard faults. I don't know why I got the downvote; I'm certain this is an issue unless there's some other mechanism in ESX/Hyper-V that manages it.

  • MongoDB REST interface not listening after update

    - by Ones and Zeroes
    I replaced the mongodb-10gen install with the Ubuntu packages (mongodb-server, mongodb-client, and dev):

        apt-get install mongodb

    Since then, I am unable to connect to the REST interface, which worked before. Doing a wget to http://127.0.0.1:27018, I receive the following response: Connecting to 127.0.0.1:27018... failed: Connection refused. My previous /etc/mongodb.conf file had the following in it:

        # enable REST
        rest = true

    Adding it to the packaged conf file does not resolve the issue, not even after restarting. I also tried changing the following, with no effect:

        # Disable the HTTP interface (Defaults to localhost:27018).
        # nohttpinterface = true

    to

        # Disable the HTTP interface (Defaults to localhost:27018).
        nohttpinterface = false

    I have searched for days, and there doesn't seem to be anything on the Mongo site about a similar anomaly. If you have encountered a similar issue on Ubuntu Oneiric, please add your comments, even if you haven't found a solution.
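
    Two things worth double-checking (hedged; based on stock mongod behavior rather than anything Oneiric-specific): the HTTP/REST interface listens on the server's main port plus 1000, so a mongod on the default 27017 answers on 28017; and the running daemon must actually be reading the conf file you edited. A quick probe (a sketch):

        grep -E 'port|rest|nohttpinterface' /etc/mongodb.conf
        sudo service mongodb restart
        wget -qO- http://127.0.0.1:28017/

    If the packaged mongod runs on a non-default port, the HTTP interface moves with it (port + 1000), which would also explain a refused connection on a hard-coded 27018.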

  • ShoreTel 230 phone calls from website

    - by Michael Irey
    Our company uses these fancy ShoreTel 230 phones. We make many phone calls from our custom-built, web-based contact management system. It would be nice if our employees could click on a phone number on a webpage and have it automatically start dialing the number (similar to how the iPhone handles this). Has anyone ever deployed something like this? I would imagine it would require some kind of background ShoreTel process to accommodate it. 90% of our employees use PCs (Windows 7); 10% use OS X. Even a PC-only solution would be great. Is this even possible, and if so, where should one begin? Thanks!
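
    The browser side is the easy half; markup like the following (a sketch) makes numbers clickable, and the OS then hands the tel: URL to whatever application is registered for that scheme:

        <a href="tel:+14155551212">(415) 555-1212</a>

    The ShoreTel half is the real question: something on each desktop would have to be registered as the tel: handler so the click reaches the phone system (I believe ShoreTel's desktop Call Manager client offered TAPI integration that could be scripted against, but treat that as an assumption to verify).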
