Search Results

Search found 40294 results on 1612 pages for 'dot files'.

Page 486 of 1612

  • Using Find, Grep, Awk, or Sed To Rename Server After Cloning

    - by ServerChecker
    My client tells me they have cloned a VM in VMware of an Ubuntu Linux server. Now it's my job to go through all the files, find out what still has the old server name of "bishop", and change it to something else. The IP address has also changed, so I need to search for that too. How would you typically use find, grep, awk, or sed to find these files and then change them quickly? In the end, I want to make a Bash script. Of course, I'm not expecting you to tell me every file; I just want to know the technique for finding files containing "x" and then quickly replacing it with "y".
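
    A minimal sketch of the usual approach, assuming the old hostname is "bishop" and using a hypothetical new name and hypothetical IP addresses; restrict the search to a directory such as /etc rather than the whole filesystem:

      # list every file under /etc that still mentions the old hostname
      grep -rl 'bishop' /etc

      # replace the old name with the new one in those files (make a backup first)
      grep -rl 'bishop' /etc | xargs sed -i 's/bishop/newname/g'

      # same idea for the old IP address; escape the dots so they match literally
      grep -rl '192\.168\.0\.10' /etc | xargs sed -i 's/192\.168\.0\.10/192.168.0.20/g'

    Wrapping those lines in a script and repeating them for /var and any application directories gives the Bash script you describe.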

    Read the article

  • RAR Password recovery / hash extraction on OS X

    - by Josh K
    I'm running 10.6.2 and have been handed a couple of old files that need to be extracted. I believe they are old backups, finances, or bills. They are RAR files, and password protected. Is there a way to extract the hash from these files so I can feed it into John the Ripper or Cain and Abel? Edit: I have downloaded cRARk, but unfortunately nothing I have (SimplyRAR, RAR Expander, The Unarchiver) will extract them without a password. Can someone verify that I'm crazy and there is no password on the Mac version?
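
    If a jumbo build of John the Ripper is available (the core release does not include it), it ships a rar2john helper that extracts a crackable hash from the archive; a hedged sketch, with the archive name as a placeholder:

      # extract the password hash from the archive
      rar2john backup.rar > backup.hash

      # feed the hash to John the Ripper
      john backup.hash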

    Read the article

  • How can I make it difficult to install a new operating system on a certain computer?

    - by D W
    I want to host a website on a desktop computer running Ubuntu with a Windows virtual machine. I will give away the computer in exchange for a number of months of remote web hosting. I want to add some kind of lock (hardware or otherwise) so that the end users will have difficulty just reinstalling Windows and using the machine as they want, in contradiction to the contract. Ideally, I'd want the machine to die if reinstallation of the OS is attempted. It doesn't have to be completely insurmountable, but it has to be difficult enough to prevent casual reinstallation. Perhaps on boot-up the system could check whether certain files exist on the computer and refuse to boot if they do not. I don't know if this is possible, but perhaps the BIOS could be password protected and could search for the files before booting. The files it looks for could be date sensitive, i.e. require remote replacement on a schedule.

    Read the article

  • local cache for NAS or network folder

    - by HugoRune
    I am planning to build a network attached storage (NAS) server. Is there a way to cache frequently accessed files from the remote storage automatically on the local PC? (I am not looking for a way to sync whole folders like rsync, but rather something that automatically and transparently caches the last accessed 50 GB of files.) Ideally I am searching for something that caches writes as well as reads, since only one PC will be accessing the server (and one day of lost changes if the local cache is damaged would be acceptable). I looked into Windows Offline Files, but as far as I could tell this requires manual interaction to disconnect the server or go into offline mode in order to use the cache. The server will probably be running Linux or FreeNAS; the PC runs Windows XP, but could be upgraded to 7 if required.

    Read the article

  • Receiving "May not have permission to edit" warning, even though I have permissions

    - by Choy
    I'm using Panic Transmit as an FTP client connecting to an Ubuntu 12.x server. When I try to edit and upload a file using it, I receive a warning telling me to check my permissions, as I may not have permission to edit the file. I'm not setting the permissions on upload, and I do have permission to edit the files. After clearing the warning and checking the file on the server, my changes go through. The files I'm trying to edit are set to 775 and belong to the www-data group, which my user is also a member of. Any idea why I would be getting such a prompt? This only happens on some files, not all.
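
    A quick way to confirm what the server itself sees, assuming SSH access; the file path and username below are placeholders:

      # show the owner, group and mode of the file Transmit complains about
      ls -l /var/www/example/index.html

      # confirm which groups your login actually belongs to on the server
      id yourusername

    If www-data is missing from the id output, the group membership has not taken effect for your login session.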

    Read the article

  • Allow Incoming Responses in Apache on Ubuntu 11.10 - Curl

    - by Daniel Adarve
    I'm trying to get a curl response from an outside server, but I noticed I can neither ping the server in question nor connect to it. I tried disabling the iptables firewall, but I had no success. My server is running behind a Cisco Linksys WRTN310N router with the DD-WRT firmware installed, in which I have already disabled the firewall. Here are my network settings:

    ifconfig

      eth0      Link encap:Ethernet  HWaddr 00:26:b9:76:73:6b
                inet addr:192.168.1.120  Bcast:192.168.1.255  Mask:255.255.255.0
                inet6 addr: fe80::226:b9ff:fe76:736b/64 Scope:Link
                UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                RX packets:49713 errors:0 dropped:0 overruns:0 frame:0
                TX packets:30987 errors:0 dropped:0 overruns:0 carrier:0
                collisions:0 txqueuelen:1000
                RX bytes:52829022 (52.8 MB)  TX bytes:5438223 (5.4 MB)
                Interrupt:16

      lo        Link encap:Local Loopback
                inet addr:127.0.0.1  Mask:255.0.0.0
                inet6 addr: ::1/128 Scope:Host
                UP LOOPBACK RUNNING  MTU:16436  Metric:1
                RX packets:341 errors:0 dropped:0 overruns:0 frame:0
                TX packets:341 errors:0 dropped:0 overruns:0 carrier:0
                collisions:0 txqueuelen:0
                RX bytes:27604 (27.6 KB)  TX bytes:27604 (27.6 KB)

    /etc/resolv.conf

      nameserver 192.168.1.1

    /etc/nsswitch.conf

      passwd:    compat
      group:     compat
      shadow:    compat
      hosts:     files dns
      networks:  files
      protocols: db files
      services:  db files
      ethers:    db files
      rpc:       db files
      netgroup:  nis

    /etc/host.conf

      order hosts,bind
      multi on

    /etc/hosts

      127.0.0.1 localhost
      127.0.0.1 callcenter
      # The following lines are desirable for IPv6 capable hosts
      ::1     ip6-localhost ip6-loopback
      fe00::0 ip6-localnet
      ff00::0 ip6-mcastprefix
      ff02::1 ip6-allnodes
      ff02::2 ip6-allrouters

    /etc/network/interfaces

      # The loopback network interface
      auto lo
      iface lo inet loopback

      # The primary network interface
      auto eth0
      iface eth0 inet static
          address 192.168.1.120
          netmask 255.255.255.0
          network 192.168.1.1
          broadcast 192.168.1.255
          gateway 192.168.1.1

    The URL I'm trying to connect to is https://www.veripayment.com/integration/index.php. When I ping it in a terminal, here's what I get:

      daniel@callcenter:~$ ping https://www.veripayment.com/integration/index.php
      ping: unknown host https://www.veripayment.com/integration/index.php

      daniel@callcenter:~$ ping www.veripayment.com
      PING www.veripayment.com (69.172.200.5) 56(84) bytes of data.

      --- www.veripayment.com ping statistics ---
      2 packets transmitted, 0 received, 100% packet loss, time 1007ms

    PHP function in CodeIgniter:

      public function authorizePayment(){
          //---------------------------------------------------
          // Authorize a payment
          //---------------------------------------------------
          // Get variables from POST array
          $post_str = "action=payment&business=" .urlencode($this->input->post('business'))
              ."&vericode="                      .urlencode($this->input->post('vericode'))
              ."&item_name="                     .urlencode($this->input->post('item_name'))
              ."&item_code="                     .urlencode($this->input->post('item_code'))
              ."&quantity="                      .urlencode($this->input->post('quantity'))
              ."&amount="                        .urlencode($this->input->post('amount'))
              ."&cc_type="                       .urlencode($this->input->post('cc_type'))
              ."&cc_number="                     .urlencode($this->input->post('cc_number'))
              ."&cc_expdate="                    .urlencode($this->input->post('cc_expdate_year')).urlencode($this->input->post('cc_expdate_month'))
              ."&cc_security_code="              .urlencode($this->input->post('cc_security_code'))
              ."&shipment="                      .urlencode($this->input->post('shipment'))
              ."&first_name="                    .urlencode($this->input->post('first_name'))
              ."&last_name="                     .urlencode($this->input->post('last_name'))
              ."&address="                       .urlencode($this->input->post('address'))
              ."&city="                          .urlencode($this->input->post('city'))
              ."&state_or_province="             .urlencode($this->input->post('state_or_province'))
              ."&zip_or_postal_code="            .urlencode($this->input->post('zip_or_postal_code'))
              ."&country="                       .urlencode($this->input->post('country'))
              ."&shipping_address="              .urlencode($this->input->post('shipping_address'))
              ."&shipping_city="                 .urlencode($this->input->post('shipping_city'))
              ."&shipping_state_or_province="    .urlencode($this->input->post('shipping_state_or_province'))
              ."&shipping_zip_or_postal_code="   .urlencode($this->input->post('shipping_zip_or_postal_code'))
              ."&shipping_country="              .urlencode($this->input->post('shipping_country'))
              ."&phone="                         .urlencode($this->input->post('phone'))
              ."&email="                         .urlencode($this->input->post('email'))
              ."&ip_address="                    .urlencode($this->input->post('ip_address'))
              ."&website_unique_id="             .urlencode($this->input->post('website_unique_id'));

          // Send URL string via CURL
          $backendUrl = "https://www.veripayment.com/integration/index.php";
          $this->curl->create($backendUrl);
          $this->curl->post($post_str);
          $return = $this->curl->execute();

          $result = array();
          // Explode array where blanks are found
          $resparray = explode(' ', $return);
          if ($resparray) {
              // save results into an array
              foreach ($resparray as $resp) {
                  $keyvalue = explode('=', $resp);
                  if(isset($keyvalue[1])){
                      $result[$keyvalue[0]] = str_replace('"', '', $keyvalue[1]);
                  }
              }
          }
          return $result;
      }

    This gets an empty result array. This function, however, worked well on the previous server where the script was hosted; no modifications were made whatsoever. Thanks in advance
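
    A quick way to separate DNS problems from routing or firewall problems, assuming curl and the host utility are installed on the server:

      # does the name resolve through the configured nameserver (192.168.1.1)?
      host www.veripayment.com

      # try the HTTPS endpoint directly and show the full handshake and transfer
      curl -v https://www.veripayment.com/integration/index.php

      # note: many hosts drop ICMP, so a failed ping by itself does not prove
      # that outbound HTTPS is blocked; the curl -v output is more telling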

    Read the article

  • IIS 7.5 default permission - is restriction needed?

    - by Caroline Beltran
    I am using IIS 7.5 and I do not need to explicitly specify permissions for my ISAPI application to execute. Additionally, the application can create subdirectories and create and delete files without me specifying permissions. Since I am using the default permissions, I checked whether web.config was safe from prying eyes over the web, and it can't be read, which is good. My app also creates some .log and .ini files, which are also not viewable over the web. I did notice that .txt files are viewable. I really don't know how the default permissions allow my app to do so much. Is this safe, or do I need to lock things down? To be honest, I don't know which accounts to restrict. App details: my ISAPI extension has an 'allowed' entry in ISAPI and CGI Restrictions; the folder and subfolders containing my application have 'default' permissions set; the application pool is using 'classic' pipeline mode and no managed code; pass-through authentication is in use. Thank you for your time.

    Read the article

  • Local network cache of PHP and Apache2 on Win Server 2008 R2

    - by Ahmed Benlahsen
    Software configuration: I have a new server with Windows Server 2008 R2 installed via VMware. I have installed Apache 2.2, PHP 5.2 and MySQL 5.5 as separate packages. Issue: on the first installation of my application, everything works great. When I updated some JS and CSS files and accessed my application again from a PC on the local network, I got the old JS and CSS versions. When I access the same application on the local server, I get the latest versions of those files. The link to my application on the local server is http://localhost/BADIL; the link from the local network is http://LOCAL_SERVER_IP/BADIL. I think there must be some cache involved, but I don't know where; maybe in Windows Server 2008 R2 or in VMware? The question is: why does everything work fine when I access my application on the server, but when I access the same application from the local network I do not see the updated versions of the JS and CSS files?
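
    One way to see whether something between the browser and Apache is serving stale copies is to compare response headers from the server itself and from a PC on the network; a sketch assuming curl is available, with a hypothetical CSS path:

      # run on the server: headers as Apache serves them locally
      curl -I http://localhost/BADIL/css/style.css

      # run on a LAN PC: headers as they arrive across the network
      curl -I http://LOCAL_SERVER_IP/BADIL/css/style.css

      # compare Last-Modified, ETag and any Cache-Control/Expires values; if they
      # differ, a proxy or browser cache between the two is the likely culprit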

    Read the article

  • How can you make a Windows USB HDD Modify All for All Users

    - by David Allan Finch
    Hi, I use a USB HDD a lot between lots of different Windows boxes. What I find after a while is that there end up being lots of different permissions on the files, in some cases stopping me from looking at files or removing them. They want admin rights, or sometimes you even need to put the disk back into the original machine with the original user. This is a right pain. Is there a way of making the disk have Modify All for All Users and making this the default for all files on the disk? Thanks

    Read the article

  • OS X Lion: Emptying the trash takes "forever": is using rm -r safe?

    - by EOL
    Emptying my Trash in OS X Lion (non-securely) is taking about three hours (about 1.5 million files, from a Time Machine backup). I have had to stop the process a few times already, because I could not move my laptop with the external hard drive the files are on. This is also a problem because the Trash emptying restarts from the very beginning each time I empty the Trash again (i.e., files are not deleted when the Trash emptying is aborted). I read that it is faster to use rm -rf on ~/.Trash in this case. However, is this safe? (I am afraid that OS X Lion performs tasks behind the scenes, which would explain its slowness, that rm -r does not, which could lead to problems in the future.)
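
    One detail worth noting: for items trashed from an external drive, the files live in a hidden per-user .Trashes directory on that volume rather than in ~/.Trash. A hedged sketch (the volume name is a placeholder, 501 is the usual uid of the first user account, and rm -rf is irreversible, so double-check the path first):

      # Trash contents that live on the external volume
      sudo rm -rf "/Volumes/YourBackupDrive/.Trashes/501"

      # Trash contents that live on the boot volume
      rm -rf ~/.Trash/*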

    Read the article

  • In SSIS Convert European Currency Format to United States Currency Format

    - by Rob
    I have an interesting problem. I have an SSIS package that processes account data. We are now processing files from Europe. These files are in a CSV format using text qualifiers. An example of the problem: in the United States the currency format is 123456.99 (we purposely leave the thousands separator out). The files sent from Europe are coming in with two formats: one is 123456,99 and the other is 123.456,00. SSIS attempts to parse the text file and place the value into a NUMERIC(20,2) field. This causes a parsing error in SSIS even with the text qualifiers. If I change the field to CURRENCY, it throws a conversion error. I would like SSIS to deal with this directly, without requiring the data to be in the United States format. Has anyone had this problem? Any help will be greatly appreciated. Rob

    Read the article

  • Converting a bunch of MP3s to mono?

    - by Wil
    I have a bunch of stereo MP3s I'd like to convert to mono. What is the best way to do this? I would prefer something that would let me batch process them. I want to keep the quality as close to the original as possible. My files are also at different bitrates, so I don't want to make all files 320 kbps when some are only 128. Also, is there any quick way to see which files are stereo out of my entire library?
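
    A hedged sketch of one way to do the batch conversion with ffmpeg, assuming it is installed; each file keeps roughly its original bitrate, and the output directory name is a placeholder:

      mkdir -p mono
      for f in *.mp3; do
        # read the source bitrate so the mono copy is encoded at about the same rate
        br=$(ffprobe -v error -show_entries format=bit_rate -of default=noprint_wrappers=1:nokey=1 "$f")
        # -ac 1 downmixes the audio to a single channel
        ffmpeg -i "$f" -ac 1 -b:a "$br" "mono/$f"
      done

    The same ffprobe tool can answer the second question by listing which files are stereo:

      for f in *.mp3; do
        ch=$(ffprobe -v error -select_streams a:0 -show_entries stream=channels -of default=noprint_wrappers=1:nokey=1 "$f")
        [ "$ch" = "2" ] && echo "$f"
      done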

    Read the article

  • WYSIWYG editor for structured text (suitable for SVN versioning)

    - by chris_l
    I'm looking for an open-source, cross-platform WYSIWYG editor that I can use to write documentation. I'm not looking for a web based solution - i.e. it should work without a web server, and I want to save my files directly to disk. The result could be any structured format, like Wiki markup, reStructuredText, DocBook, or a small subset of HTML... But it's important that Subversion diff can be used to see differences between the versions easily (this wouldn't work with .odt or .rtf files, for example). I'm currently thinking about using OpenOffice and saving the files as HTML, but is there a better solution?

    Read the article

  • How to quickly empty a very full recycle bin?

    - by Pekka
    I have deleted about half a million files from a folder, and didn't think to press Shift in order to delete them completely straight away. Now they're clogging my recycle bin, and Windows claims it will take 4 hours to empty it - it claims to do about 68 files per second. Is there some magic or an alternative method that can speed this up? Bounty - I'm starting a bounty. The files are still in my bin, as there was no pressing need to get rid of them and this way, I can try out the suggestions presented. I am, however, looking for a way that does not include hard-deleting the contents of the RECYCLER folder - I'm sure that would work, but it feels a bit unclean to me.

    Read the article

  • AFP/SMB transfers cap at 2 megabytes/sec over wireless N

    - by CQM
    I wanted to transfer files between two Mac computers. The network is wireless N and both computers have wireless N modules in them. The problem is that when I transfer files between them via file sharing (AFP), the network speed caps at 2 megabytes/sec. Just downloading files from the internet I can get faster speeds, so this isn't a constriction of my Wi-Fi bandwidth; it appears to be a constriction of the protocol being used. My Wi-Fi N link is set to 130 Mbit/s, so I should see real-world transfer speeds of around 12-16 megabytes/sec. I ran this command on both computers: sudo sysctl -w net.inet.tcp.delayed_ack=0 which is supposed to lower TCP overhead, but this did not affect it. How can I get the speed I am expecting?
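
    One way to tell whether the limit is the wireless link or the AFP/SMB layer is to measure raw TCP throughput between the two machines; a sketch assuming iperf is installed on both (for example via MacPorts or Homebrew), with a placeholder IP address:

      # on the first Mac: start an iperf server
      iperf -s

      # on the second Mac: run a 30-second test against it
      iperf -c 192.168.1.10 -t 30

      # if iperf reports something close to the expected ~100 Mbit/s while AFP
      # copies stay near 16 Mbit/s (2 MB/s), the bottleneck is the file-sharing
      # protocol rather than the radio link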

    Read the article

  • HP Recovery Manager failed creating a backup disc

    - by Baehr
    I had to restore my computer (running Windows 7) back to its factory default, so I ran the HP Recovery Manager disc and chose to reformat my computer. Before doing this I was given an option to back up my current files to a DVD, and I did so, getting confirmation of a successful DVD burn. Booting up my computer, everything looks like it did when I first purchased it (as expected). But when I pop in the backup DVD and run the recovery EXE inside it, the application creates a directory inside C:\System Recovery Files which contains only the empty directories [(^_^)] and Program Data. So essentially what I'm getting out of this is a failure in creating a backup, which is not only confusing but incredibly frustrating. Is there any way that I can recover the files lost in the

    Read the article

  • File association for editing on a mac

    - by Agos
    I'm quite experienced with how file association works for opening files on Mac OS X. I recall reading somewhere that OS X keeps not only the information about which apps can open a file, but also which apps can edit a specific file type. I'm having problems with those applications (Coda, Espresso, Forklift, Flow) that have an “edit with external editor” feature, since issuing this command on HTML files opens them with Dashcode. Dashcode of course is not the current association for opening these files (Safari is), so it's clearly looking for apps that can edit HTML. Since I'd like to use TextMate as my editor in these cases, how can I set this preference?

    Read the article

  • tar incremental backup is backing everything up, every time when used on the Dropbox directory

    - by Cyclic
    I made an incremental backup about 10 months ago (on Jan 27, 2013), creating a .snar metadata file. Now, when I try to make an incremental backup using

      tar --create --file=dropbox_incremental_1.tar --listed-incremental=dropbox_0.snar Dropbox

    the command just re-backs up everything. I'm not an expert on Unix timestamps, but I noticed that virtually all of my directory timestamps are far more recent than the last time they changed. For my actual files, they look like this:

      Access: 2013-03-12 19:04:51.000000000 -0500
      Modify: 2012-09-30 15:10:47.000000000 -0500
      Change: 2013-03-12 19:04:51.306209672 -0500

    The 'Modify' timestamp seems correct, but the files were definitely not changed (at least not by anything that I know of) at the time they say they were. These files still seem to go into the incremental archive. What's happening here? Is there a way to tell tar to look at the 'Modify' timestamp? Isn't that what it's supposed to be doing?
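
    One possible explanation, offered as an assumption: GNU tar's listed-incremental mode also records device and inode numbers in the .snar file, so if the volume's device number changed between the two runs (common after re-mounts or OS upgrades), every file can look new even though its mtime is unchanged. Recent GNU tar versions have a flag to relax that check; a hedged sketch:

      # --no-check-device tells GNU tar to ignore device-number changes when
      # deciding whether a file has changed
      tar --create \
          --file=dropbox_incremental_1.tar \
          --listed-incremental=dropbox_0.snar \
          --no-check-device \
          Dropbox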

    Read the article

  • If I split my hard drive, will a system restore delete it?

    - by AoiHana
    I split my 1 TB hard drive off from the C: drive and called the new partition A:. I put some files there to back up. I have now come to a point where I need to run a system restore with the discs that came with my computer. If I do a system restore and wipe the C: drive clean, will I lose the partition that I split off, or will it remain intact with my files uncorrupted? This is a partition I created with Windows myself and dragged files over to. Thanks!

    Read the article

  • Awstats messaging non-existent user causing exim4 to go nuts

    - by Chris
    I've taken over managing a server set up by someone else who is now uncontactable. While I've managed to work out most of the faults and changes needed, this one is stumping me. Awstats is running on the machine and sending messages via exim4 to a user every time it runs an update. The user account has been deleted, so the exim4 main log files are filling up with message delivery errors, which firstly hinders meaningful log analysis for anything else and secondly uses up quite a lot of space (it grew to 22 GB unattended, panic!). I've been through all the conf files in /etc/awstats and can't seem to find any mention of this user account. Google just turns up results about how to use Awstats to parse exim4 log files. So the question is: where is this setting (on Debian) likely to be? Cheers in advance
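
    On Debian the mail usually comes from the cron job that runs the Awstats update rather than from Awstats' own configuration, so the deleted account may be named in cron or in a mail alias; a few hedged places to look, with the username as a placeholder:

      # search cron jobs and mail aliases for the deleted account
      grep -ri 'olduser' /etc/cron.d /etc/cron.daily /etc/crontab /etc/aliases 2>/dev/null

      # cron mails each job's output to MAILTO (or to the job's owner) on every run
      grep -r 'MAILTO' /etc/cron.d /etc/crontab 2>/dev/null

      # see the undeliverable messages piling up in exim's queue
      mailq | head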

    Read the article

  • How to restore missing space in NTFS file systems

    - by jacobsee
    I have a 40 GB USB hard drive formatted with NTFS on a PC running Windows XP Pro, SP3. I am trying to free as much space as possible. Windows Explorer tells me that I have about 200 MB of files on the drive (showing hidden and system files). When I show the drive properties, however, it shows 73% free, with around 10 GB used. I ran CHKDSK and it found all kinds of problems. Now I'm running defrag and it is behaving as if there were 10 GB of files, but I can't access them anywhere. How do I find and remove this extra 10 GB?

    Read the article

  • Difficulty in running Tomcat v7.0 with Eclipse Juno

    - by user1673718
    I get the following error when I run my JSP file in Eclipse Juno with Tomcat v7: 'Starting Tomcat v7.0 Server at localhost' has encountered a problem. Port 8080 required by Tomcat v7.0 Server at localhost is already in use. The server may already be running in another process, or a system process may be using the port. To start this server you will need to stop the other process or change the port number(s). I have Oracle 10g installed on my system. When I type http://localhost:8080 it opens the Oracle 10g license agreement, so I think Oracle 10g is already running on that port. To change Tomcat's port I searched Google, which said to change the port in the "C:\Program Files\Apache Software Foundation\Apache Tomcat 7.0.14\conf\httpd.conf" file, but at "C:\Program Files\Apache Software Foundation\Apache Tomcat 7.0.14\conf" there was no httpd.conf file. I only have catalina.policy, catalina.properties, context, logging.properties, server, tomcat-users and web files in that conf folder. I use Windows XP.

    Read the article

  • DVD burning software for Ubuntu

    - by user23950
    I've tried Brasero, which is the default software for burning data in Ubuntu, but it seems to be malfunctioning. As far as I can tell it can only burn image files, and it cannot distinguish what kind of file you are burning. I'm trying to burn .avi files as data, but I can only see the option to burn as an image file. Do you know of any other software for Ubuntu that can burn files as data?
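
    If a command-line route is acceptable, the dvd+rw-tools package can write an ordinary data DVD straight from a folder; a hedged sketch, with the device path and folder as placeholders:

      # install the tool (Debian/Ubuntu package name)
      sudo apt-get install dvd+rw-tools

      # burn the contents of ~/videos as a data DVD with Rock Ridge and Joliet extensions
      growisofs -Z /dev/dvd -r -J ~/videos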

    Read the article

  • Spotlight Infinite Indexing issue (external data drive)

    - by Manca Weeks
    This is an external drive, formerly a boot drive, which is now in use only to access music files (Sibelius, audio, MIDI, Live, Logic, etc.) without transferring the data into a new boot system, partly because of the issue I am about to describe, but mostly because the majority of the data is mainly there for archival purposes. The user is a composer and prominent musician and needs to be able to rehash the data at will.

    I have tried several things - here is a list:

    - make a complete filesystem clone with Antonio Diaz's ddrescue
    - run DiskWarrior on the copy, repair whatever errors occurred
    - wipe out all ACLs on the entire drive
    - set all permissions to the same value - wide open 777
    - remove any system data (applications, system files, including hidden files to the best of my knowledge) by selecting only non-system/app data and using Carbon Copy Cloner to put only the data of interest onto a newly formatted drive
    - transfer data to a newly formatted drive folder by folder, resetting the Spotlight index in between adding each one to observe for issues (interesting here is that no issues occurred except for the Documents folder - when I transferred only the Documents folder to a newly formatted drive on its own there was no trouble; it appears almost as though it may not be the content but the quantity or specific combination of data that results in problems)
    - use Data Rescue to transfer the data to yet another newly formatted drive to expose any missed hidden files

    Between each of the above steps I stopped Spotlight (search for anything beginning with md in Activity Monitor - All Processes - and quit it) and deleted the .Spotlight-V100 directory from the affected drive, then restarted Spotlight indexing by adding the drive to the Spotlight privacy list and removing it again.

    In each case the same issue occurs: Spotlight begins indexing normally (or so it seems), then the estimated time increases, usually to 4 hours remaining. This is where it gets stuck; it continues to predict 4 hours remaining but never finishes. Sometimes I can't eject the drive and have to quit the md* processes from Activity Monitor to be able to eject the drive without Force Eject. Once I disconnect the drive after the "4 hours remaining" situation, if I reattach it, Spotlight forever estimates the remaining time and never gets going again.

    So there it is. It is apparently not a filesystem issue, not a permissions issue, and not tied to any particular piece of hardware or protocol (I used USB and FW drives). I have tried this on several machines (3 to be precise) and in 10.5.8 and 10.6.5. Simply disabling Spotlight on this volume is not an option because the owner has no clue where things are, as the data on the volume dates back to music projects and compositions from 2003 and before. He needs to be able to query for results. Anyone got any ideas?

    ---update 2-6-11

    Since I have not received any responses except the one below, which appears to misunderstand my point, I am updating this post hoping to get more responses. I have used the terminal command sudo opensnoop -p PID, where PID is the mdworker process ID, to try and determine what Spotlight is doing and hopefully find the files it's having trouble with. Here's what happens: after indexing for a few hours, mdworker is gone. It no longer shows up in Activity Monitor under "All Processes" and the Terminal window with the opensnoop output stops moving.

    I then proceeded to execute the same command on mds to see what it was doing, and here's what I get, repeatedly:

      501  57  mds  21  /
      501  57  mds  21  /Volumes/Sno Leppard
      501  57  mds  21  /Volumes/Tiger
      501  57  mds  21  /Volumes/Leppard
      501  57  mds  21  /Volumes/Disk Warrior
      501  57  mds  21  /Volumes/ONM Data

    These represent all the volumes currently mounted in the system. All except ONM Data, which is the one I am trying to index, are excluded from Spotlight indexing at the moment. The sequence above repeats over and over, with slight variation, sometimes skipping one of the volumes.

    Questions: what happened to mdworker? What is mds doing? I will let this run until tomorrow morning and throughout the day and monitor for any changes. Any input would be very much appreciated. Even if you're not sure what the ultimate answer is, please alert me to anything you think I may be missing. Hopefully at some point we will figure this out... Thanks, M

    __final edit__

    I finally resolved the issue and here is how I did it. I used the terminal command "sudo opensnoop -p PID", where PID is the process ID of the processes I was monitoring. I was looking at all instances of mds and mdworker running in the system. After the third time through indexing the same data set (see info above), I contacted Apple and got to their highest level of support - they were flabbergasted as well. They advised me to install yet another default 10.6.6 system and try again. The same pattern repeated - mds and mdworker(s) would start indexing, and eventually the Spotlight icon would say 6 hours remaining, all mdworkers were gone, and mds sat at 90% or so of CPU. But I did finally figure out that the first time mdworker stopped like that, the last file it touched was always in the same folder. I excluded that folder from Spotlight search and the rest of the data set indexed within about 2 hours with no strange behavior or failures. I copied that folder to another machine and Spotlight barfed immediately. Exclude that folder and all is well again.

    I still have no clue what is causing this behavior, but I did find a functional solution to the problem. Anyone with a similar problem: run opensnoop on all instances of mds and mdworker and wait patiently for mdworker to exit. Look at the last file it touched and exclude the enclosing folder from being indexed. I was able to repeat the issue and solution on 2 different installs and 2 different copies of the data set. Hope this helps. If we find an actual cause of the folder being such a problem (it is called MICHAEL BRECKER RECORD SOLOS and contains almost 1 GB of audio-related files - Performer, Live, SD2, things like that), I will edit again to let you all know. Thanks for any attempts to help, M
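
    If the index itself is suspect, the built-in mdutil tool can report indexing status and force a rebuild on just that volume; a hedged sketch using the volume name from the post:

      # check Spotlight indexing status for the volume
      mdutil -s "/Volumes/ONM Data"

      # erase the existing index and let Spotlight rebuild it from scratch
      sudo mdutil -E "/Volumes/ONM Data"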

    Read the article

  • Plugin 'InnoDB' registration as a STORAGE ENGINE failed. On win 7

    - by NimChimpsky
    I have had to reinstall MySQL; however, the service is failing to start with the above cause listed in Event Viewer. One solution is apparently to delete a couple of files prefixed with 'ib_logfile', which represent any old databases. However, I do not have these files, and my service is still failing to start...? When I say I don't have these files: I did a search using Windows search with zero results, and they are definitely not present in my MySQL install directory. And I don't have the "Documents and Settings/Application Data" folder referenced in the link. In fact I have only one MySQL install directory, and I know where that is - what do I need to delete or change? The instance is configured OK; I ran that as administrator and it is listed in Services, but the service itself fails to start. Any tips, other than going over to PostgreSQL?

    Read the article
