Search Results

Search found 31670 results on 1267 pages for 'php fpm'.


  • Upload a file to a server from the iPhone using ASIHTTPRequest

    - by Nick
    I've been trying to upload a file (login.zip) using the ASIHTTPRequest libraries from the iPhone onto the built-in Apache server in Mac OS X Snow Leopard. My code is:

        NSString *urlAddress = [[[NSString alloc] initWithString:self.uploadField.text] autorelease];
        NSURL *url = [NSURL URLWithString:urlAddress];
        ASIFormDataRequest *request;
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *dataPath = [documentsDirectory stringByAppendingPathComponent:@"login.zip"];
        NSData *data = [[[NSData alloc] initWithContentsOfFile:dataPath] autorelease];
        request = [[[ASIFormDataRequest alloc] initWithURL:url] autorelease];
        [request setPostValue:@"login.zip" forKey:@"file"];
        [request setData:data forKey:@"file"];
        [request setUploadProgressDelegate:uploadProgress];
        [request setShowAccurateProgress:YES];
        [request setDelegate:self];
        [request startAsynchronous];

    The PHP code is:

        <?php
        $target = "upload/";
        $target = $target . basename($_FILES['uploaded']['name']);
        $ok = 1;
        if (move_uploaded_file($_FILES['uploaded']['tmp_name'], $target)) {
            echo "The file " . basename($_FILES['uploadedfile']['name']) . " has been uploaded";
        } else {
            echo "Sorry, there was a problem uploading your file.";
        }
        ?>

    I don't quite understand why the file is not uploading. I'd appreciate any help; I've been stuck on this for 5 days straight. Thanks in advance, Nik
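
    A minimal receiving-side sketch follows, assuming the multipart field name is the key used in setData:forKey: above ('file'); whatever key the client sends is the one PHP must read out of $_FILES:

        <?php
        // Upload handler sketch; assumes the client posts the file under the key 'file'.
        $field = 'file';
        $error = isset($_FILES[$field]['error']) ? $_FILES[$field]['error'] : 'no file received';
        if ($error !== UPLOAD_ERR_OK) {
            echo "Upload failed: " . $error;
            exit;
        }
        $target = 'upload/' . basename($_FILES[$field]['name']);
        if (move_uploaded_file($_FILES[$field]['tmp_name'], $target)) {
            echo "The file " . basename($_FILES[$field]['name']) . " has been uploaded";
        } else {
            echo "Sorry, there was a problem uploading your file (is upload/ writable?).";
        }
        ?>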

    Read the article

  • mysql_affected_rows() returns 0 for UPDATE statement even when an update actually happens

    - by Alex Moore
    I am trying to get the number of rows affected by a simple MySQL update query. However, when I run the code below, PHP's mysql_affected_rows() always equals 0. This happens no matter whether foo=1 already (in which case the function should correctly return 0, since no rows were changed) or foo currently equals some other integer (in which case the function should return 1).

        $updateQuery = "UPDATE myTable SET foo=1 WHERE bar=2";
        mysql_query($updateQuery);
        if (mysql_affected_rows() > 0) {
            echo "affected!";
        } else {
            echo "not affected"; // always prints not affected
        }

    The UPDATE statement itself works: the INT gets changed in my database. I have also double-checked that the database connection isn't being closed beforehand or anything funky. Keep in mind, mysql_affected_rows() doesn't necessarily require you to pass a connection link identifier, though I've tried that too. Details on the function: mysql_affected_rows. Any ideas?

    SOLUTION: The part I didn't mention turned out to be the cause of my woes here. This PHP file was being called ten times consecutively in an AJAX call, though I was only looking at the value returned on the last call, i.e. a big fat 0. My apologies!
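
    For reference, a sketch of the same check with mysqli, which exposes the affected-row count per connection object; the mysql_* functions used above are deprecated and removed in PHP 7, so this is illustrative only (connection details are placeholders):

        <?php
        $db = new mysqli('localhost', 'user', 'pass', 'mydb');
        $db->query("UPDATE myTable SET foo=1 WHERE bar=2");
        if ($db->affected_rows > 0) {
            echo "affected!";     // a row actually changed
        } else {
            echo "not affected";  // no row matched, or foo was already 1
        }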

    Read the article

  • Complex query on Cassandra

    - by Sadiqur Rahman
    I heard about the Cassandra database engine a few days ago and have been searching for good documentation on it. After studying Cassandra, I gather that it is more scalable than other data engines. I also read about Amazon SimpleDB, but since SimpleDB has a 10GB/table limitation and Google Datastore is slower than Amazon SimpleDB, I prefer not to use them (Google Datastore, Amazon SimpleDB). So, to make our site scale, especially for high write rates with massive data, I would like to use Cassandra as our data engine. But before starting to use Cassandra I am confused about how to handle complex data with it. I am giving you the MySQL database structure below; please read it and give me a good suggestion.

        Users Table
            hasColum ID Primary
            hasColum email Unique
            hasColum FirstName
            hasColum LastName
        Category Table
            hasColum ID Primary
            hasColum Parent
            hasColum Category
        Posts Table
            hasColum ID Primary
            hasColum UID Index foreign key linked to users-ID
            hasColum CID Index foreign key linked to Category-ID
            hasColum Title
            hasColum Post Index
            hasColum PunDate
        Comments
            hasColum ID primary
            hasColum UID Index foreign key linked to users-ID
            hasColum PID Index foreign key linked to Posts-ID
            hasColum Comment
        User Group
            hasColum ID primary
            hasColum Name
        UserToGroup Table (for many to many relation only)
            hasColum UID foreign key linked to Users-ID
            hasColum GID foreign key linked to Group-ID

    Finally, for your information, I would like to use the SimpleCassie PHP class (http://code.google.com/p/simpletools-php/). So it will be very helpful if you can give me an example using SimpleCassie.

    Read the article

  • Toolbox/framework to construct lightweight public-facing web site

    - by aSteve
    I am aware of full-blown content management systems (CMS) such as SugarCRM and TikiWiki... where content is typically stored in a database... and edited through the same interface as it is published. While I like many of the features, the product is clearly aimed at enterprise-wide use rather than to be public-facing. What I'd like to establish are potential alternatives that fill the space between full-blown CMS and hand-coded bespoke site. I like the way that I can add modules to my CMS... allowing me to quickly introduce new functionality, and I'd like an analogous feature in a system for public web-content. Modules I know I'd like include moderated comments; web-form-to-email gateway; menus/tabs... in future, perhaps mapping or diaries or RSS integration - etc. Where my requirements differ from a CMS, I don't need (or want) most content to be editable through the main site... and, somehow, I do want to be able to preview how updates will be presented to the public rather than to make live changes. For these purposes, in contrast to those where a typical CMS would be ideal, presentation is of paramount importance - and trumps any desire to immediately disseminate information. I realise that this is a very high-level question... (suggestions of additional tags welcome) - I mentioned PHP only as - ideally - I'm looking for an open source solution and a PHP deployment is an easy option. What are my options?

    Read the article

  • Android: Having trouble getting html from webpage

    - by Kyle
    Hi, I'm writing an Android application that is supposed to get the HTML from a PHP page and use the parsed data from the page. I've searched for this issue on here, and ended up using some code from an example another poster put up. Here is my code so far:

        HttpClient client = new DefaultHttpClient();
        HttpGet request = new HttpGet(url);
        try {
            Log.d("first", "first");
            HttpResponse response = client.execute(request);
            String html = "";
            Log.d("second", "second");
            InputStream in = response.getEntity().getContent();
            Log.d("third", "third");
            BufferedReader reader = new BufferedReader(new InputStreamReader(in));
            Log.d("fourth", "fourth");
            StringBuilder str = new StringBuilder();
            String line = null;
            Log.d("fifth", "fifth");
            while ((line = reader.readLine()) != null) {
                Log.d("request line", line);
            }
            in.close();
        } catch (ClientProtocolException e) {
        } catch (IOException e) {
            // TODO Auto-generated catch block
            Log.d("error", "error");
        }
        Log.d("end", "end");
        }

    Like I said before, the URL is a PHP page. Whenever I run this code, it prints out the "first" message, but then prints the "error" message and finally the "end" message. I've tried modifying the headers, but I've had no luck with it. Any help would be greatly appreciated as I don't know what I'm doing wrong. Thanks!

    Read the article

  • optimistic and pessimistic locks

    - by billmce
    Working on my first PHP/CodeIgniter project and I've scoured the 'net for information on locking access to editing data and haven't found very much. I expect it to be a fairly regular occurrence for 2 users to attempt to edit the same form simultaneously. My experience (in the stateful world of BBx, filePro, and other RAD apps) is that the data being edited is locked using a pessimistic lock: one user has access to the edit form at a time, and the second user basically has to wait for the first to finish. I understand this can be done using Ajax sending XMLHttpRequests to maintain a 'lock' database. The PHP world, lacking state, seems to prefer optimistic locking. If I understand it correctly it works like this: both users get to access the data and they each record a 'before changes' version of the data. Before saving their changes, the data is once again retrieved and compared to the 'before changes' version. If the two versions are identical then the user's changes are written. If they are different, the user is shown what has changed since he/she started editing and some mechanism is added to resolve the differences, or the user is shown a 'Sorry, try again' message. I'm interested in any experience people here have had with implementing both pessimistic and optimistic locking. If there are any libraries, tools, or 'how-to's available I'd appreciate a link. Thanks
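
    A common optimistic-locking sketch uses a version column rather than comparing every field: the edit form carries the version it loaded, and the update only succeeds if that version is still current. Table and column names here are made up, and $pdo stands for an existing PDO connection:

        <?php
        // 1. When building the edit form, select the row's version along with its
        //    data and put it in a hidden field, e.g.:
        //    SELECT id, title, body, version FROM articles WHERE id = 42
        //
        // 2. On save, update only if nobody bumped the version in the meantime.
        $sql = "UPDATE articles
                SET title = ?, body = ?, version = version + 1
                WHERE id = ? AND version = ?";
        $stmt = $pdo->prepare($sql);
        $stmt->execute(array($title, $body, $id, $versionFromForm));

        if ($stmt->rowCount() === 0) {
            // No row matched the old version: someone else saved first. Reload the
            // row and show the differences (or a 'Sorry, try again' message)
            // instead of silently overwriting.
        }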

    Read the article

  • Process data BEFORE a 301 Redirect?

    - by Jesse
    So, I've been working on a PHP link shortener (I know, just what the world needs). Basically when the page loads, PHP determines where it needs to go and sends a 301 header to redirect the browser, like so:

        header("HTTP/1.1 301 Moved Permanently");
        header("Location: http://newsite.com");

    Now, I'm trying to add some tracking to my redirects and insert some custom analytics data into a MySQL table before the redirect happens. It works perfectly if I don't specify a redirect type and just use:

        header("Location: http://newsite.com");

    But, of course, as soon as you add in the 301 header, nothing else gets processed. Actually, on the first request it sends the data to MySQL, but on any subsequent requests there's no communication with the database. I assume it's a browser caching issue: once it's seen the 301 it decides there's no reason to parse anything on future requests. But does anyone know if there's any way to get around this? I'd really like to keep it as a 301 for SEO purposes (I believe if you don't specify, it sends a 404 by default?). I thought about using .htaccess to prepend a file to the page that will do the MySQL work, but with the 301, wouldn't that just get ignored as well? Anyway, I'm not sure if there's any solution other than using a different type of redirect, but I'm not ready to give up just yet. So, any suggestions would be much appreciated. Thanks!
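
    A sketch of logging before the redirect is below; note that browsers cache a 301 aggressively, so repeat visitors may never hit the script again, which is why a 302 is the usual choice when every hit must be recorded. The clicks table and the $pdo connection are placeholders:

        <?php
        // Record the hit first, then redirect.
        $stmt = $pdo->prepare(
            "INSERT INTO clicks (short_code, referer, user_agent, clicked_at)
             VALUES (?, ?, ?, NOW())"
        );
        $stmt->execute(array(
            $shortCode,
            isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '',
            isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '',
        ));

        // Only after the insert succeeds, send the redirect and stop.
        header("HTTP/1.1 301 Moved Permanently");   // or 302 if every hit must be logged
        header("Location: http://newsite.com");
        exit;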

    Read the article

  • How to get the contents of a site using HTTPS

    - by cashmoney
    Example of a site using SSL (HTTPS): https://www.eb2a.com

    1 - I tried to get its content using file_get_contents, but it doesn't work and gives an error:

        <?php
        $contents = file_get_contents("https://www.eb2a.com/");
        echo $contents;
        ?>

    2 - I tried to use fopen, but it doesn't work and gives an error:

        <?php
        $url = 'https://www.eb2a.com/';
        $contents = fopen($url, 'r');
        echo "$contents";
        ?>

    3 - I tried to use cURL, but it doesn't work and gives a BLANK PAGE:

        function cURL($url, $ref, $header, $cookie, $p){
            $ch = curl_init();
            curl_setopt($ch, CURLOPT_URL, $url);
            curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
            curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
            curl_setopt($ch, CURLOPT_REFERER, $ref);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
            curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
            if ($p) {
                curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "POST");
                curl_setopt($ch, CURLOPT_POST, 1);
                curl_setopt($ch, CURLOPT_POSTFIELDS, $p);
            }
            $result = curl_exec($ch);
            curl_close($ch);
            if ($result){
                return $result;
            }else{
                return '';
            }
        }
        $file = cURL('https://www.eb2a.com/','https://www.eb2a.com/',0,0,null);
        echo $file

    Does anyone have any idea?
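
    file_get_contents over https needs the openssl extension to be enabled; a sketch of that plus a cURL fallback that surfaces the actual error instead of a blank page is below. Keeping peer verification on and pointing CURLOPT_CAINFO at a CA bundle is an assumption about the environment, not something the original post confirms:

        <?php
        $url = 'https://www.eb2a.com/';

        // Works only if the openssl extension is loaded and https:// wrappers are allowed.
        $html = @file_get_contents($url);

        if ($html === false) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
            curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
            curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
            // curl_setopt($ch, CURLOPT_CAINFO, '/path/to/cacert.pem'); // if no default CA bundle
            $html = curl_exec($ch);
            if ($html === false) {
                echo 'cURL error: ' . curl_error($ch);  // shows why the page came back blank
            }
            curl_close($ch);
        }
        echo $html;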

    Read the article

  • Problem with Free SMTP Server in php code

    - by Nuha
    I wrote PHP code for a contact-us form, but I cannot find a free SMTP server to use with it. I tried to use the SMTP server for Gmail, but I got this error:

        Warning: mail() [function.mail]: "sendmail_from" not set in php.ini or custom "From:" header missing in C:\www\htdocs\contactUs.php on line 25

    Line 25 is:

        mail($to, $subject, $body, $headers);

    The statement that indicates using the Gmail SMTP server is:

        ini_set("SMTP", "smtp.gmail.com");

    So, can you help me? :(
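
    The warning itself is about the missing sender address: on Windows, mail() wants sendmail_from set (or a From: header passed in the call). Separately, PHP's built-in mail() speaks plain, unauthenticated SMTP, so Gmail's server will normally refuse it; a library such as PHPMailer or SwiftMailer is the usual route for authenticated SMTP. A sketch of the header fix, with placeholder addresses and an SMTP relay assumed to accept unauthenticated mail:

        <?php
        ini_set('SMTP', 'smtp.example.com');        // a relay that accepts unauthenticated mail
        ini_set('smtp_port', 25);
        ini_set('sendmail_from', 'me@example.com'); // silences the "sendmail_from" warning

        $to      = 'someone@example.com';
        $subject = 'Contact form';
        $body    = 'Message body';
        $headers = "From: me@example.com\r\n" .
                   "Reply-To: visitor@example.com\r\n";

        if (mail($to, $subject, $body, $headers)) {
            echo 'Sent';
        } else {
            echo 'mail() failed';
        }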

    Read the article

  • Smart way to execute function in flash from an imported html (or XML) plain text

    - by DomingoSL
    OK, here is the thing. I have a very simple Flash program coded in AS3; inside this program there is a dynamic, HTML-capable textfield. I also have a database where I will put some information. Each element in the database can be linked to others. For example, this database will contain tourist spots, and they can be related to others in the same database:

        Ramdom place 1
        This interesting spot is near <link to="ramdo2">ramdom place2</link>. And so on...

    The information in the database is retrieved by the Flash application, which shows the text in the dynamic textfield. I need some way to tell my Flash application that there has to be a link to another part of the database, so that when the user clicks the link, a function in Flash fetches this new data. The database is a simple MySQL server, and the data is not in there yet; it is waiting for your suggestions on how to format it and develop the solution. I'm a PHP developer too, so I can make a gateway in PHP to read MySQL and then return some format to Flash, an XML for example.
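
    One possible PHP gateway sketch: turn the custom <link to="..."> markers into anchors whose href starts with "event:", which an AS3 TextField surfaces as a TextEvent.LINK instead of opening a browser window, so the Flash side can load the linked spot itself. Table and column names and the $pdo connection are made up for illustration:

        <?php
        header('Content-Type: text/xml; charset=utf-8');

        $id   = (int) $_GET['id'];
        $stmt = $pdo->prepare('SELECT title, body FROM spots WHERE id = ?');
        $stmt->execute(array($id));
        $spot = $stmt->fetch(PDO::FETCH_ASSOC);

        // <link to="ramdo2">ramdom place2</link>  ->  <a href="event:ramdo2">ramdom place2</a>
        $html = preg_replace(
            '/<link to="([^"]+)">(.*?)<\/link>/s',
            '<a href="event:$1">$2</a>',
            $spot['body']
        );

        echo '<spot><title>' . htmlspecialchars($spot['title']) . '</title>';
        echo '<body><![CDATA[' . $html . ']]></body></spot>';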

    Read the article

  • Browser timing out attempting to load images

    - by notJim
    I've got a page on a webapp that has about 13 images that are generated by my application, which is written in the Kohana PHP framework. The images are actually graphs. They are cached so they are only generated once, but the first time the user visits the page, and the images all have to be generated, about half of the images don't load in the browser. Once the page has been requested once and images are cached, they all load successfully. Doing some ad-hoc testing, if I load an individual image in the browser, it takes from 450-700 ms to load with an empty cache (I checked this using Google Chrome's resource tracking feature). For reference, it takes around 90-150 ms to load a cached image. Even if the image cache is empty, I have the data and some of the application's startup tasks cached, so that after the first request, none of that data needs to be fetched. My questions are: Why are the images failing to load? It seems like the browser just decides not to download the image after a certain point, rather than waiting for them all to finish loading. What can I do to get them to load the first time, with an empty cache? Obviously one option is to decrease the load times, and I could figure out how to do that by profiling the app, but are there other options? As I mentioned, the app is in the Kohana PHP framework, and it's running on Apache. As an aside, I've solved this problem for now by fetching the page as soon as the data is available (it comes from a batch process), so that the images are always cached by the time the user sees them. That feels like a kludgey solution to me, though, and I'm curious about what's actually going on.

    Read the article

  • Linux apache developing configuration

    - by Jeffrey Vandenborne
    I recently reinstalled my system and came to a point where I need Apache and PHP. I've been searching a long time, but I can't figure out how to configure Apache the best way for a developer computer. The plan is simple: I want to install Apache 2 + MySQL server so I can develop some PHP websites. I don't want to install LAMP though, just apache2, php5 and mysql. The problem that I've been looking for an answer to is the permissions on the /var/www/ folder. I've tried making it my folder using the chown command, followed by a chmod -R 755 /var/www. Most things work then, but fwrite for example won't work, because I would need to give write permissions to everyone, and unless I change my global umask to 000 I'm not sure what I can do. In short: I want to install apache2, php5, mysql-server without using LAMP, but configured in a way so I can open up NetBeans, start a project with root in /var/www/, and run every single function without permission faults. Does anyone have experiences or workarounds for this? Extra: OS: Ubuntu 10.04, ARCH: x86_64

    Read the article

  • $_SESSION v. $_COOKIE

    - by taeja87
    I learned about $_SESSION several weeks ago when creating a login page. I can successfully log in and use it with variables. Currently I am trying to understand $_SESSION versus $_COOKIE. Please correct me if I am wrong: I can use $_SESSION when logging in and moving around pages. With $_COOKIE, it is used to remember when I last visited and my preferences. Another thing involving cookies is that when websites use advertisements (for example Google AdSense), they use cookies to track when a visitor clicks on an advertisement, right? Can I use both ($_SESSION & $_COOKIE)? I read somewhere that you can store the session_id as the value of the cookie. Also, I read about security, which led to me finding this: What do I need to store in the php session when user logged in?. Is using session_regenerate_id good for when a user comes back to the site? And this: How to store a cookie with php involving uniqid. For those wanting to know about the login, I use email and password, so the user is able to change their username. I look forward to learning more about these two from anybody who would like to share their knowledge about it. If I asked too many questions, you can just answer the one that you have more experience with. If you need more information, just ask, since I might have forgotten to include something. Thank you. Found this: What risks should I be aware of before allowing advertisements being placed on my website?
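
    A minimal sketch of the two side by side: the session holds server-side state keyed by the browser's session cookie, while an explicit cookie persists a small client-side value between visits; regenerating the session id on login is the usual defence against session fixation. The names used are illustrative:

        <?php
        session_start();

        // Session: server-side login state (user lookup omitted).
        $_SESSION['user_id'] = $userId;   // set after the email/password check succeeds
        session_regenerate_id(true);      // fresh id on login, against session fixation

        // Cookie: client-side preference that survives closing the browser (1 year here).
        setcookie('last_visit', date('c'), time() + 60 * 60 * 24 * 365, '/');

        // On a later request:
        if (isset($_SESSION['user_id'])) {
            // the visitor is logged in for this session
        }
        if (isset($_COOKIE['last_visit'])) {
            echo 'Welcome back, last visit: ' . htmlspecialchars($_COOKIE['last_visit']);
        }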

    Read the article

  • Adding extra data to a Variable

    - by DogPooOnYourShoe
    Right now, my code plucks out only one value using MySQL. So I thought I might as well add each found result to a variable; however, I don't know how to do this. This must be a very basic question, but I can't find an answer for it.

        echo '<table border="1">';
        echo "<tr><td><b>Surname</b></td><td><b>Title/Name</b></td><td><b>Numbers</b></td><td><b>Telephone</b></td><td><b>Edit</b></td><td><b>Del</b></td></tr>\n";
        while ($row = mysql_fetch_array($result)) {
            $Surname = $row["Surname"];
            $Title = $row["TitleName"];
            $Email = $row["Email"];
            $Telephone = $row["Telephone"];
            $id = $row["id"];
            $MooringNumbers = $row['Number'];
            $Assignedto['AssignedTo'];
        }
        $MooringQuery = "select * FROM mooring WHERE AssignedTo='$id'";
        $MooringResult = mysql_query($MooringQuery) or die("Couldn't execute query");
        while ($row1 = mysql_fetch_array($MooringResult)) {
            $AssignedTo = $row1["AssignedTo"];
            $MooringNumbers = $row1["Number"];
            echo '<tr><td>' .$Surname.'</td><td>'.$Title.'</td><td>'.$MooringNumbers . '</td><td>'.$Telephone.'</td><td>' . '<a href="rlayCustomerUpdtForm.php?id='.$id.'">[EDIT]</a></td>'.'<td>'. '<a href="deleteCustomer.php?id='.$id.'">[x]</a></td>'. '</tr>';
        }
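
    A sketch of collecting every fetched row instead of overwriting the same variables on each pass: push one associative array per row onto a list, then loop over the list when printing. The mysql_* calls mirror the original code but are deprecated in later PHP versions:

        <?php
        $customers = array();
        while ($row = mysql_fetch_array($result)) {
            $customers[] = array(
                'id'        => $row['id'],
                'Surname'   => $row['Surname'],
                'TitleName' => $row['TitleName'],
                'Telephone' => $row['Telephone'],
            );
        }

        // Every collected row is now available when building the table.
        foreach ($customers as $customer) {
            echo '<tr><td>' . htmlspecialchars($customer['Surname']) . '</td><td>'
               . htmlspecialchars($customer['TitleName']) . '</td><td>'
               . htmlspecialchars($customer['Telephone']) . '</td></tr>';
        }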

    Read the article

  • General web service ideas

    - by user2014175
    I have a question regarding different types of web services. I'll preface this by saying that I have built a number of apps (for both iOS and Android) for personal use that interact with the web via PHP and SQL. I have taught myself these languages, and as such don't have the broader background knowledge that many of you do. My question is: in what other ways can you perform an interaction between a web service and a mobile device other than mobile - PHP - SQL - etc.? For example, if I built a very simple tracking app for my car, my current method would be to push GPS coordinates from my iPhone to my database at a set interval, then write a simple bit of JavaScript that pulled those coordinates out of the database and superimposed them on a Google map. Is there a different way to do this, such as the server acting as a live middle man that simply pushed the coordinates directly to a target browser, without the database in the middle? If so, are there advantages and disadvantages to these different methods for achieving different goals? I know it's a broad question, but I'm really intrigued and I'm finding it difficult to word a Google search for it. Any info / reading material suggestions would be excellent. Thanks

    Read the article

  • Best solution for __autoload

    - by tpk
    As our PHP5 OO application grew (in both size and traffic), we decided to revisit the __autoload() strategy. We always name the file by the class definition it contains, so class Customer would be contained within Customer.php. We used to list the directories in which a file can potentially exist, until the right .php file was found. This is quite inefficient, because you're potentially going through a number of directories which you don't need to, and doing so on every request (thus making loads of stat() calls). Solutions that come to my mind:

        - Use a naming convention that dictates the directory name (similar to PEAR). Disadvantage: doesn't scale too well, resulting in horrible class names.
        - Come up with some kind of pre-built array of the locations (Propel does this for its __autoload). Disadvantage: requires a rebuild before any deploy of new code.
        - Build the array "on the fly" and cache it. This seems to be the best solution, as it allows for any class names and directory structure you want, and is fully flexible in that new files just get added to the list. The concerns are: where to store it, and what about deleted/moved files. For storage we chose APC, as it doesn't have the disk I/O overhead. With regards to file deletes, it doesn't matter, as you probably don't want to require them anywhere anyway. As to moves... that's unresolved (we ignore it, as historically it didn't happen very often for us).

    Any other solutions?
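
    A sketch of the "build on the fly and cache" approach: scan the class directories once, store the class-to-file map in APC, and rebuild once when an unknown class turns up (covers newly deployed files). The directory paths and cache key are placeholders:

        <?php
        function build_class_map(array $dirs) {
            $map = array();
            foreach ($dirs as $dir) {
                $it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($dir));
                foreach ($it as $file) {
                    if ($file->isFile() && substr($file->getFilename(), -4) === '.php') {
                        // Convention: Customer.php contains class Customer.
                        $map[substr($file->getFilename(), 0, -4)] = $file->getPathname();
                    }
                }
            }
            return $map;
        }

        function __autoload($class) {
            static $map = null;
            $dirs = array('/path/to/app/classes', '/path/to/lib');
            if ($map === null && ($map = apc_fetch('class_map')) === false) {
                $map = build_class_map($dirs);
                apc_store('class_map', $map);
            }
            if (!isset($map[$class])) {
                // Unknown class: rebuild once in case new files were deployed.
                $map = build_class_map($dirs);
                apc_store('class_map', $map);
            }
            if (isset($map[$class])) {
                require $map[$class];
            }
        }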

    Read the article

  • Regularly update database without browser/user

    - by Chris M
    I currently have a MySQL database which I was hoping to use to store regularly updated data from a temperature sensor connected to the internet. I currently have a page that, when opened, will grab the current temperature and the current timestamp and add it as an entry to the database, but I was looking for a way to do that without me refreshing the page every 5 seconds. Detail: the data comes from an Arduino Ethernet, posted to an IP address. Currently, I'm using cURL to grab the data from the IP, add a timestamp and save it to the DB. Obviously it only updates when the page is refreshed (it uses PHP). Here is a live feed of the data - http://wetdreams.org.uk/ChrisProject/UI/live_graph_two.html TL;DR - Basically I need a middle man to grab the data from the IP and post it to a MySQL database. Edit: Thanks for all the advice. There might be a little bit of confusion; I'm looking for a solution that (ideally) doesn't require a computer to be on at all (other than the server containing the database). Since I'm looking to store data over long periods of time (weeks), I'd like to set it up and leave a script running on the server (or Arduino) that gets the temp and posts it to the database. In my head I would like to have a page on the server that automatically (without any browser open, or any other prompting other than a timer) calls a PHP script. Hope that clears things up!
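
    A sketch of such a middle man as a standalone PHP script: it polls the sensor and inserts each reading, and would normally be started by a scheduler (cron, a hosting control panel task, or once by hand and left looping, given the no-cron constraint). The sensor URL, credentials and table name are placeholders:

        <?php
        // Run from the command line: php collector.php
        $pdo  = new PDO('mysql:host=localhost;dbname=sensors', 'user', 'pass');
        $stmt = $pdo->prepare('INSERT INTO readings (temperature, recorded_at) VALUES (?, NOW())');

        while (true) {
            $raw = @file_get_contents('http://192.0.2.10/temperature');  // the Arduino's IP
            if ($raw !== false) {
                $stmt->execute(array((float) $raw));
            }
            sleep(5);  // poll interval in seconds
        }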

    Read the article

  • Mysql Query is not working in edited jTable code, why?

    - by Furkan Kadioglu
    I'm using this example: www.jtable.org. I've downloaded the jTable PHP version and then edited the script. The jTable simple version is working, but my edited version isn't. I can create a list, but I can't add a row; this code is causing problems. However, PHP doesn't display any error messages.

        else if($_GET["action"] == "create") {
            //Insert record into database
            $result = mysql_query("INSERT INTO veriler(bolge, sehir, firma, adres, tel, web) VALUES('" . $_POST["bolge"] . "', '" . $_POST["sehir"] . "', '" . $_POST["firma"] . "', '" . $_POST["adres"] . "', '" . $_POST["tel"] . "', '" . $_POST["web"] . "'");

            //Get last inserted record (to return to jTable)
            $result = mysql_query("SELECT * FROM veriler WHERE id = LAST_INSERT_ID();");
            $row = mysql_fetch_array($result);

            //Return result to jTable
            $jTableResult = array();
            $jTableResult['Result'] = "OK";
            $jTableResult['Record'] = $row;
            print json_encode($jTableResult);
        }

    What is the problem?
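
    One thing that stands out in the INSERT is the unterminated VALUES list (there is no closing parenthesis before the final quote), which would make mysql_query fail silently unless its return value is checked. A sketch of the inside of the same create branch with the parenthesis closed, the POST values escaped, and an explicit error check:

        // Inside the create branch of the same script:
        $fields = array('bolge', 'sehir', 'firma', 'adres', 'tel', 'web');
        $values = array();
        foreach ($fields as $f) {
            $values[] = "'" . mysql_real_escape_string($_POST[$f]) . "'";
        }

        $sql = "INSERT INTO veriler (bolge, sehir, firma, adres, tel, web)
                VALUES (" . implode(', ', $values) . ")";
        mysql_query($sql) or die('Insert failed: ' . mysql_error());

        // Get the last inserted record (to return to jTable).
        $result = mysql_query("SELECT * FROM veriler WHERE id = LAST_INSERT_ID();");
        $row = mysql_fetch_array($result);

        $jTableResult = array('Result' => 'OK', 'Record' => $row);
        print json_encode($jTableResult);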

    Read the article

  • CakePHP dropping session between pages

    - by DavidYell
    Hi, I have an application with multiple regions and various incoming links. The premise (well, it worked before) is that in the app_controller I break out these incoming links and set them in the session. So I have a huge beforeFilter() in my app_controller which catches these and sets two variables in the session, Viewing.region and Search.engine - no problem. The problem arises in that the session does not seem to be persistent across page requests. So for example, going to /reviews/write (userReviews/add) should have a session available which was set when the user arrived at the site, although it seems to have vanished! It would appear that unless $this->params is caught explicitly in the app_controller and a session variable written, it does not exist on other pages. So far I have tried swapping between storing the session in 'cake' and 'php', both of which seem to exhibit the same behaviour. I use 'php' as a default. My Session.timeout is '120', Session.checkAgent is false and Security.level is 'low', all of which should give the framework enough leniency to allow sessions the most room to live! I'm a bit stumped as to why the session seems to be either recreated or blanked when a new page is being requested. I have commented out the requestAction() calls to make sure that isn't confusing the session request object also, which doesn't seem to make a difference. Any help would be great, as I don't want to have to recode the site to pass all the various variables via parameters in the URL - that would suck, and it's worked before, thus switching on $this->Session->read('Viewing.region') in all my code!

    Read the article

  • How to retrieve content via .load() or $.get() with this line

    - by Sin
    Hello :) I posted a question a day or two ago about how to retrieve PHP via the Ajax method in this modal I was using. I kind of found out the right way to go about it, but there's still something I'm not doing right (obviously lol). Here's the section that's giving me the issues:

        jQuery('div that holds content').fadeIn(200).css({ 'width': Number( popWidth ) });
        $('').load('/something/somewhere/this #content');

    So, I'm using Safari and a local server (MAMP); when I check activity in my browser, it shows that it is loading the content with every click, AND the pop up pops up, but no content. When I simply retrieve content via a hidden div, of course, I get it. This is what I'm trying to avoid: right now I have that div in my footer stashed as hidden. I'd rather just make a call when it's needed, instead of loading it every single time a page is accessed. You can go here to see the whole script I posted in my last question: How to use ajax to show php in a modal pop up. Anyone have any idea? I read that .load() has the ability to grab specific content from a request, but I'm not sure of the major difference between that and $.get(). I've tried both, and I get the same results. I'm using WordPress, and WordPress's Ajax requests run smooth as ever, so I know it's not a local problem, it's my coding lol. OK... I'm done typing :)

    Read the article

  • htaccess: change DirectoryIndex priority to php and not html

    - by Jayapal Chandran
    On a production server there are index.html and index.php. By default, index.html is getting loaded. I want index.php to be the default script to load, and if index.php is not present then index.html can load. It is shared hosting, so we do not have access to the httpd.conf file. So I thought of creating a .htaccess file which would satisfy the above condition. What is the directive to include in the .htaccess file to do so?

    Read the article

  • Send 404 when requesting index.php through .htaccess?

    - by Daniel
    I've recently refactored an existing CodeIgniter application to use URL segments instead of query strings, and I'm using a RewriteRule in .htaccess to rewrite stuff to index.php:

        RewriteRule ^(.*)$ /index.php/$1 [L]

    My problem right now is that a lot of this website's pages are indexed by Google with a link to index.php. Since I made the change to use URL segments instead, I don't care about these Google results anymore and I want to send a 404 (no need to use 301 Moved Permanently; there have been enough changes, it'll just have to recrawl everything). To get to the point: how do I redirect requests to /index.php?whatever to a 404 page? I was thinking of rewriting to a non-existent file that would cause Apache to send a 404. Would this be an acceptable solution? How would the RewriteRule for that look? Edit: currently, existing Google results just cause the following error:

        An Error Was Encountered
        The URI you submitted has disallowed characters.

    Read the article

  • MySQL problem reconnecting (mysqld.exe) keeps giving error...

    - by Ronedog
    Need some guidance figuring out what went wrong. I've been using MySQL and phpMyAdmin for just under a year on my home computer while I develop a web app. 3 days ago I updated my Windows Vista with all the "wonderful" Microsoft updates, security patches, etc... and now it's broken. I tried uninstalling all the upgrades, but there are 4 of them I can't uninstall because Microsoft says they're "operating system" updates and can't be uninstalled. My system is: Windows Vista, PHP 5+, MySQL 5.1, Apache 2+. I can run my web app and it queries the database without any problems. However, when I run phpMyAdmin to get into the database I get an error: "mysqld.exe has stopped working" and phpMyAdmin crashes. I tried going to the command line for MySQL to do a mysqldump to back up my database, and it gives me an error "2013, could not connect to the server". If I restart the computer the web app will work again. Basically, PHP can query the database, but if I try to get at the database through phpMyAdmin or the command prompt, the mysqld.exe error occurs and knocks MySQL out. Any ideas what's going on here? Any ideas how to get around this to back up the db, so I can reinstall MySQL? I'm really lost where to start. I don't really know if the updates caused the problem, or if the 4 updates that can't be uninstalled are really the problem. Any tips will be appreciated. Thanks.

    Read the article

  • Dealing with image upload on server

    - by user1073320
    I have the following problem: I have a multi-step form where in one step the user uploads an image to the server and then, a few steps further on, supplies other information; when this information is invalid, no data should be committed and the image should also be deleted. I was thinking about the PHP session, but I've read here (PHP - Store Images in SESSION data?) that it is an inefficient way to do it. Every time you proceed a step in the form, the image is reloaded (in the session), and as somebody mentioned, "You will want it to only be as big as it needs to be and you need to delete it as soon as you don't need it because large pieces of information in the session will slow down the session startup." Here I have a question: will it slow down the session startup of the user who uploaded the file, or the sessions of all users? I have to mention that I'm looking for a solution that doesn't rely on operating system scripts (cron etc.) - I have no permission to run such scripts. The perfect solution for me would be: saving the image on disk (for example in some folder named after the session id), then after the last step of the form moving or deleting this image depending on form validation. If the user unexpectedly destroys the session (for example by closing the browser), the folder with the image should of course be deleted. In a nutshell, I need something like a callback for the "session is being destroyed" event.
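
    A sketch of that idea: park the upload in a folder named after the session id, move it into place only when the final step validates, and otherwise remove it. For the "session is being destroyed" callback, a custom handler registered with session_set_save_handler() can delete the leftover folder in its destroy/gc callbacks; note that closing the browser triggers nothing on the server by itself, so cleanup only happens when the session is destroyed or garbage-collected. Paths below are placeholders:

        <?php
        session_start();
        $tmpDir = '/tmp/uploads/' . session_id();   // per-session scratch folder

        // --- upload step ---
        if (!is_dir($tmpDir)) {
            mkdir($tmpDir, 0700, true);
        }
        move_uploaded_file($_FILES['image']['tmp_name'], $tmpDir . '/image');

        // --- final step (a later request) ---
        if ($formIsValid) {
            rename($tmpDir . '/image', '/var/www/uploads/' . uniqid() . '.img');
        } else {
            @unlink($tmpDir . '/image');   // invalid form: discard the image
        }
        @rmdir($tmpDir);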

    Read the article

  • MongoDB PHP driver install failed (Windows)

    - by terence410
    I attempted to download the php_mongo.dll from this link (http://github.com/mongodb/mongo-php-driver/downloads) and added the extension to PHP. However, it doesn't work at all. My PHP version is 5.2.x; Windows: Windows 7 64-bit. I have also tried to use pecl install mongo, but it returned an error like: "The DSP mongo.dsp does not exist." Could somebody help?

    Read the article
