Search Results

Search found 975 results on 39 pages for 'uploads'.

Page 12/39 | < Previous Page | 8 9 10 11 12 13 14 15 16 17 18 19  | Next Page >

  • Can this code cause a "500" Internal Server Error?

    - by Scott B
    A few of my customers are reporting that they are getting "500" Internal Server errors lately. I believe it might be caused by various plugins they are using, but each time, the hosting companies (multiple hosts) say that the .htaccess file had to be replaced to fix the issue. I'm submitting the code below from my custom theme because it's the only place where I trigger an .htaccess write, and I want to be sure that there are no problems here that could contribute to the 500 errors:

        if (file_exists(ABSPATH.'/wp-admin/includes/taxonomy.php')) {
            require_once(ABSPATH.'/wp-admin/includes/taxonomy.php');
            if (get_option('permalink_structure') !== "/%postname%/" || get_option('mycustomtheme_permalinks') !== "/%postname%/") {
                $mycustomtheme_permalinks = get_option('mycustomtheme_permalinks');
                require_once(ABSPATH . '/wp-admin/includes/misc.php');
                require_once(ABSPATH . '/wp-admin/includes/file.php');
                global $wp_rewrite;
                $wp_rewrite->set_permalink_structure($mycustomtheme_permalinks);
                $wp_rewrite->flush_rules();
            }
            if (!get_cat_ID('topMenu')) { wp_create_category('topMenu'); }
            if (!get_cat_ID('hidden'))  { wp_create_category('hidden'); }
            if (!get_cat_ID('noads'))   { wp_create_category('noads'); }
        }
        if (!is_dir(ABSPATH.'wp-content/uploads')) {
            mkdir(ABSPATH.'wp-content/uploads');
        }
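
    If the worry is that this block runs on every page load, note that $wp_rewrite->flush_rules() rewrites .htaccess each time the permalink condition matches. One common pattern is to run the setup once and remember that it ran. A minimal, hedged sketch of that idea (the option name 'mycustomtheme_setup_done' and the after_switch_theme hook are assumptions, not part of the original theme):

        // Sketch: run the theme setup once, so .htaccess is only written a single time.
        function mycustomtheme_run_setup_once() {
            if (get_option('mycustomtheme_setup_done')) {
                return; // already done; never touch .htaccess again
            }
            $permalinks = get_option('mycustomtheme_permalinks');
            if ($permalinks && get_option('permalink_structure') !== $permalinks) {
                global $wp_rewrite;
                $wp_rewrite->set_permalink_structure($permalinks);
                $wp_rewrite->flush_rules(); // writes .htaccess only this one time
            }
            // wp_mkdir_p() creates the directory recursively and simply returns false on failure.
            wp_mkdir_p(WP_CONTENT_DIR . '/uploads');
            update_option('mycustomtheme_setup_done', 1);
        }
        add_action('after_switch_theme', 'mycustomtheme_run_setup_once');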

    Read the article

  • Inserting JavaScript with jQuery .html()

    - by Andrew Appleby
    I'm experiencing an issue with a website I'm working on, where I originally believed I could simply replace the content passed to $("#main").html('THIS'), which was originally code for a Flash object, with a new and improved HTML5/JavaScript version. The original code:

        $("#main").html('<div style="height: 100%; width: 100%; overflow: hidden;"><object width="100%" height="100%" codebase="http://fpdownload.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=8,0,0,0" classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000">\
            <param value="images/uploads/'+image_id+'.swf" name="movie">\
            <param value="true" name="allowFullScreen">\
            <param value="#737373" name="bgcolor">\
            <param value="" name="FlashVars">\
            <embed width="100%" height="100%" flashvars="" bgcolor="#737373" allowfullscreen="true" src="images/uploads/'+image_id+'.swf" type="application/x-shockwave-flash" pluginspage="http://www.macromedia.com/go/getflashplayer">\
            </object></div>\
            ');
        });

    And my (failed) attempt at inserting my new code:

        $("#main").html('<script type="text/javascript">pano=new pano2vrPlayer("container");skin=new pano2vrSkin(pano);pano.readConfigUrl("xml/tablet_'+image_id+'.xml");hideUrlBar();</script>');

    It doesn't even work when I just put , so I know it's got to be the JavaScript itself. I've looked at the solutions out there, but I can't make sense of how to properly implement them here in the most efficient way. Your help is much appreciated, thanks!

    Read the article

  • Azure batch operations to delete several blobs and tables

    - by reft
    I have a function that deletes every table entity & blob that belongs to the affected user:

        CloudTable uploadTable = CloudStorageServices.GetCloudUploadsTable();
        TableQuery<UploadEntity> uploadQuery = uploadTable.CreateQuery<UploadEntity>();
        List<UploadEntity> uploadEntity =
            (from e in uploadTable.ExecuteQuery(uploadQuery)
             where e.PartitionKey == "uploads" && e.UserName == User.Identity.Name
             select e).ToList();

        foreach (UploadEntity uploadTableItem in uploadEntity)
        {
            // Delete table entity
            TableOperation retrieveOperationUploads = TableOperation.Retrieve<UploadEntity>("uploads", uploadTableItem.RowKey);
            TableResult retrievedResultUploads = uploadTable.Execute(retrieveOperationUploads);
            UploadEntity deleteEntityUploads = (UploadEntity)retrievedResultUploads.Result;
            TableOperation deleteOperationUploads = TableOperation.Delete(deleteEntityUploads);
            uploadTable.Execute(deleteOperationUploads);

            // Delete blob
            CloudBlobContainer blobContainer = CloudStorageServices.GetCloudBlobsContainer();
            CloudBlockBlob blob = blobContainer.GetBlockBlobReference(uploadTableItem.BlobName);
            blob.Delete();
        }

    Each entity has its own blob, so if the list contains 3 UploadEntities, the 3 table entities and the 3 blobs will be deleted. I heard you can use table batch operations to reduce cost and load. I tried it, but failed miserably. Anyone interested in helping me? :) I'm guessing table batch operations are for tables only, so it's a no-go for blobs, right? How would you add TableBatchOperations to this code? Do you see any other improvements that could be made? Thanks!

    Read the article

  • Joomla ImageBrowser Lightbox not working

    - by jmorhardt
    I've been using an image browser for Joomla called ImageBrowser. It's great - easy uploads, easy for clients to use, and it even handles zip file uploads. I have Joomla 1.5.15 installed and the newest version of this plugin on 3 or 4 of our sites with no fuss or issues. Recently, the ImageBrowser on one of our sites started acting strangely: the Lightbox effect we should have had disappeared completely. I compared the settings for the site to another and found they were identical. I couldn't find a solution in the documentation or forums. Here's the URL to look at: http://neda.us/photos?view=gallery&folder=BSU+12-5-09. You can compare that to another of our sites with the same plugin and settings at South Oak Floors dot COM. You should be able to click on a thumbnail and get a full-size view of the image in a Lightbox. Any help much needed and much appreciated.

    Read the article

  • Google Maps Controls panel size is displayed wrong

    - by Andrea Giachetto
    I have a weird problem with the Google Maps controls. I've tested my custom map with custom markers in a blank page and everything seems to be OK, including the control panel. When I tried to import all my code into the page I'm working on (I use a full-screen fluid grid system), the control panel is displayed at a strange size. I tried everything to disable/enable the Google Maps UI, but the problem remains. The code for my map is exactly the same in my blank page and in my site, but in the site the UI control panel is displayed very strangely. Here's the code:

        <div id="map_canvas2" style="height: 580px; width: 100%;"></div>
        <script>
            var image = 'path/to/your/image.png';
            var mapOptions = {
                zoom: 17,
                center: new google.maps.LatLng(45.499290, 12.621510),
                mapTypeId: google.maps.MapTypeId.ROADMAP,
                scrollwheel: false
            }
            var map = new google.maps.Map(document.getElementById('map_canvas2'), mapOptions);
            var myPos = new google.maps.LatLng(45.499290, 12.621510);
            var myMarker = new google.maps.Marker({
                position: myPos,
                map: map,
                icon: 'http://www.factory42.it/jtaca/wordpress/wp-content/uploads/2014/06/pin-map.png'
            });
        </script>
        </div>

    Here's an image: http://www.factory42.it/jtaca/wordpress/wp-content/uploads/2014/06/img-maps.png

    Read the article

  • Extra space in Opera

    - by HiveHicks
    I'm trying to put a progress bar inside a td of my table. Here's the code:

        <td style="width: 150px; max-height: 16px;">
            <div style="height: 16px; max-height: 16px; overflow: hidden; border: 1px solid #80C622;">
                <div style="height: 16px; width: 10%; background-color: #bbea7d;"></div>
                <div style="margin-top: -16px; text-align: center;">
                    1470/14166
                </div>
            </div>
        </td>

    Chrome, Firefox, Safari and (!) IE display it correctly, whereas Opera extends the row so there is some extra space above. Here's how it's supposed to look: http://ipicture.ru/uploads/100616/16el6B3lB1.png Here's how it looks in Opera: http://ipicture.ru/uploads/100616/fE4Ad63N1l.png

    Read the article

  • Accessing the Request object causes ReadEntityBody to return 0 (in an HttpHandler class)

    - by EBAG
    I created an HttpHandler that successfully implements IHttpHandler for handling file uploads. It works perfectly fine: you send the file with the form, the class receives it and saves it to hard disk. It reads chunks of the file with the ReadEntityBody function of the HttpWorkerRequest class. Here is the situation I'm faced with: if, at any stage before trying to read the file with ReadEntityBody, I try to access the Request object (even Request.InputStream.Length!), ReadEntityBody returns 0, meaning it won't read from the file stream. After further testing I found out the reason behind it: accessing the Context.Current.Request object triggers some sort of functionality that causes ASP.NET to handle the file upload at that moment on its own! I believe this is a bug. For example, exactly after this line of code, ASP.NET will upload the file completely, and so there will be no stream for ReadEntityBody to read from later:

        int FileSize = context.Request.InputStream.Length;

    Can anybody tell me how to stop this?

    Read the article

  • A stupid question about WordPress and PHP

    - by bubdada
    It may seem like a stupid question, but I've got a serious problem... If you check out orcik.net, the thumbnail images do not appear. I figured out the reason but I don't know how to solve it:

        http://orcik.net/projects/thumb/orcikthumb.php?src=http://orcik.net/wp-content/uploads/2010/05/mac-safari-search-cache.png

    If you go to the above link you will get a "page not found" error. However, if you go to the link below you'll get the thumbnail version of the image:

        http://orcik.net/projects/thumb/orcikthumb.php?src=/wp-content/uploads/2010/05/mac-safari-search-cache.png

    I'm using this piece of code in WordPress, and the line looks like:

        <a href="<?php the_permalink() ?>" rel="bookmark">
            <img src="<?php bloginfo('template_directory'); ?>/includes/orcikthumb.php?src=<?php get_thumbnail($post->ID, 'full'); ?>&amp;h=<?php echo get_theme_mod($height); ?>&amp;w=<?php echo get_theme_mod($width); ?>&amp;zc=1" alt="<?php the_title(); ?>" />
        </a>

    Thus, I believe I can't change the directory of the image. But I could not figure out why I am getting the "page not found" error. Might it be CHMODs, or something else? Thanks
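
    Since the site-relative form of the src works while the absolute URL 404s, one workaround on the template side is to strip the blog's own URL before building the query string and URL-encode what remains. A rough sketch of that idea, assuming get_thumbnail() echoes the absolute image URL as the template above suggests (the helper name orcik_thumb_src() is made up for illustration):

        // Sketch: build the thumbnail src as a site-relative, URL-encoded path.
        function orcik_thumb_src($post_id) {
            ob_start();
            get_thumbnail($post_id, 'full');      // captures e.g. http://orcik.net/wp-content/uploads/...
            $url = trim(ob_get_clean());

            // Drop the scheme and host so only /wp-content/uploads/... is passed along.
            $path = str_replace(get_bloginfo('url'), '', $url);

            return urlencode($path);              // safe to embed in ?src=...
        }

    The template would then build the img src with ?src=<?php echo orcik_thumb_src($post->ID); ?> instead of echoing the full URL.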

    Read the article

  • How can I make this jQuery plugin chainable after all image load events have completed?

    - by BumbleB2na
    [UPDATE] Solution I decided on: passing a callback into the plugin takes care of firing an event once all images have completed loading. Chaining is also still possible. Updated Fiddle. I am building a chainable jQuery plugin that can load images dynamically. (View the following code as a JSFiddle.) HTML:

        <img data-img-src="http://www.lubasf.com/blog/wp-content/uploads/2009/03/gnome.jpg" style="display: none" />
        <img data-img-src="http://buffered.io/uploads/2008/10/gnome.jpg" style="display: none" />

    Instead of adding a src attribute, I give these images a data-img-src attribute. My plugin uses its value to fill the src. Also, these images are hidden to begin with. jQuery plugin:

        (function(jQuery) {
            jQuery.fn.loadImages = function() {
                var numToLoad = jQuery(this).length;
                var numLoaded = 0;
                jQuery(this).each(function() {
                    if (jQuery(this).attr('src') == undefined) {
                        return jQuery(this).load(function() {
                            numLoaded++;
                            if (numToLoad == numLoaded) return this; // attempt at making this plugin
                                                                     // chainable, after all .load()
                                                                     // events have completed.
                        }).attr('src', jQuery(this).attr('data-img-src'));
                    } else {
                        numLoaded++;
                        if (numToLoad == numLoaded) return this; // attempt at making this plugin
                                                                 // chainable, after all .load()
                                                                 // events have completed.
                    }
                });
                // this works if uncommented, but returns before all .load() events have completed
                //return this;
            };
        })(jQuery);

        // I want to chain a .fadeIn() after all images have completed loading
        $('img[data-img-src]').loadImages().fadeIn();

    Is there a way to make this plugin chainable, and have my fadeIn() happen after all images have loaded?

    Read the article

  • Handler for (null) returned invalid result code 70007 / causing error 500?

    - by Sherif Buzz
    I am getting these errors on some pages of my site (PHP/Apache/Linux/MySQL VPS) at intervals and can't seem to find a reproducible scenario:

        Handler for (null) returned invalid result code 70007
        Handler for (null) returned invalid result code 70014

    It occurs mainly on pages where file (image) uploads are done, and it then causes a 500 error. Google hasn't returned anything conclusive; has anybody come across these errors?
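
    One way to gather more evidence while this is happening is to log what PHP actually receives for each POST on the upload pages, so the intermittent 500s can be correlated with aborted or over-limit uploads. A small diagnostic sketch, assuming the upload handler is a PHP script (the log path is an arbitrary example):

        // Diagnostic sketch: record what PHP sees for each POST on the upload pages.
        if ($_SERVER['REQUEST_METHOD'] === 'POST') {
            $files = array();
            foreach ($_FILES as $field => $f) {
                $files[$field] = array('error' => $f['error'], 'size' => $f['size']);
            }
            $line = sprintf(
                "[%s] content-length=%s post_max_size=%s upload_max_filesize=%s files=%s\n",
                date('c'),
                isset($_SERVER['CONTENT_LENGTH']) ? $_SERVER['CONTENT_LENGTH'] : 'n/a',
                ini_get('post_max_size'),
                ini_get('upload_max_filesize'),
                json_encode($files)   // or print_r($files, true) on older PHP without json
            );
            error_log($line, 3, '/tmp/upload-debug.log');   // message_type 3 appends to a file
        }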

    Read the article

  • Uploading via HTTP POST (multipart/form-data) silently fails with big files

    - by matteo
    When uploading multipart/form-data forms via an HTTP POST request to my Apache web server, very big files (e.g. 30MB) are silently discarded. On the server side it looks as if the attached file was received with a size of 0 bytes. On the client side it looks like it was uploaded successfully (it takes the expected long time to upload and the browser gives no error message). On the server, nothing is logged in the error log. An entry is logged in the access log as if everything were OK (a POST request and a 200 OK response). These uploads are being posted to a PHP script. In the PHP script, if I print_r() $_FILES, I see the following information for the relevant file:

        [file5] => Array
            (
                [name] => MOV023.3gp
                [type] => video/3gpp
                [tmp_name] => /tmp/phpgOdvYQ
                [error] => 0
                [size] => 0
            )

    Note both [error] = 0 (which should mean no error) and [size] = 0 (as if the file were empty). My PHP script runs fine and receives all the rest of the data except these files. move_uploaded_file succeeds on these files and actually copies them as 0-byte files. I've already changed the PHP directives max_upload_size to 50M and post_max_size to 200M, so neither the single file nor the request exceeds any size limit. max_execution_time is not relevant, because the time to transfer the data does not count; and I've increased max_input_time to 1000 seconds, though this shouldn't be necessary since that is the time taken to parse the input data, not the time taken to upload it. Is there any Apache configuration, upstream of PHP, that could be causing these files to be discarded even before PHP execution? Some limit on size or on upload time? I've read about a default 300-second timeout limit, but that should apply to the time the connection is idle, not the time spent actually transferring data, right? Needless to say, uploads under exactly identical conditions (including file format, client and everything) except a smaller file size work seamlessly, so the issue is clearly related to the file or request size, or to the time it takes to send it.
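
    For what it's worth, the standard php.ini directives are upload_max_filesize and post_max_size (there is no max_upload_size), and when the request body exceeds post_max_size PHP typically arrives with empty $_POST and $_FILES rather than a per-file error code. A hedged sketch for confirming what the running PHP actually has in effect and what it reports per file (the shorthand-to-bytes helper is illustrative):

        // Sketch: report the effective upload limits and the per-file upload status.
        // ini_to_bytes() is an illustrative helper for shorthand values like "50M".
        function ini_to_bytes($v) {
            $v = trim($v);
            $unit = strtolower(substr($v, -1));
            $n = (float) $v;
            if ($unit === 'g') return (int) ($n * 1024 * 1024 * 1024);
            if ($unit === 'm') return (int) ($n * 1024 * 1024);
            if ($unit === 'k') return (int) ($n * 1024);
            return (int) $n;
        }

        echo 'upload_max_filesize = ', ini_get('upload_max_filesize'),
             ' (', ini_to_bytes(ini_get('upload_max_filesize')), " bytes)\n";
        echo 'post_max_size       = ', ini_get('post_max_size'),
             ' (', ini_to_bytes(ini_get('post_max_size')), " bytes)\n";
        echo 'max_input_time      = ', ini_get('max_input_time'), "\n";

        foreach ($_FILES as $field => $f) {
            // error 0 with size 0 usually means the request body was cut short upstream
            // (client abort, a proxy limit, or an Apache-level limit such as LimitRequestBody).
            printf("%s: error=%d size=%d\n", $field, $f['error'], $f['size']);
        }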

    Read the article

  • FTP Server vsftpd change ftp:nogroup

    - by pygorex1
    I'm running vsftpd using the Debian Lenny package. ftp:nogroup is the user/group that uploads files and owns the uploaded files. However, a problem is arising: another process is also writing files to the FTP directory as myprocess:mygroup with restrictive file permissions, preventing vsftpd from overwriting the myprocess-authored files. Is it possible to tell vsftpd to use a different user/group for uploading files (preferably myprocess:mygroup or ftp:mygroup)?

    Read the article

  • Software to run a mozy/carbonite "online backup" server?

    - by chris
    I am interested in creating a server that is similar to Mozy/Carbonite, but I want to run it myself. So I will need server software of some sort (we run Ubuntu), and client software that runs in the background and uploads data from specified folders as it changes. Basically I want to run the equivalent of Mozy or Carbonite for an internal corporate network. The clients are all Windows XP.

    Read the article

  • setting upload/space limit for FTP client

    - by tombull89
    As part of my web hosting plan I've also got a few domains hosted for family and friends. I have my own FTP account with full access, and I would like to give FTP access to a web designer and a user. Is there a way to restrict uploads larger than a certain size, or to set a size limit on the folder that the FTP account has access to? I'd rather not let them upload ridiculously big files. Thanks.

    Read the article

  • Google Search Appliance crawling issue

    - by kourosh
    I am working on a Google box (a Google Search Appliance), something like this: http://mytwentyfive.com/blog/wp-content/uploads/byme/Google%20Search%20Appliances.jpg. I am pointing the crawler to a folder that contains HTML files. Previously the crawler was crawling the files and indexing them, but right now it finds the pattern (the folder) yet does not follow any of the HTML files within the folder. I have tried everything I could think of, but I'm out of ideas. Can someone help? Thanks.

    Read the article

  • Scaling a video processing application on EC2?

    - by Stpn
    I am approaching the need to scale a video-processing application that runs on EC2. So far the setup is one machine:

        Backbone.js frontend
        Rails 3.2
        PostgreSQL
        Resque + S3 for storage

    The flow of the app is as follows:

        1) Request from frontend. Upload a video.
        2) Store the video.
        3) Query external APIs.
        4) Process / encode the video.
        5) Post back to the frontend.

    I can separate the backend and frontend without any problems, but when it comes to distributing the backend between several servers I am a bit puzzled. I can probably come up with a temporary solution (like just duplicating the app across several instances), but since I don't really have expertise in backend system administration, there could be some fundamental mistakes. Also I would rather have something that is scalable. I wonder if anyone can give some feedback on the following plan:

        A) Frontend machine. Just the frontend; talks to the backend via a REST API of sorts.
        B) Backend server (BS), main database. Gets requests from 1), posts to 2), saves uploads to 3).
        C) S3 storage.
        D) Server for querying APIs. Basically just Resque workers that post info back to 2).
        E) Server for video encoding. Processes videos uploaded to 3) and uploads them back.

    So I will have:

              A) frontend
                   \
                    \
              B) MAIN_APP/DB ----- C) S3 Storage (Files)
                 /      \             /
                /        \           /
        D) ExternalAPI_queries    E) Video_Processing
           (redundant DB)            (redundant DB)

    All of this will supposedly talk to each other via HTTP requests. My reason for this split is that the video-processing part is by far the most resource-intensive, and I would just run a bare-bones application that accepts requests and starts processing them. Questions:

        1) In this setup I will have the main database at B) and all other servers will communicate with it via HTTP requests (and also store duplicate databases, I guess, for safety reasons). Is that the right approach, or should I have one database that everyone connects to (and if so, how)?
        2) Is it a good idea to separate the API queries from the video-processing part? Logically they are very close (processing is determined by the result of the API queries), but resource-wise video processing is way more intensive.
        3) What should I use to distribute calls between the backend apps based on load?

    Read the article

  • flat-rate backup service for Windows Server

    - by Colin Pickard
    Hi, Does anyone know if there is a decent flat-rate backup service which supports Windows Server? I've investigated the following:

        Backblaze - no WS support, sales say they have a "no server" policy
        JungleDisk - not flat rate
        Mozy - no WS on regular edition, no flat-rate on Pro edition
        Dropbox - no flat rate
        Carbonite - technically flat rate, but throttles uploads to modem speeds

    EDIT: Very similar question: Is there a decent flat-rate online backup solution for Linux machines?

    Read the article

  • My internet speed became slow at night

    - by FrozenKing
    My internet plan is 512 kbps unlimited, and I get an average speed of 64 kbps, but at night I used to get a speed of 112 kbps. Recently my night speed has dropped to the same level as during the day. As I understand it, there is usually less traffic at night, so I should be getting good speed like before. Because of the good night speed I do my downloading and uploading at night, and my average download + upload per month is 60 GB or 70 GB. Is my ISP putting a restriction on my downloads and uploads? I am confused.

    Read the article

  • Site getting blocked on corporate networks

    - by ajay
    Hi there, our site is getting blocked on corporate networks across various companies, but not all of them. We would appreciate help on what the reasons for this could be and how we can resolve it. Attached are a couple of snapshots that customers have sent us from the companies where the site is getting blocked: http://www.freeimagehosting.net/uploads/2ca59a8460.jpg

    Read the article

  • How can I read out the internal PDF creation/modified date with Windows PowerShell?

    - by Martin
    PDF files seem to have a separate set of file properties which contain (among others) a creation date and a modified date (see a screenshot here: http://ventajamarketing.com/writingblog/wp-content/uploads/2012/02/Acrobat-Document-Properties1-300x297.png). Those dates can obviously differ from the creation and modified dates shown in the file system (Windows Explorer). How can I access the date information inside the PDF file and read it out on Windows 7 with Windows PowerShell (or maybe another method)?

    Read the article

  • htaccess rewrite?

    - by flyenig
    I have a script that uploads images, creates a hash for each one, creates 3 directories, and stores the image at imgs/f3s/v5g/234/536_f3sv5g2344270fd093ee8a9bf8de3de32dad.jpg (the "536_" is the user ID). So I'm trying to turn

        imgs/f3s/v5g/234/536_f3sv5g2344270fd093ee8a9bf8de3de32dad.jpg

    into

        user_pics/536/536_f3sv5g2344270fd093ee8a9bf8de3de32dad.jpg

    How can I do that? I want that when someone views the photo, they see the new directory in the URL, not the one with 3 subdirectories.
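
    Assuming the naming scheme above (the three subdirectories are just the first nine characters of the hash split into groups of three, and the filename keeps the user-ID prefix), the friendly URL contains everything needed to rebuild the real path, so a small PHP front controller can serve the file. This is only a sketch: the script name user_pics.php, its p parameter, and the single rewrite line that would route user_pics/ requests to it are all made up for illustration.

        <?php
        // user_pics.php - sketch of a front controller for the friendly image URLs.
        // An .htaccess line such as
        //     RewriteRule ^user_pics/(.+)$ user_pics.php?p=$1 [L,QSA]
        // would route user_pics/536/536_f3sv5g234...jpg requests here.

        $p = isset($_GET['p']) ? $_GET['p'] : '';

        // Expect "{uid}/{uid}_{hash}.{ext}"; capture the first three 3-character
        // chunks of the hash to rebuild imgs/{aaa}/{bbb}/{ccc}/{uid}_{hash}.{ext}.
        if (!preg_match('#^(\d+)/\1_((\w{3})(\w{3})(\w{3})\w+\.(?:jpe?g|png|gif))$#', $p, $m)) {
            header('HTTP/1.0 404 Not Found');
            exit;
        }

        $real = __DIR__ . "/imgs/{$m[3]}/{$m[4]}/{$m[5]}/{$m[1]}_{$m[2]}";

        if (!is_file($real)) {
            header('HTTP/1.0 404 Not Found');
            exit;
        }

        header('Content-Type: image/jpeg');   // adjust (or detect) for png/gif
        header('Content-Length: ' . filesize($real));
        readfile($real);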

    Read the article
