Search Results

Search found 4900 results on 196 pages for 'upload'.

Page 126 of 196

  • Apache conf for high-traffic CMS with backend users?

    - by Annan
    I'm in a situation where a website is going to have a high number of web users and a few backend webmasters. The webmasters upload images (and run other memory-heavy tasks), and this bumps the memory allocation of the httpd child processes up to 100-150 MB. To stop the server from swapping, I'm currently setting MaxClients in httpd.conf to 20, but this lowers the maximum number of simultaneous requests. Will this be a problem when the website goes live? What is the best configuration? Info: Drupal 6, PHP 5, Apache 2.2 (prefork at the moment). I'm thinking about the worker MPM, two Apache instances, or a low MaxRequestsPerChild.
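    A minimal sketch of the low-MaxRequestsPerChild option for httpd.conf; the numbers are illustrative assumptions, not tuned values. Recycling children quickly retires a process that ballooned during an image upload instead of letting it hold 150 MB for the rest of its life:

        # Apache 2.2, prefork MPM -- illustrative values only
        <IfModule prefork.c>
            StartServers          5
            MinSpareServers       5
            MaxSpareServers      10
            MaxClients           20    # roughly free RAM / worst-case child size
            MaxRequestsPerChild 200    # retire children before upload bloat accumulates
        </IfModule>

    The two-instance variant follows the same logic: keep the heavy webmaster uploads on their own Apache instance (or behind a lightweight reverse proxy), so their memory use never dictates MaxClients for anonymous traffic.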


  • Clipboard app that automatically saves to a web share folder?

    - by cyberpine
    I currently use SnagIt and have used the Windows 7 Snipping Tool. These tools let you copy a piece of your screen to the clipboard and then paste it into other applications on your desktop. They work with Outlook, which is really useful because you don't have to save the image just to do a write-up or send an email. The problem is that you can't paste clipboard images into web forms. Gmail's web client does have an image-insert feature, but it only works with images copied from the web, since it extracts the full URL from your copy; it does not work with images in your local clipboard. Does anybody know of an application that lets you clip and save directly to a web folder, maybe something that replaces your clipboard contents with the URL of the saved image's location? It would be awesome if the Picasa web client or Evernote allowed actual clipboard pasting; instead they ask for a file to upload.


  • About Web Server

    - by Chathura JAyanath
    I develop web sites, and I'm hoping to host my site on my own web server. I have a Windows Server 2008 machine with both Apache and IIS installed, and I host the site on IIS, where it is visible on the local area network. Now I want to open it to the world. I have my own URL (www.myurl.lk), and the domain name provider is asking for name server 1 and name server 2, but I don't know how to find my name servers. Can anyone help me with this? I've already tried uploading the web site to IIS, but it doesn't work; it can only be seen from the local area network.
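    A quick sketch of how the pieces fit, hedged because the exact steps depend on the registrar: the name servers belong to whoever hosts the domain's DNS (often the registrar's own free DNS service), not to the IIS box itself. You point the registrar at those name servers, create an A record there for www pointing at your connection's public IP, and forward port 80 on your router to the server. You can check which name servers a domain currently uses from any machine:

        nslookup -type=NS myurl.lk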


  • TeamViewer display is slow to refresh

    - by brunoais
    I'm using TeamViewer from time to time to connect my laptop to my desktop computer when I need to check on something (personal use). The main thing I've noticed is that it is quite slow at transferring the screen updates. This happens most when I move whole windows, open a new window, or watch a small movie clip. The lowest maximum p2p speed I've measured between the two machines in the last week is the desktop's upload speed of 6 MB/s, with a ping of 40 ms (usually ~15 ms). With that I was expecting a very sharp image and fast transfer speeds, which is not what is happening. Just to clarify, no individual CPU core on either computer reaches its peak (25%) while TeamViewer is active, so encryption and decryption are not the cause. Why is the connection between the two computers so slow?


  • How does CloudFront work?

    - by Dharmik Bhandari
    I'm planning to use Amazon's CDN (content delivery network), CloudFront, from ASP.NET MVC 3 with C#. I've googled it, but I'm still a bit confused about a few things. Do I have to upload all static resources to the CDN first before I can use them, or can Amazon crawl the site's static resources from a predefined folder or directory on its own? And does Amazon automatically update its copies when we change a static resource, or do we have to upload the updated resources to the CDN every time?
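    A hedged sketch of the usual answers: CloudFront is a pull CDN, so nothing has to be uploaded to it. A distribution is pointed at an origin (an S3 bucket or the site itself), CloudFront fetches each object from the origin on first request, and it serves the cached copy until the TTL expires. It does not notice origin changes immediately, so the common pattern is to version asset URLs. The helper below is an illustration with placeholder names, not an official API, and it assumes the distribution is configured to forward query strings (otherwise put the version in the file name):

        // "d1234.cloudfront.net" and the version token are placeholders
        public static class Cdn
        {
            private const string Host = "https://d1234.cloudfront.net";
            private const string Version = "20120401";   // bump on each deploy

            public static string Asset(string path)
            {
                // a changed token makes CloudFront fetch the asset afresh
                return string.Format("{0}/{1}?v={2}", Host, path.TrimStart('/'), Version);
            }
        }

    In a Razor view: <link rel="stylesheet" href="@Cdn.Asset("Content/site.css")" />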


  • Find & Replace in Blogger

    - by Benjol
    (I just learnt that Yahoo is shutting down GeoCities. This is a bummer, as I used my account there to host two icons back in the days when Blogger couldn't upload GIFs, and those two icons are now referenced in every one of my 600+ posts.) Blogger doesn't have a "find & replace in all posts" function. I'm wondering about using the API to hack something together, and wondered if anyone has already played with the API, and whether there are any tips or things to avoid. Also, is there a sneaky shortcut (like switching the blog to FTP publishing, doing the replace locally, then switching back)? Not 100% programming related, so please be lenient, or at least answer my question before closing it!
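    Before touching the write side of the API, a read-only pass over the blog's public Atom feed can at least inventory the affected posts; actually updating them then needs the authenticated Blogger API. A rough Python sketch, with the feed and icon URLs as placeholders:

        import urllib.request
        import xml.etree.ElementTree as ET

        FEED = "http://myblog.blogspot.com/feeds/posts/default"   # placeholder blog
        OLD = "http://www.geocities.com/me/icon.gif"              # placeholder icon URL
        ATOM = "{http://www.w3.org/2005/Atom}"

        start = 1
        while True:
            url = "%s?start-index=%d&max-results=25" % (FEED, start)
            tree = ET.parse(urllib.request.urlopen(url))
            entries = tree.findall(ATOM + "entry")
            if not entries:
                break
            for e in entries:
                # print titles of posts whose body still references the old icon
                if OLD in (e.findtext(ATOM + "content") or ""):
                    print(e.findtext(ATOM + "title"))
            start += len(entries)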


  • Posting videos to a website... with a dynamically updating interface

    - by Johnny W
    I'm currently putting together a website for my Luddite friend. He needs a blog, photos and videos, all easily updatable by him. WordPress and Blogger both have brilliant functionality for hosting your own blog on your own domain, but what about photos and videos? Do YouTube, Google Video, Vimeo, MSN Video or anyone else offer an easy way to embed a video-navigation interface on your own site (something similar to YouTube's "channel" pages, except on your own domain)? The same goes for photos; I'm pretty sure Flickr doesn't let you easily upload and manage photos for viewing on your own domain. Does anyone else? My friend doesn't want to edit any HTML, and I don't want to create a complicated system to handle tons of different videos. There must be a user-friendly solution in this day and age. Thanks for any help!
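    For single videos, the standard approach is YouTube's embed player, which works on any domain; VIDEO_ID below is a placeholder. A whole-channel browsing interface on your own domain generally still means pulling the channel's feed and rendering it yourself:

        <!-- minimal embed sketch; replace VIDEO_ID with a real video's id -->
        <iframe width="560" height="315"
                src="https://www.youtube.com/embed/VIDEO_ID"
                frameborder="0" allowfullscreen></iframe>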


  • How can I join two simple home networks together using an ethernet cable?

    - by Ilia Jerebtsov
    I want to join two different home networks together like so:

        PC A1   PC A2                        PC B1   PC B2
            \   /                                \   /
         Gateway A <---- ethernet cable ----> Gateway B
             |                                    |
        ADSL modem A                        ADSL modem B

    Both networks are basic residential setups with identical configuration, and all PCs run Vista/7. The point is to temporarily join two apartments in a building for gaming and file sharing. The holy grail would be: PCs on network A can access PCs on network B and vice versa (file shares and gaming); each network uses its own internet connection; data between the networks doesn't take a trip through the internet (broadband upload speeds are severely capped); and a network's internet access keeps working if the joining cable is disconnected, with minimal configuration changes. How closely can this be achieved?
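    One layout that gets close, sketched with illustrative addresses: connect the cable LAN-port to LAN-port so both apartments share one Ethernet segment, keep everything in 192.168.1.0/24, put gateway A at 192.168.1.1 and gateway B at 192.168.1.2, disable the DHCP server on B, and statically point each side's PCs at their own router. PCs then reach each other directly over the cable, each side keeps its own internet connection, and pulling the cable only breaks cross-apartment traffic. On a network-B PC (run as administrator):

        :: placeholder addresses; last argument is the gateway metric
        netsh interface ip set address "Local Area Connection" static 192.168.1.150 255.255.255.0 192.168.1.2 1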


  • Deny directory browsing in a ProFTPD / Ubuntu installation

    - by skylarking
    I used this guide to set up a ProFTPD installation on an Ubuntu 8.04 server. It works well, but the generic user (userftp) can run ls and is able to change to any directory and browse freely on the server, from the root / upwards. I added the line /bin/false to /etc/shells in the hope that it would prevent this, but it doesn't. I really only want the userftp account to be able to upload to the generic /home/FTP-Shared directory and to do nothing else on the server. How is this accomplished? This is a headless Ubuntu box, and I am using the CLI only, no GUI admin tools.
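    A sketch of the usual fix: the shell entry only affects login checks, while directory browsing is controlled by ProFTPD's DefaultRoot directive, which chroots matching users into a directory. Assuming userftp's primary group is also named userftp, in /etc/proftpd/proftpd.conf:

        # jail members of group userftp into the shared upload directory
        DefaultRoot /home/FTP-Shared userftp

    Then restart ProFTPD (sudo /etc/init.d/proftpd restart on 8.04).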


  • jQuery not executed on page load

    - by Arild Sandberg
    I'm building an AJAX upload with an editing function (rotate, zoom and crop), using Guillotine by matiasgagliano (https://github.com/matiasgagliano/guillotine). My problem is that after the upload the user gets redirected to the editing page through AJAX, but on landing there I always have to refresh the page in the browser for the image to load. I've tried auto-reloading, both through JS and PHP, but that doesn't help; neither does adding a button that loads the same URL again. Only the browser's refresh button works (tested in several browsers). I've tried implementing jquery.turbolinks, but that stopped the Guillotine functions from working. I'm loading guillotine.js in the head section after jQuery, and have the function at the bottom, before the closing body tag. Any tip or help would be appreciated. Thanks. Here is some of the code:

    HTML:

        <div class='frame'>
          <img id="id_picture" src="identifications/<?php echo $id_url; ?>" alt="id" />
        </div>
        <div id='controls'>
          <a href='javascript:void(0)' id='rotate_left' title='<?php echo $word_row[434]; ?>'><i class='fa fa-rotate-left'></i></a>
          <a href='javascript:void(0)' id='zoom_out' title='<?php echo $word_row[436]; ?>'><i class='fa fa-search-minus'></i></a>
          <a href='javascript:void(0)' id='fit' title='<?php echo $word_row[438]; ?>'><i class='fa fa-arrows-alt'></i></a>
          <a href='javascript:void(0)' id='zoom_in' title='<?php echo $word_row[437]; ?>'><i class='fa fa-search-plus'></i></a>
          <a href='javascript:void(0)' id='rotate_right' title='<?php echo $word_row[435]; ?>'><i class='fa fa-rotate-right'></i></a>
        </div>

    JS:

        <script type='text/javascript'>
        jQuery(function() {
          var picture = $('#id_picture');
          picture.guillotine({ width: 240, height: 180 });

          picture.on('load', function(){
            // Initialize plugin (with custom event)
            picture.guillotine({eventOnChange: 'guillotinechange'});

            // Display initial data
            var data = picture.guillotine('getData');
            for(var key in data) { $('#'+key).html(data[key]); }

            // Bind button actions
            $('#rotate_left').click(function(){ picture.guillotine('rotateLeft'); });
            $('#rotate_right').click(function(){ picture.guillotine('rotateRight'); });
            $('#fit').click(function(){ picture.guillotine('fit'); });
            $('#zoom_in').click(function(){ picture.guillotine('zoomIn'); });
            $('#zoom_out').click(function(){ picture.guillotine('zoomOut'); });
            $('#process').click(function(){
              $.ajax({
                type: "POST",
                url: "scripts/process_id.php?id=<?php echo $emp_id; ?>&user=<?php echo $user; ?>",
                data: data,
                cache: false,
                success: function(html) {
                  window.location = "<?php echo $finish_url; ?>";
                }
              });
            });

            // Update data on change
            picture.on('guillotinechange', function(ev, data, action) {
              data.scale = parseFloat(data.scale.toFixed(4));
              for(var k in data) { $('#'+k).html(data[k]); }
            });
          });
        });
        </script>
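    A hedged guess at the cause, since the symptom (works only after a manual browser refresh) matches a classic race: the image's 'load' event never fires if the browser already has the image by the time the handler is attached, so everything registered inside picture.on('load', ...) only runs when the timing happens to differ. A common guard, sketched against the code above:

        var picture = $('#id_picture');
        function initGuillotine() {
          picture.guillotine({ width: 240, height: 180, eventOnChange: 'guillotinechange' });
          // ... bind the buttons here, as in the original code ...
        }
        if (picture.prop('complete')) {
          initGuillotine();                 // image already loaded (e.g. from cache)
        } else {
          picture.on('load', initGuillotine);
        }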


  • Need solutions for sharing a 3Mb/768Kbps DSL line with 60+ users, and for faster bandwidth

    - by elistp
    Two parts. Part 1: We currently have two DSL lines with 3Mb/768Kbps speeds, load-balanced, for 60+ users. Internet access is borderline unusable. The simple solution would be a faster DSL line, but the highest DSL package is 6Mb/768Kbps, comes with quite a price jump, and does nothing to help with upload speeds. I'm looking for free or extremely low-cost solutions (web cache, traffic shaping, bandwidth controls, etc.) to make Internet access more bearable until the next funding year. Can anyone give any advice? Part 2: We're looking into a 4.5Mb bonded T1 in the next funding year, which is of course significantly more expensive than two DSL lines. Are bonded T1s our only hope for faster speeds, or are there better alternatives?
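    On the free end, a caching proxy with per-client rate caps usually gives the biggest relief on a link this size. A minimal Squid delay-pools sketch; the numbers are illustrative assumptions and would need tuning:

        # squid.conf -- class 2 pool: one aggregate bucket plus a per-IP bucket
        delay_pools 1
        delay_class 1 2
        # aggregate restore/max, then per-host restore/max (bytes/sec, bytes)
        delay_parameters 1 375000/375000 8000/16000
        delay_access 1 allow all

    The aggregate figure here assumes one 3 Mb line (~375 KB/s); the per-host bucket keeps a single heavy downloader from starving the other 59 users.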


  • What is a good solution for an intranet video portal (YouTube-like) site?

    - by Ken Pespisa
    I would like an easy-to-set-up site for handling videos to be viewed internally by my company. YouTube is essentially the perfect solution except that it's public. I'm looking for something where a few people can upload videos, and the system returns a page where each video can be watched in a browser. I figure this would involve a dedicated web server to run the web application and process the videos. I've searched and I don't think such a system exists, but perhaps there's one out there in its infancy that doesn't rank high on Google yet. Essentially the site I'm looking for is what MediaWiki is to wikis, or what StackExchange is to Q&A sites, but for videos. Thanks in advance!


  • Using .NET to delete files that were migrated from Win2k3

    - by Andrew Duncan
    We recently migrated an ASP.NET website from Windows 2003 to Windows 2008 R2 by zipping up all the files and extracting them to the new site. Since the migration, the web application is still able to upload and delete files that are new; however, it's unable to delete the files that were copied over from the original Win2k3 app. We're guessing it's a permissions problem, because the error is: Access to the path 'E:.......PATH.....' is denied. We've been trying to match the permissions of a newly uploaded file to those of the migrated files. Newly uploaded files get the app pool user in their permissions and as owner; the original files don't have this. Any help would be fantastic. Thanks,
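    A sketch of the usual repair, with placeholder paths and a placeholder pool name: grant the application pool identity rights over the migrated tree and push them down to the existing files, taking ownership first if the old ACLs block even an administrator:

        takeown /F "E:\Sites\MyApp" /R /D Y
        icacls "E:\Sites\MyApp" /grant "IIS APPPOOL\MyAppPool":(OI)(CI)M /T

    (OI)(CI)M grants Modify, inherited by files and subfolders; /T applies the grant recursively to everything already there.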


  • How can I get a list of created and deleted files on the server?

    - by max
    I have an image-sharing website; users log in and upload images. Last night I lost about 30 newly, consecutively uploaded images. I mean they had been uploaded, apparently: they are in the database, but the actual image files on the server are gone. The error log doesn't show anything, so I thought my best option is to check a list of created and deleted files, if there is one. Is there a log file for created and deleted files on the server? I'm using DirectAdmin.
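    Stock Linux doesn't log file creation or deletion by default, so there is probably nothing to find for last night. Going forward, a watch can be set up; a sketch using inotify-tools (the usual package name), with the watched path as a placeholder:

        # path is a placeholder; %T needs the --timefmt option
        inotifywait -m -r -e create,delete --timefmt '%F %T' \
            --format '%T %e %w%f' /home/user/domains/example.com/public_html/images \
            >> /var/log/image-activity.log

    The kernel audit system (auditctl -w <dir> -p wa) is an alternative that survives reboots once wired into the audit rules file.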


  • Common filesystem for servers behind a rackspace load balancer

    - by thanos panousis
    Our PHP application consists of a single web server that receives files from clients and performs a CPU-intensive analysis on them. Right now, analysing a single user upload can take 3 seconds at 100% CPU, which caps our capacity at about one request every 3 seconds. My team's requirement is to increase capacity without a lot of code re-engineering. A possible solution would be to set up a load balancer in front of multiple servers running the same app, all connecting to a common DB. The problem is that the analysis writes its output files to disk; a load balancer would increase capacity, but the files wouldn't be available across servers, so subsequent client requests might fail. We are hosted on Rackspace. Is there a way to configure some sort of "common" storage for all servers, without having to rewrite our file-persistence code? The current code relies on simple fopen calls. What are our options?
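    A hedged sketch of the path of least resistance: object stores like Rackspace's Cloud Files would mean changing the fopen-based code, whereas a shared network filesystem would not. Mounting one export at the same path on every web node keeps the code untouched (server name and paths are placeholders):

        # Debian/Ubuntu package name; adjust for other distros
        sudo apt-get install nfs-common
        sudo mount -t nfs fileserver.internal:/srv/uploads /var/www/app/uploads

    The trade-off is that the NFS host becomes a single point of failure, and the mount needs an /etc/fstab entry to survive reboots.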


  • Is it possible to use BitTorrent for a file server?

    - by sris
    I would like to set up a file server that is searchable, preferably via the web, and I'm wondering whether this could be achieved using the BitTorrent protocol, with a single client sharing every torrent on the server. I guess I could use an available tracker solution for the web interface, or write one myself. My concern is whether there is any limit to the number of torrents a single client can share, since this may eventually be 10k torrents. The number of downloading clients is very small, only myself and my relatives. The idea is to have a single place to host everything from vacation photos to musical creations. Are there other options for this kind of file server? It should also be easy to upload files to the server.


  • I cannot connect to the database from Drupal

    - by Patrick
    Hi, I've uploaded my Drupal website (and its database) to my new server. The database info is: host: localhost, user: user, pass: pass, database name: database_name. I've set the following line in the settings.php file:

        $db_url = 'mysqli://user:password@localhost/database_name';

    but what I get is this: "If you are the maintainer of this site, please check your database settings in the settings.php file and ensure that your hosting provider's database server is running. For more help, see the handbook, or contact your hosting provider." I guess the database is running; it always runs and I can access it with phpMyAdmin, so I don't think the problem is there. The upload of the database and website files was also successful, so I don't know what to do to fix this issue. It is MySQL on an IIS server. Thanks
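    Two quick checks, sketched with the placeholders from the question: the password is documented as pass, but the connection string uses password, and Drupal fails on exactly that kind of mismatch. Testing the credentials from the server itself separates a bad settings.php from a MySQL problem:

        # placeholders from the question; -p takes the password with no space
        mysql -h localhost -u user -ppass database_name -e "SELECT 1;"

    If that works, the corrected Drupal 6 line would be $db_url = 'mysqli://user:pass@localhost/database_name'; with each component URL-encoded if it contains special characters.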


  • Drupal 7 on Windows - File Module Problems

    - by TimothyP
    I installed Drupal 7 using the Web Platform Installer on Windows 2008. For some reason the file module, when you upload a file, uses the first few letters of the filename as the unique key to store in the database, which of course causes collisions very quickly. I'm wondering, does anybody have a workaround for this?

        An AJAX HTTP request terminated abnormally. Debugging information follows.
        Path: /file/ajax/field_file/und/0/form-EBMatHzV5cZXcWvXJtdADSdyw7Id9-GIpFM_NCJg_a4
        StatusText: n/a
        ResponseText: Error message
        PDOException: SQLSTATE[23000]: [Microsoft][SQL Server Native Client 10.0][SQL Server]
        Cannot insert duplicate key row in object 'dbo.file_managed' with unique index 'uri_unique'.
        in drupal_write_record() (line 6776 of ..........\includes\common.inc).
        Error: The website encountered an unexpected error. Please try again later.
        ReadyState: undefined

    (PS: I hope Super User is the right place to ask.)


  • How can I set up an FTP user with a home directory inside another user's home folder?

    - by simon180
    Hi, I have an Ubuntu (Hardy) server which I am using to host multiple websites. All of the sites are stored in subfolders of the public_html folder of my main login on the server and accessed via a single SSH account. I now have a website user who wants FTP (or similar) access so they can upload files to the directory where their website lives, but I still need my SSH account to have access to this directory, as I may need to make changes using my master account. Basically, I want to create an FTP account (I have vsftpd installed) for a user whose home directory sits inside my own user account's home, and they should only be able to read/write in this folder and its subfolders, not move further up the directory tree. How can I achieve this? Thanks
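    A sketch under stated assumptions (vsftpd on Hardy; user names and paths are placeholders): create the user with its home inside the site directory and no login shell, jail all local FTP users into their homes, and share group ownership so the master SSH account keeps write access:

        sudo useradd -d /home/simon/public_html/clientsite -s /usr/sbin/nologin -m clientftp
        sudo passwd clientftp
        sudo usermod -a -G clientftp simon           # master account joins the group
        sudo chmod -R g+rwX /home/simon/public_html/clientsite

        # /etc/vsftpd.conf
        local_enable=YES
        write_enable=YES
        chroot_local_user=YES

    Then restart vsftpd. One caveat worth checking: /usr/sbin/nologin may need adding to /etc/shells for vsftpd's PAM shell check to pass.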


  • How do I know if 'hg clone' is doing the work remotely?

    - by jjfine
    I've got a very simple Windows install of Mercurial on my machine. The 'central' repository is located at //mymachine/hg-repos/central. I want remote (VPN) users to be able to create clones of this repository inside the hg-repos directory, because that directory gets daily backups; I have given these users full control of it. My question is this: if I'm on a remote machine and I run the command

        hg clone //mymachine/hg-repos/central //mymachine/hg-repos/central-copy

    is the remote machine doing most of the work? I don't want the client to have to download all of the central repository and then upload it all back, because people are going to be using this from across the country. But I suspect that this is what's happening here, since it works so easily.
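    The suspicion is right: over a UNC path there is no Mercurial process running on the file server, so the client's hg reads the whole source repository across the network and writes the copy back the same way. Pushing the work to the hosting machine means running a server there; a minimal sketch with illustrative port and paths:

        :: on mymachine
        hg serve -R C:\hg-repos\central -p 8000

        :: on the remote client (clones over HTTP; the server side does the reading)
        hg clone http://mymachine:8000/ central-copy

    For a copy that must land back inside hg-repos itself, running the clone locally on mymachine (via remote desktop or a scheduled job) avoids the round trip entirely.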


  • Internet printing: print to iOS gateway -> AirPrint / something -> laser printer

    - by user75129
    I've done a bit of research on the Internet, and it looks like I'm at a dead end. My goal is to minimise cost and be able to send and print documents automatically (maybe 10 or 20 pages per day) on a laser printer in a remote office. The preliminary plan is to use iOS 5.1.1 (jailbroken) with a 3G connection, an HP (or other brand) printer with AirPrint, iCloud's Documents, and maybe some launchd scripts to monitor for new documents in iCloud, possibly with other software; I am not sure yet. By using the cloud, I can upload new docs from anywhere in my city, and the iOS device should see them within a reasonable amount of time and print them. But it seems this combo is not workable. Does anyone have advice on how to make this setup work, or alternatives that require NO PC or Mac? I currently have a spare 3GS with 3G connectivity, though I'd need to buy a new printer.


  • Why is my NSURLConnection so slow?

    - by Bama91
    I have set up an NSURLConnection following the guidelines in "Using NSURLConnection" from the Mac OS X Reference Library, creating an NSMutableURLRequest as a POST to a PHP script with a custom body, to upload 20 MB of data (see code below). Note that the post body is what it is (line breaks and all) to exactly match an existing desktop implementation. When I run this in the iPhone Simulator, the post is successful but takes an order of magnitude longer than the equivalent code run locally on my Mac in C++: 20 minutes vs. 20 seconds. Any ideas why the difference is so dramatic? I understand that the Simulator will give different results than the actual device, but I would expect at least similar results. Thanks

        const NSUInteger kDataSizePOST = 20971520;
        const NSString* kServerBDC = @"WWW.SOMEURL.COM";
        const NSString* kUploadURL = @"http://WWW.SOMEURL.COM/php/test/upload.php";
        const NSString* kUploadFilename = @"test.data";
        const NSString* kUsername = @"testuser";
        const NSString* kHostname = @"testhost";

        srandom(time(NULL));
        NSMutableData *uniqueData = [[NSMutableData alloc] initWithCapacity:kDataSizePOST];
        for (unsigned int i = 0; i < kDataSizePOST; ++i) {
            u_int32_t randomByte = ((random() % 95) + 32);
            [uniqueData appendBytes:(void*)&randomByte length:1];
        }

        NSMutableURLRequest *request = [[NSMutableURLRequest alloc] init];
        [request setURL:[NSURL URLWithString:kUploadURL]];
        [request setHTTPMethod:@"POST"];

        NSString *boundary = @"aBcDd";
        NSString *contentType = [NSString stringWithFormat:@"multipart/form-data; boundary=%@", boundary];
        [request addValue:contentType forHTTPHeaderField:@"Content-Type"];

        NSMutableData *postbody = [NSMutableData data];
        [postbody appendData:[[NSString stringWithFormat:@"--%@\nContent-Size:%d", boundary, [uniqueData length]] dataUsingEncoding:NSUTF8StringEncoding]];
        [postbody appendData:[[NSString stringWithFormat:@"\nContent-Disposition: form-data; name=test; filename=%@", kUploadFilename] dataUsingEncoding:NSUTF8StringEncoding]];
        [postbody appendData:[[NSString stringWithString:@";\nContent-Type: multipart/mixed;\n\r\n"] dataUsingEncoding:NSUTF8StringEncoding]];
        [postbody appendData:[NSData dataWithData:uniqueData]];
        [postbody appendData:[[NSString stringWithFormat:@"--%@\nContent-Size:%d", boundary, [kUsername length]] dataUsingEncoding:NSUTF8StringEncoding]];
        [postbody appendData:[[NSString stringWithFormat:@"\nContent-Disposition: inline; name=Username;\n\r\n%@", kUsername] dataUsingEncoding:NSUTF8StringEncoding]];
        [postbody appendData:[[NSString stringWithFormat:@"--%@\nContent-Size:%d", boundary, [kHostname length]] dataUsingEncoding:NSUTF8StringEncoding]];
        [postbody appendData:[[NSString stringWithFormat:@"\nContent-Disposition: inline; name=Hostname;\n\r\n%@", kHostname] dataUsingEncoding:NSUTF8StringEncoding]];
        [postbody appendData:[[NSString stringWithFormat:@"\n--%@--", boundary] dataUsingEncoding:NSUTF8StringEncoding]];
        [request setHTTPBody:postbody];

        NSURLConnection *theConnection = [[NSURLConnection alloc] initWithRequest:request delegate:self];
        if (theConnection) {
            _receivedData = [[NSMutableData data] retain];
        } else {
            // Inform the user that the connection failed.
        }
        [request release];
        [uniqueData release];
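    A hedged observation, not a confirmed diagnosis: one measurable cost in the snippet is building the 20 MB body one byte at a time, roughly twenty million appendBytes: message sends before the upload even starts. Filling a preallocated buffer in place removes that overhead and helps isolate whether the remaining time is genuinely network-side:

        // fill the buffer directly instead of appending byte by byte
        NSMutableData *uniqueData = [[NSMutableData alloc] initWithLength:kDataSizePOST];
        uint8_t *bytes = [uniqueData mutableBytes];
        for (NSUInteger i = 0; i < kDataSizePOST; ++i) {
            bytes[i] = (uint8_t)((random() % 95) + 32);
        }

    Timing this section separately (e.g. with CFAbsoluteTimeGetCurrent before and after) would show whether body construction or the connection itself accounts for the 20 minutes.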


  • Creating user accounts in Amazon EC2

    - by Tvanover
    I am putting together a test environment on Amazon EC2 for me and some friends to collaborate on a project. I am not a server guy, but I do know my way around a bash prompt and have done some work on Ubuntu before. I am using the Amazon Linux AMI (i386, EBS) and have got Apache and PHP running. Now I need to create the user accounts my friends and I will use to upload files (SFTP) and work on the project (SSH). How should I go about this? Should I just use adduser and configure things as normal, or should I use AWS IAM groups?
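    One point that usually settles the question: IAM users govern access to the AWS APIs and console, not logins on the instance itself, so SSH/SFTP accounts are plain system users. A sketch with a placeholder username and key file:

        # "alice" and alice_key.pub are placeholders
        sudo adduser alice
        sudo mkdir -p /home/alice/.ssh
        sudo cp alice_key.pub /home/alice/.ssh/authorized_keys
        sudo chown -R alice:alice /home/alice/.ssh
        sudo chmod 700 /home/alice/.ssh
        sudo chmod 600 /home/alice/.ssh/authorized_keys

    SFTP then works over the same SSH credentials with no extra configuration.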


  • Odd FTP ASCII/BINARY transfer behavior after migrating to a new server

    - by Incognita Mundi
    I recently got a dedicated server, and after migrating to the new machine I started noticing problems with file transfers. My FTP client, despite being set to auto, keeps uploading PHP files in binary mode, and the content of those files gets messed up. Since I mostly upload files of different kinds, it would be annoying to switch between binary and ASCII every single time; besides, I never had such problems before. What could be the cause of this behaviour? My dedicated server runs CentOS 6.4 and the FTP server is Pure-FTPd. I tried different FTP clients and they all have the same problem, so I assume it is some misconfiguration on the server side. Thanks


  • What kind of proxy ACL rules should be applied?

    - by user42891
    I'm trying to block sites in Squid based on this article. Assuming you want to block access to Yahoo (e.g. http://www.yahoo.co.jp, http://www.yahoo.com, http://www.yahoo.co.in), you would ideally want to block all of the above URLs; if I use a regular expression and search for something called yahoo, it seems to get blocked. We are just interested in applying the rules most commonly used across companies: social networking sites (e.g. Facebook, Orkut), porn sites (e.g. sex), gaming sites (games), movie and song download sites, and sites where users can upload data (e.g. RapidShare). What would be a common set of effective rules for achieving this?
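    A minimal sketch of the two usual building blocks in squid.conf; the lists are illustrative, not a vetted policy. dstdomain matches whole domains (the leading dot covers subdomains) and is cheaper and less error-prone than url_regex, which is better reserved for keyword matching:

        # illustrative lists only
        acl blocked_domains dstdomain .facebook.com .orkut.com .rapidshare.com
        acl blocked_words url_regex -i sex games
        http_access deny blocked_domains
        http_access deny blocked_words

    Order matters: these deny lines must come before the rule that allows the local network, and bare keywords like games will also block innocent URLs that merely contain the string.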

