Search Results

Search found 12625 results on 505 pages for 'ajax upload'.


  • web.config file changes guide

    - by Student
    Hi experts, how are you all? I am a student learning ASP.NET/C# with Visual Studio 2010 and SQL Server 2005. I have developed a database-driven website through self-study, with help from the internet. The website is complete and works perfectly on my computer, and I already have a hosting server and a registered domain name. The problem is that when I upload the website to the host, it doesn't work; the following error is displayed:

      Server Error in '/' Application. Configuration Error
      Description: An error occurred during the processing of a configuration file required to service this request. Please review the specific error details below and modify your configuration file appropriately.
      Parser Error Message: Unrecognized attribute 'targetFramework'. Note that attribute names are case-sensitive.
      Source Error:
        Line 11: <system.web>
        Line 12:   <customErrors mode="Off" />
        Line 13:   <compilation debug="false" targetFramework="4.0"/>
        Line 14: </system.web>
        Line 15: </configuration>
      Source File: C:\Inetpub\vhosts\urdureport.com\httpdocs\web.config  Line: 13
      Version Information: Microsoft .NET Framework Version: 2.0.50727.5472; ASP.NET Version: 2.0.50727.5474

    I don't know what I should do to get the site working on the hosting server. Please help me with this. Thank you in advance.
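
    The version information at the bottom of the error suggests the hosting server is running the site under ASP.NET 2.0, while the project targets .NET 4.0; the targetFramework attribute only exists from .NET 4.0 onward, which is exactly what the parser is complaining about. A minimal sketch of the two usual ways out, assuming either the site can run on 2.0/3.5 or the host offers a 4.0 application pool:

      <!-- Option A: if the host's pool stays on .NET 2.0/3.5, drop the
           4.0-only attribute (hypothetical web.config fragment) -->
      <system.web>
        <customErrors mode="Off" />
        <compilation debug="false" />
      </system.web>

    Option B is to leave web.config alone and switch the site's ASP.NET version / application pool to 4.0 in the hosting control panel, if that option is offered.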

    Read the article

  • How to download a Vim script on the command line?

    - by HaiYuan Zhang
    Whenever I want to install a new Vim script on the Linux server I'm working on, my typical workflow is the following: browse the plugin's page on Vim online using FireXXXX, download the right version of the plugin to my laptop by clicking the highlighted link, then upload the downloaded plugin from my laptop to the Linux server using WinSCP. This is really inconvenient. I don't understand the magic behind it: the same hyperlink that downloads the file when I click it in the web browser ends up with nothing but an error when I feed it to Wget on the Linux command line. Otherwise I could just grab the link in the browser and then use Wget or some similar tool to actually do the downloading. I try new Vim scripts quite often, so you can imagine my dismay at having to repeat this tedious routine every time. What are some tips that would let me download Vim scripts in a more "professional" way? Post edit: My problem is not finding a tool like Wget or cURL. The problem I have is quite specific: using those tools to download a Vim script. Take http://www.vim.org/scripts/script.php?script_id=30 as an example. It's the normal place to get the script, at least for me, but I can't find a working URL on that page that I can feed to Wget.
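
    A minimal sketch of scripting this, assuming the download links on a vim.org script page still have the form download_script.php?src_id=NNN relative to /scripts/ (that src_id, not the script_id, is what a downloader needs):

      import re
      import urllib.request

      script_page = "http://www.vim.org/scripts/script.php?script_id=30"

      # Fetch the script's page and pick out its first download link.
      html = urllib.request.urlopen(script_page).read().decode("utf-8", "replace")
      match = re.search(r'href="(download_script\.php\?src_id=\d+)"', html)
      if match is None:
          raise SystemExit("no download link found on the page")

      download_url = "http://www.vim.org/scripts/" + match.group(1)
      urllib.request.urlretrieve(download_url, "plugin_download")
      print("fetched", download_url)

    The same resolved URL can of course be fed straight to Wget or cURL instead.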

    Read the article

  • Java: very slow tomcat and too big war file

    - by NaN
    I created a sort of RESTful API backend for a mobile app. It's written completely in Java using Jersey as the framework. At the moment no database is used; everything lives in memory, but that is no problem so far (it's only for prototyping purposes). I ordered the smallest package from Digital Ocean and installed Tomcat 7. All in all Tomcat works, but I have three major problems: 1) It takes a long time for Tomcat to deploy the app: I deploy it via the Tomcat manager and it takes about 2 minutes until the site works (excluding WAR upload time). 2) The WAR files are quite big (16 MB): I don't know why they are so big. There are no database dependencies and most of the logic is plain Java. Okay, we are using Jersey, but 16 MB is a lot for the logic of a small web service. 3) I have to restart Tomcat every 3 days or so. It looks like a memory leak or something similar: if the app runs for a few days the response time gets quite high and the server seems to be frozen. It works again if I restart Tomcat over SSH. You can find my mvn pom file right here. Do you have some tips? Are there good Tomcat alternatives?

    Read the article

  • How can one use online backup with large amounts of static data?

    - by Billy ONeal
    I'd like to set up an offsite backup solution for about 500 GB of data that's currently spread across my various machines. I don't care about data retention rates, as this is only a backup of my data, not its primary storage; if the backup is stored on crappy non-redundant systems, that does not matter. The data set is almost entirely static and mostly consists of things like installers for Visual Studio and installer disk images for all of my games. I have found two services which meet most of this: Mozy and Carbonite. However, both services impose low bandwidth caps, on the order of 50 kb/s, which prevent me from backing up a data set of this size effectively (it would take somewhere on the order of 6 weeks), despite the fact that I get multiple MB/s upload speeds everywhere else from this location. Carbonite has the additional problem that it tries to ignore pretty much every file in my backup set by default, because the files are mostly ISO and VMDK files, which aren't backed up by default. There are other services such as EC2 which don't have such bandwidth caps, but they typically store data on highly redundant servers and therefore cost on the order of 10 cents/GB/month, which is insanely expensive for storage of this kind of data set. (At $50/month I could build my own NAS to hold the data, which would pay for itself after 2-3 months.) (To be fair, they're offering quite a bit more service than I'm looking for at that price, such as public HTTP access to the data.) Does anything exist meeting those requirements, or am I basically hosed?

    Read the article

  • What are the IR codes the new Apple Remote (alu) uses?

    - by index
    I would like to clone the new Apple Remote (infrared, second generation, aluminium), just for fun, with a microcontroller. Most codes of the previous model can be found in the LIRC remote control database (all except the key combinations menu + <<, menu + play, which unpair, change the ID of, and pair the remote). I also don't know which bit encodes the battery status. It uses a modified 32-bit NEC protocol (reverse the LIRC codes bytewise). But the new Apple Remote uses two additional codes for the play and the new select button. I don't have a Mac, so I can't brute-force test codes either ;-) So if someone possesses such a remote and the ability to record those two new buttons and the three combinations, I'd really appreciate it. If you can't run LIRC (or it gets confused by the new codes) and you don't have an oscilloscope or logic analyser, maybe you could hook up a photodiode to your sound input and record the codes with Audacity? Just hit record, hit each button and combo a few times, hit stop, upload the uncompressed WAV file to a sharing site, done. That'd be great!
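
    As an aside on the "reverse the LIRC codes bytewise" remark, a tiny sketch of what that conversion means: the 32-bit value from a LIRC config read in the opposite byte order gives the NEC-style code (the example value below is purely hypothetical, not a real Apple Remote code):

      def lirc_to_nec(code: int) -> int:
          """Reverse the byte order of a 32-bit LIRC code."""
          return int.from_bytes(code.to_bytes(4, "big")[::-1], "big")

      print(hex(lirc_to_nec(0x87EE2A0B)))  # -> 0xb2aee87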

    Read the article

  • Lighttpd mod_accesslog not logging fastcgi requests

    - by zepatou
    I recently installed lighttpd to serve a Python script via mod_fastcgi. Everything works fine except that the requests handled by mod_fastcgi are not logged in the access.log file (requests on port 80 are logged, though). My lighttpd version is 1.4.28 on Debian 6.0. I used the same configuration on an Ubuntu 10.04 server with lighttpd 1.4.26 and it worked. Here is my config:

      lighttpd.conf:
        server.modules = ( "mod_access", "mod_alias", "mod_accesslog", "mod_compress", )
        server.document-root = "/var/www/"
        server.upload-dirs = ( "/var/cache/lighttpd/uploads" )
        server.errorlog = "/home/log/lighttpd/error.log"
        index-file.names = ( "index.php", "index.html", "index.htm", "default.htm", "index.lighttpd.html" )
        accesslog.filename = "/home/log/lighttpd/access.log"
        url.access-deny = ( "~", ".inc" )
        static-file.exclude-extensions = ( ".php", ".pl", ".fcgi" )
        server.pid-file = "/var/run/lighttpd.pid"
        include_shell "/usr/share/lighttpd/create-mime.assign.pl"
        include_shell "/usr/share/lighttpd/include-conf-enabled.pl"

      conf-enabled/10-fastcgi.conf:
        server.modules += ( "mod_fastcgi" )
        fastcgi.server = (
          "/" => ( (
            "min-procs" => 1,
            "check-local" => "disable",
            "host" => "127.0.0.1",  # local
            "port" => 3000
          ), )
        )

    Any idea?

    Read the article

  • Super slow opening my downloads folder

    - by Mark
    I have an exe file in my Downloads folder that I half downloaded through uTorrent (it's not piracy; it's a legitimate file from people who use BitTorrent to distribute large files). I think I tried to open it while it was still seeding, that is, before stopping the upload. That actually froze my computer. When I restarted uTorrent, I set the file to be deleted. Unfortunately, even though uTorrent no longer sees the file, it is still visible in my Downloads folder. Whenever I try to open the Downloads folder it literally takes 10 minutes or more: it opens, but is empty, and the blue progress bar takes a long time to complete. After it completes I can use the folder normally, but opening and closing things in that folder takes a long time. I can see the exe that I tried to download. I tried to delete it, but it was taking so long (30+ minutes) that I eventually hit cancel. That didn't even work and was slowing down the computer; I couldn't figure out how to stop the delete, so I just pulled the plug. Should I just forget about that Downloads folder and set up a new one? Is there something I can do? Thanks.

    Read the article

  • Good Enough Failover Strategy for DNS / MySQL / Email

    - by IMB
    I've asked and read a lot of questions about DNS failover, but the more I read the more complicated it becomes; some people say it's good enough, some say it isn't, and there are no clear answers. I was wondering if we can settle it once and for all, at least for the requirements of most websites out there. Let's assume the following: we don't really need load balancing, what we need is a failover solution; we are running a website based on LAMP on a VPS; and we need to make sure that the web server, MySQL and email are accessible always, or at least 99% of the time. Basically here is my idea, and my questions about it. Web server: we need at least one failover server (another VPS in a separate data center). Is DNS failover via round robin good enough, and if not, what's best, and how exactly do you implement it? How do you make sure the files you upload/delete on server A are also on server B? MySQL: I've only read a brief intro to MySQL replication, and I assume I can replicate server A to server B and vice versa on the fly, right? So in case server A fails and server B takes over, it will continue to work and replicate back to server A when it becomes available; in essence server B becomes the primary server, and will later fail over to server A should a failure happen again. Email: if we are going to use DNS failover, relying on webmail or on emails stored on the server is probably not a good idea, right? Since some emails might be on server A while others are on server B. I assume a basic email forwarder to a third party (like Gmail, for example) is good enough to ensure all emails are kept in one place. Here's a basic diagram for a better picture: http://i.stack.imgur.com/KWSIi.png
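
    For the MySQL part, a minimal sketch of the usual master-master (circular replication) setup the question describes, with names, credentials and values as placeholders rather than a complete or hardened configuration:

      # my.cnf on server A
      [mysqld]
      server-id                = 1
      log_bin                  = mysql-bin
      auto_increment_increment = 2   # avoid key collisions between the two masters
      auto_increment_offset    = 1

      # my.cnf on server B: identical except
      server-id                = 2
      auto_increment_offset    = 2

    Each server is then pointed at the other with CHANGE MASTER TO MASTER_HOST=..., MASTER_USER=..., MASTER_PASSWORD=... followed by START SLAVE, so writes on either side flow to the other. For the uploaded-files question, a periodic rsync (or a shared/replicated filesystem) between the two document roots is the usual low-tech counterpart.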

    Read the article

  • How to install/configure ffmpeg to compress mp4 videos for flash player delivery?

    - by Andrew Fulton
    We have a Flash web app that creates interactive video, and we are using ffmpeg to do some compression/resizing when a user "publishes" their project. The user can upload FLV files and MP4 files, both of which play fine in the Flash UI before publishing. After publishing, the FLV files work fine, but the MP4 files will not play in the Flash player: audio plays but video won't. The MP4 files play fine if I download them and open them in the QuickTime player, but if I open them in the Adobe Media Player it reports "The media file does not contain a supported video track". If I open the movie inspector in QuickTime, it tells me that the original file is "h264" video and the ffmpeg-processed ones are "mpeg-4". I have tried forcing h264 by adding flags like -f h264 and -vcodec h264, but I get a screenful of errors (no frame, illegal POC type, sps_id out of range) ending with "Could not find codec parameters (Video: h264)". h264 does show up if I run ffmpeg -formats and ffmpeg -codecs, and as I said the files play fine in QuickTime. Is there anything else I need to do to convince the Flash player to play them? Is there anything else I need to tell you about the server that would help?
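
    For what it's worth, in ffmpeg the name h264 refers to the decoder; encoding normally goes through libx264, so forcing -vcodec h264 asks ffmpeg to encode with something that cannot encode, which would explain the wall of errors. A hedged sketch of a re-encode that leaves the already-working audio untouched, assuming an ffmpeg build compiled with libx264 (file names are placeholders):

      ffmpeg -i input.mp4 -vcodec libx264 -acodec copy output_flash.mp4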

    Read the article

  • How to get Windows 7 to boot after upgrading to an SSHD on a Sony VAIO, using recovery discs?

    - by Boris Okun
    The original HDD in my Sony VAIO still works, but it has a damaged sector 0 and I was constantly prompted to replace the drive because of imminent failure. I created recovery discs as instructed and used a USB external HDD for a complete backup (including a Windows image backup). After installing the SSHD and using the recovery discs to restore Windows and boot, I get the Windows welcome screen, and right after that the following message: "Windows couldn't complete the installation. To install Windows on this computer, restart the installation." I have repeated the process many times in all kinds of different ways and I still get the same message. Also, when I try to change the partitioning, the other option offered, I get the message: "Windows Setup could not configure Windows to run on this computer's hardware." All troubleshooting for the hardware and CPU came out solid. I tried to load the image backup from the external drive, but the computer doesn't see it and can't load the driver. Does anyone have a clue, or has anyone encountered something similar?

    Read the article

  • Photoshop CS5 performance over network drive (cifs)

    - by grub
    Hello everyone. I installed a QNAP TS-410 NAS for a customer (a professional photographer) with three Hitachi Deskstar 7200 rpm 2 TB disks configured as RAID 5. The NAS and the workstations are connected over a gigabit network. He and his co-worker access the photos (about 1 TB of them) over a mapped network drive from their Windows machines (Windows XP 32-bit and Windows 7 Ultimate 32-bit). Both use Photoshop CS5 to edit the photos. The problem is that saving an edited photo takes a really long time, about 3 times as long as opening it. After some tests I can exclude the network, the NAS and the Windows machines as the source of the issue. I think the problem is the Photoshop software and its handling of network drives; officially, network drives are not supported by Adobe. I don't have any experience with Adobe products, especially Photoshop CS5. What do you recommend to solve the performance issue? Should my customer copy the photos to the local drive, edit them and upload them again to the network drive, or is Adobe Drive or Adobe Version Cue the answer? One requirement is that the photos need to be accessible/editable from both computers even when one of them is offline. Adobe Version Cue needs a dedicated service running to be usable, so that solution is not possible as far as I understand the Cue software. Thank you for your input, and have a nice day :-) Greetings, grub

    Read the article

  • How to make an x.509 certificate from a PEM one?

    - by Ken
    I'm trying to test a script locally which involves uploading a file to a FileZilla FTPES server using a Java-based program. For the real thing there is a real certificate on the FZ server, and the upload step (tested alone) seems to work fine. I've installed FileZilla Server on my dev box (so the test uploads from localhost to localhost). I don't have a real certificate for it, of course, so I used the "Generate new certificate..." button in FZ. It works fine from an interactive FTPES program (as long as I OK the unknown cert), but from my Java program it throws a javax.net.ssl.SSLHandshakeException ("unable to find valid certification path to requested target"). So how do I tell Java that this certificate is OK with me? (I know there's a way to change the Java program to accept any certificate, but I don't want to go down that route; I want to test it just as it will happen in production, and I don't want to ignore unknown certificates in production.) I found that Java has a program called "keytool" that seems to be for managing this sort of thing, but it complains that the certificate file FZ generated is not an "x.509" file. A posting on the FZ side said it was "PEM encoded". I have "openssl" here, which looks perfect for converting between certificate formats, but I think my understanding of certificate formats is wrong because I'm not seeing anything obvious. My knowledge of security certificates is a bit shaky, so if my title is stupidly wrong, please help by fixing that. :-)
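
    A minimal sketch of the conversion-and-import route, assuming the FileZilla-generated file really is a PEM certificate (file names, alias, keystore name and the client class are placeholders):

      # PEM -> DER (binary X.509), then import into a Java truststore
      openssl x509 -in filezilla.crt -inform pem -outform der -out filezilla.der
      keytool -importcert -alias filezilla-dev -file filezilla.der -keystore mytruststore.jks

      # run the client against that truststore
      java -Djavax.net.ssl.trustStore=mytruststore.jks \
           -Djavax.net.ssl.trustStorePassword=changeit MyFtpesClient

    Recent keytool versions can also import PEM directly, in which case the openssl step can be skipped.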

    Read the article

  • How to remove the background layer of a DjVu file

    - by Jon
    Hello, I've downloaded some files from the Internet Archive. They come in different file formats and most of the time I use PDF. However, sometimes the scans are saved in colour instead of b/w, which makes them difficult or impossible to read on a dedicated ebook reader. In those cases I download the DjVu files, since on the PC you can select which layer (colour, b/w, foreground, background) you would like to see; selecting the b/w layer gives excellent results. However, the ebook reader does not have this option. The question is: how can I remove/extract a layer from the DjVu file and save only that layer? So far I've tried the following two approaches: 1) Select b/w in the DjVu viewer on the PC and print to a PostScript file, followed by a ps2pdf conversion. This works, but generates a fairly large PDF file. Sure, I could upload it again to any2djvu, but that just seems like too much manual work for each file. 2) I tried the shared annotation feature and set (mode bw). This works on the PC as desired, but is ignored on the ebook reader, as the other layers are still present. Any help or suggestions would be greatly appreciated.
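
    One possible command-line route, assuming the DjVuLibre tools are installed: ddjvu has -mode and -format options that render just one layer of every page into a new file. The file names below are placeholders, and the exact option spelling should be checked against the ddjvu man page of your version:

      # render only the bitonal stencil (the b/w layer) of every page into a PDF
      ddjvu -format=pdf -mode=black input.djvu output_bw.pdf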

    Read the article

  • Converting Audio To Video Output and Attaching Text?

    - by ZeeMan
    I am currently working on a project, and before I get started I thought it'd be nice to check with the Stack Overflow community and see whether they can help me with this. The idea: I have about a thousand MP3 files that I need to convert into video files to be uploaded to YouTube for my work. Here is where it gets tricky: I also need to attach the text associated with each audio file to the video as an image; I was thinking .ppt. The problem: I can do this one audio file at a time, but it would take me a zillion years. lol!! The question: can I create some kind of program, using let's say XML or JavaScript or XHTML or some other programming language, to do mass content creation where all I have to do is feed it the information? Possibly a script? Or is it possible to create an example .ppt file and then hack it so that I can have it reproduce itself with different information? The note: thanks in advance for helping out!!! Regards, ZeeMan!!!
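
    Not .ppt-based, but one hedged sketch of the mass-production idea: a Python loop that pairs each MP3 with a same-named still image (for example a slide exported as PNG) and shells out to ffmpeg to build a YouTube-ready video. The directory layout, the one-image-per-track convention and the presence of ffmpeg on the machine are all assumptions:

      import subprocess
      from pathlib import Path

      audio_dir = Path("mp3s")     # one .mp3 per track (assumed layout)
      image_dir = Path("slides")   # matching .png with the same file stem
      out_dir = Path("videos")
      out_dir.mkdir(exist_ok=True)

      for mp3 in sorted(audio_dir.glob("*.mp3")):
          image = image_dir / (mp3.stem + ".png")
          output = out_dir / (mp3.stem + ".mp4")
          # -loop 1 keeps the still image on screen; -shortest ends the
          # video when the audio track ends.
          subprocess.run([
              "ffmpeg", "-y",
              "-loop", "1", "-i", str(image),
              "-i", str(mp3),
              "-shortest", str(output),
          ], check=True)
          print("wrote", output)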

    Read the article

  • SSLVerifyClient optional with location-based exceptions

    - by Ian Dunn
    I have a site that requires authentication in order to access certain directories, but not others. (The "directories" are really just rewrite rules that all pass through /index.php.) In order to authenticate, the user can either log in with a standard username/password or submit a client-side x509 certificate. So Apache's vhost conf looks something like this:

      SSLCACertificateFile /etc/pki/CA/certs/redacted-ca.crt
      SSLOptions +ExportCertData +StdEnvVars
      SSLVerifyClient none
      SSLVerifyDepth 1
      <LocationMatch "/(foo-one|foo-two|foo-three)">
          SSLVerifyClient optional
      </LocationMatch>

    That works fine, but then large file uploads fail because of the behavior documented in bug 12355. The workaround for that is to set SSLVerifyClient require (or optional) as the default, so now the conf looks like this:

      SSLCACertificateFile /etc/pki/CA/certs/redacted-ca.crt
      SSLOptions +ExportCertData +StdEnvVars
      SSLVerifyClient optional
      SSLVerifyDepth 1
      <LocationMatch "/(bar-one|bar-two|bar-three)">
          SSLVerifyClient none
      </LocationMatch>

    That fixes the upload problem, but the SSLVerifyClient none doesn't work for bar-one, bar-two, etc.: those directories are still prompted to present a certificate. Additionally, I also need the root URL to be accessible without the user being prompted for a certificate. I'm afraid that will cancel out the workaround, though.
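
    A possible alternative sketch, assuming the upload failures are the renegotiation-buffer problem that bug 12355 describes: keep SSLVerifyClient none as the global default (so the root and the bar-* locations never prompt for a certificate) and raise the renegotiation buffer only inside the locations that request client certificates (the size below is an arbitrary example):

      SSLVerifyClient none
      <LocationMatch "/(foo-one|foo-two|foo-three)">
          SSLVerifyClient optional
          # buffer the request body while the per-location renegotiation runs
          SSLRenegBufferSize 10486000
      </LocationMatch>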

    Read the article

  • Transferring files to a server IP and port

    - by Mason
    I need to transfer files from my local computer running Windows 7 to a server running Linux. I access the server with PuTTY over SSH at a specific IPv4 address and port number. I've attempted to use the pscp command from my local computer but was denied access by the server: "Fatal: Network error: Connection refused".

      c:\>pscp test.csv userid@**IPv4_Addres***:Port# /path/destination_file_name

    Either the server blocks all pscp attempts from unauthorized users (most likely including my laptop) or I used the command incorrectly. If you have experience using this command: where exactly will the file get transferred to? I'm assuming the destination path starts at my home directory on the server. Also, if you have any other alternative methods of transferring the files, let me know. Update 1: I have also tried using WinSCP, but I got permission denied for that as well; it looks like the server will not let me upload or save files. Solved: I had a complete lapse of memory and forgot about sudo (spent too much time with scripts the last 2 months), so I was able to change the permissions to allow external editing. Thanks for all the help, guys!
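
    As a side note on the command itself, pscp takes a non-standard SSH port with -P and the remote path after the colon, so the usual shape of the call (host, port and paths are placeholders) is:

      pscp -P 2222 test.csv userid@server.example.com:/home/userid/test.csv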

    Read the article

  • How to monitor bandwidth use of each device on wifi network

    - by GWLlosa
    I have in my home a standard Comcast cable internet connection. It goes from the wall to a cable modem, and from the modem to a late-series Linksys router, which provides wired and wireless networking. The vast majority of the users are wireless connections. For day-to-day tasks this connection is fully sufficient for all my needs. However, on regular occasions we have social gatherings that involve many people bringing laptops and other PCs and using the network and internet simultaneously, frequently for gaming. I have no administrative oversight over these machines; they have been known to be riddled with spyware and/or bloatware, or to be running torrents, legal or otherwise. The only reason I care is that on a regular basis one of the machines will flatline my internet bandwidth and consume all of it to upload/download/spam people/whatever. When this happens, the latency of the connections for gaming and the like becomes unacceptable, and everyone suffers. My question is: is there a system I can set up whereby I can easily monitor the various systems connected to my wireless connection and see how much bandwidth each one is using, and for what? That way, at a glance, I can spot the offending machine and kick it from the connection, without having to go from machine to machine checking each one's "bandwidth used" properties manually and dealing with the owner's indignant protests all the while. I understand this will likely involve third-party software and/or hardware; my issue is I don't even know where to begin.

    Read the article

  • Inheriting file ownership on Linux

    - by John Hunt
    We have an ongoing problem here at work. We have a lot of websites set up on shared hosts; our CMS writes many files to these sites and allows users of the sites to upload files. The problem is that when a user uploads a file on a site, the owner of that file becomes the web server user, which prevents us from changing permissions etc. via FTP. There are a few workarounds, but really what we need is a way to set a sticky owner, if that's possible, on new files and directories created on the server. E.g., rather than PHP writing the file as the apache user, it would take on the owner of the parent directory. I'm not sure if this is possible (I've never seen it done). Any ideas? We're obviously not going to get a login for apache on the server, and I doubt we could get into the apache group either. Perhaps we need a way of allowing apache to set at least the group of a file; that way we could set the group to our FTP user in PHP and set 664 and 775 for any files that are written? Cheers, John.
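
    There is no "sticky owner" on Linux, but group inheritance does exist: the setgid bit on a directory makes files and subdirectories created inside it inherit that directory's group rather than the writer's primary group. A minimal sketch combining that with the umask idea from the question (group name and paths are placeholders):

      chgrp ftpusers /path/to/uploads
      chmod g+s /path/to/uploads     # new files inherit the 'ftpusers' group

    With the CMS writing files under a umask of 0002 (664 files / 775 directories), the FTP user in that group can then manage what apache creates.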

    Read the article

  • Issues Deploying Functional WAR to Elastic Beanstalk with Tomcat7

    - by BFar
    I am currently deploying OpenTripPlanner (http://github.com/OpenPlans/OpenTripPlanner.git) to Elastic Beanstalk. I'm able to successfully build and deploy OpenTripPlanner with my own customized settings on an EC2 instance. I have set it up so that the appropriate WAR file can be placed in the Tomcat webapps folder, and when Tomcat starts up it will auto-deploy and even download OpenTripPlanner's graph.obj from an S3 bucket. All of that works just fine, except when I try to deploy to Elastic Beanstalk. When I upload to Elastic Beanstalk, the log shows that my WAR file is successfully unpacked and that graph.obj is successfully downloaded from my S3 bucket. The only difference is that then nothing happens and I can't load the site in my browser. The health is RED, and I can't figure out what is going on. I've looked into ports and DNS issues, but I can't determine what's wrong. Does anyone have any ideas? Why would a WAR that works on Tomcat 7 outside of Beanstalk fail to be accessible?

    Read the article

  • IIS FTP 7.5 Data Channel Problem (SSL)

    - by user59050
    Hey there, I wonder if anyone can point me in the right direction. I am setting up both an FTPS client and server; the server uses Microsoft's IIS FTP 7.5. The client side runs on Linux, and I am using M2Crypto for the OpenSSL wrapping (Python). I suspect the problem is on the server side (IIS 7.5) due to the following discovery: if I host using FileZilla with BOTH the control and data channel forced to be encrypted, it works 100% (100% file transmission); if I use IIS as the server, everything works up to the point where the data channel takes over, i.e. all data of the retrieved file is already received correctly in my basket, but the FTP server just won't send the final '226 Transfer complete.' on the command socket. Why? If I force the client or server to close the connection the file is 100% intact. If I use IIS 7.5 with forced encryption on the control channel only, everything works 100% as long as I don't force the data channel. Here are some screenshots (client view after killing the client) that demonstrate this: pics @ http://forums.iis.net/p/1172936/1960994.aspx#1960994 Summary: we can establish the connection, do directory listings and start the upload, and we see the file (0 bytes) created on the server, but then the client hangs. If we terminate the client, the uploaded file on the server suddenly jumps up to full size.

    Read the article

  • How To Monitor Home Wireless Network Connected Devices Bandwidth

    - by GWLlosa
    (Originally posted on Super User, not sure if it might be better suited here.) I have in my home a standard Comcast cable internet connection. It goes from the wall to a cable modem, and from the modem to a late-series Linksys router, which provides wired and wireless networking. The vast majority of the users are wireless connections. For day-to-day tasks this connection is fully sufficient for all my needs. However, on regular occasions we have social gatherings that involve many people bringing laptops and other PCs and using the network and internet simultaneously, frequently for gaming. I have no administrative oversight over these machines; they have been known to be riddled with spyware and/or bloatware, or to be running torrents, legal or otherwise. The only reason I care is that on a regular basis one of the machines will flatline my internet bandwidth and consume all of it to upload/download/spam people/whatever. When this happens, the latency of the connections for gaming and the like becomes unacceptable, and everyone suffers. My question is: is there a system I can set up whereby I can easily monitor the various systems connected to my wireless connection and see how much bandwidth each one is using, and for what? That way, at a glance, I can spot the offending machine and kick it from the connection, without having to go from machine to machine checking each one's "bandwidth used" properties manually and dealing with the owner's indignant protests all the while. I understand this will likely involve third-party software and/or hardware; my issue is I don't even know where to begin.

    Read the article

  • hyper-v cluster behavior when losing network connectivity

    - by ChristopheD
    Setup: a (rather new) Hyper-V R2 cluster with 2 nodes in failover configuration. Physical host OS: Windows Server 2008. About eight VMs (a mix of Windows Server 2008 and Linux). Yesterday we had a power outage of about 15 minutes. Our blades are on UPS, so the physical host machines (Windows Server 2008) never went down. Our main switches are not on UPS (yet), and we saw behaviour similar to the following (as distilled from the event logs): the nodes in the cluster lost their means of communication (because the external switches went down); the cluster wanted to bring down one (the first) of the nodes (to start failover?); that step impacted the clustered storage where the virtual machines' VHDs are located; all VMs were brutally terminated and were found in a failed state in the failover manager on the host OSes; the Linux VMs were kernel panicking and looked like they had their disks ripped out. This whole setup is rather new to us, so we are still learning about it. The question: we are putting the switches on UPS soon, but we were wondering whether the above is expected behaviour (it seems rather fragile) or whether there are obvious configuration improvements for handling such scenarios. I can upload an evtx file showing exactly what was going on if that's necessary.

    Read the article

  • Which is the best smart automatic file replication solution for cloud-storage-based systems?

    - by TORr0t
    I am looking for a solution for a project I am working on. We are developing a web system where people can upload their files and other people can download them (similar to the rapidshare.com model). The problem is that some files will be in much higher demand than others. The scenario is: I have uploaded my birthday video and shared it with all of my friends; I uploaded it to myproject.com and it was stored on one of the clusters, which has a 100 Mbit connection. The problem is that once all of my friends want to download the file, they can't, because the bottleneck is that 100 Mbit link, roughly 12.5 MB per second; with 1000 friends downloading, each one can only get on the order of 12.5 KB per second. (I am not even taking into account that the disk is serving the same file.) My network infrastructure is as follows: a 1 Gbit (client-facing) server connected to 4 storage nodes that have 100 Mbit connections. The 1 Gbit server can handle the traffic of 1000 users if one storage node can stream more than 15 MB per second to the client server, with visitors streaming directly from the client server instead of the storage nodes. I can do this by replicating the file onto 2 nodes, but I don't want to replicate every file uploaded to my network, since that would cost much more. So I need a cloud-based system which will push files onto additional replica nodes automatically when demand for them is high, and delete them from the other nodes when demand is low, so that they stay on only 1 node. I have looked at Gluster and asked in their IRC channel; Gluster can't do such a thing. It can only replicate all of the files or none of them, but I need the cluster software to do this automatically. Any solutions? (Other than recommending Amazon S3.)

    Read the article

  • User permissions on Linux (proftpd / nginx)

    - by user55745
    I've been having a complete nightmare trying to configure proftpd. I've got the ProFTPD server working with an SQL database. However, I want any uploaded files to be viewable by the web server running on the same box. The folders get created in /var/tmp/ as:

      drwx------ 2 ftpuser ftpgroup 4096 Oct  8 20:35 50730c4346512
      drwx------ 2 ftpuser ftpgroup 4096 Oct  8 20:38 50730f3a811ca

    I've tried adding www-data to the group with the following: usermod -g www-data ftpuser. But this doesn't give the web server access. In proftpd.conf I have the following umask set: Umask 0022. It doesn't seem to make a difference what I set that value to. Here are the relevant entries (I'm sure I've messed up one of these two, but I'm getting desperate):

      /etc/group:
        ftpgroup:x:2001:www-data
        www-data:x:33:ftpgroup
      /etc/passwd:
        www-data:x:33:33:www-data:/var/www:/bin/sh
        proftpd:x:108:65534::/var/run/proftpd:/bin/false
        ftp:x:109:65534::/srv/ftp:/bin/false
        ftpuser:x:2001:33:proftpd user www-data:/bin/null:/bin/false

    The ftpuser table in the database has uid/gid set to 2001 for both. I'm going absolutely crazy trying to solve this; any help would be greatly appreciated. P.S. If I manually connect to the FTP server I can upload files via FileZilla; it's the web camera that can't upload, even though there is talky-talky going on between the server and the camera.

    Read the article

  • How can I install .NET Framework 3.5 on XP machines without an internet connection?

    - by EricSchaefer
    I want to install the .NET Framework 3.5 on a couple of machines that do not have internet access. If I install the "no internet access" package it still wants to download something. How can I figure out what is missing? Are there other installation packages? Edit: I would show screenshots, but I cannot upload anything from here and the shots would be in German, so I'll just give the text translated back into English. Installing the "full redistributable package": at the bottom of the license agreement page it displays this text: "Size of download file: 67 MB. Approximate download time: 2 h 44 min (56 kbit/s), 18 min (512 kbit/s)." It shows this text even though I installed Windows Installer 3.1. After agreeing, it displays the "Download and Installation Status" dialog with a progress bar labelled "Download:" and "Status: Connection to server attempted (try X of 5). Total Download Status: 56 MB/67 MB". I tried it in a VM with no network connection; it tries 5 times while the progress bar shows progress. Later the progress bar is labelled "Installation:". Even later it reports problems during setup and offers two buttons, "Send Report Later" and "Don't Send". And now here it comes: "Setup completed" and "Microsoft .NET Framework 3.5 has been deinstalled successfully." (Emphasis is mine.) "It is recommended to install current service packs and security updates. More information at Windows Update (link)." Edit 2: Installed Service Pack 3, but still no success.

    Read the article
