Search Results

Search found 74791 results on 2992 pages for 'remote file inclusion'.

Page 46/2992 | < Previous Page | 42 43 44 45 46 47 48 49 50 51 52 53  | Next Page >

  • What is the maximum number of Remote Desktop connections for a small server?

    - by Jay Wen
    I have a small server running MS Server 2012. The CPU is a Xeon E3-1230 V2 @ 3.30GHz, 4 cores, 8 logical processors, 8 GB RAM. The main HD is a Samsung 840, and the big storage is a 4-disk WD Black RAID 10 array in a Synology NAS enclosure. My question is: given this hardware, approximately how many users can the system support via "Remote Desktop Connection"? Assume there are no licensing limits. These are not admin users; I know there is a two-admin limit. This boils down to: what resources does one remote connection require? RAM? % of the CPU? Network bandwidth? I guess the base case would be a connection where the user is inactive or simply browsing CNN. Once you know this, you know how many you could fit on the machine before something is maxed out. In reality, users would mostly be in Excel (multi-MB spreadsheets). I know the approximate resources currently required by each copy of Excel.
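    A rough way to frame the estimate (not a benchmark): pick a per-session figure for each resource, divide the machine's total by it, and the smallest quotient is the bottleneck. Below is a minimal sketch of that arithmetic; the per-session numbers are placeholder assumptions for a light Excel session, not measured values.

        #include <algorithm>
        #include <cstdio>

        int main()
        {
            // Machine totals from the question.
            const double totalRamMb   = 8 * 1024.0;   // 8 GB RAM
            const double logicalCores = 8.0;          // 4 cores / 8 logical processors

            // Assumptions (placeholders, not measurements):
            const double osRamMb         = 2 * 1024.0; // reserved for the OS and services
            const double ramPerSessionMb = 300.0;      // idle-to-light session running Excel
            const double cpuPerSession   = 0.10;       // average fraction of one logical core

            const double byRam = (totalRamMb - osRamMb) / ramPerSessionMb;
            const double byCpu = logicalCores / cpuPerSession;

            std::printf("RAM-bound limit: ~%.0f sessions\n", byRam);
            std::printf("CPU-bound limit: ~%.0f sessions\n", byCpu);
            std::printf("Estimate:        ~%.0f sessions\n", std::min(byRam, byCpu));
            return 0;
        }

    With those placeholder figures the box is RAM-bound at roughly 20 sessions; substituting measured per-user Excel numbers changes the answer, and disk and network would need the same treatment.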

    Read the article

  • Copied a file with WinSCP; only WinSCP can see it

    - by nilbus
    I recently copied a 25.5 GB file from another machine using WinSCP. I copied it to C:\beth.tar.gz, and WinSCP can still see the file. However, no other app (including Explorer) can see it. What might cause this, and how can I fix it? The details that might or might not matter:

    - WinSCP shows the size of the file (C:\beth.tar.gz) correctly as 27,460,124,080 bytes, which matches the file size on the remote host
    - Neither Explorer, cmd (dir C:\), the 7-Zip archive program, nor any other File Open dialog can see the beth.tar.gz file under C:\
    - I have configured Explorer to show hidden files
    - I can move the file to other directories using WinSCP
    - If I try to move the file to Users/, UAC prompts me for administrative rights, which I grant, and then I get this error: "Could not find this item. The item is no longer located in C:\"
    - When I try to transfer the file back to the remote host into a new directory, the transfer starts successfully and transfers data; it had about 30 minutes remaining when I left it for the night
    - The morning after the file transfer, I was greeted with a message saying that the connection to the server had been lost. I don't think this is relevant, since I did not tell it to disconnect after the file was done transferring, and it likely disconnected after the transfer finished
    - I'm using an old version of WinSCP - v4.1.8 from 2008
    - I can view the file properties in WinSCP: Type of file: 7zip (.gz); Location: C:\; Attributes: none (Read-only, Hidden, Archive, or Ready for indexing); Security: SYSTEM, my user, and the Administrators group have full permissions - everything other than "special permissions" is checked under Allow for all three users/groups (my user, Administrators, SYSTEM)

    What's going on?!

    Read the article

  • C++, ifstream opens local file but not file on HTTP server

    - by fammi
    Hi, I am using ifstream to open a file and then read from it. My program works fine when I give the location of a local file on my system, e.g. /root/Desktop/abc.xxx works fine. But once the location is on the HTTP server, the file fails to open, e.g. http://192.168.0.10/abc.xxx fails to open. Is there any alternative to ifstream when using a URL address? Thanks. The part of the code where I'm having the problem:

        bool readTillEof = (endIndex == -1) ? true : false;

        // Open the file in binary mode and seek to the end to determine file size
        ifstream file ( fileName.c_str ( ), ios::in|ios::ate|ios::binary );
        if ( file.is_open ( ) )
        {
            long size = (long) file.tellg ( );
            long numBytesRead;
            if ( readTillEof )
            {
                numBytesRead = size - startIndex;
            }
            else
            {
                numBytesRead = endIndex - startIndex + 1;
            }

            // Allocate a new buffer ptr to read in the file data
            BufferSptr buf ( new Buffer ( numBytesRead ) );
            mpStreamingClientEngine->SetResponseBuffer ( nextRequest, buf );

            // Seek to the start index of the byte range and read the data
            file.seekg ( startIndex, ios::beg );
            file.read ( (char *)buf->GetData(), numBytesRead );

            // Pass on the data to the SCE and signal completion of request
            mpStreamingClientEngine->HandleDataReceived( nextRequest, numBytesRead );
            mpStreamingClientEngine->MarkRequestCompleted( nextRequest );

            // Close the file
            file.close ( );
        }
        else
        {
            // Report error to the Streaming Client Engine as unable to open file
            AHS_ERROR ( ConnectionManager, " Error while opening file \"%s\"\n", fileName.c_str ( ) );
            mpStreamingClientEngine->HandleRequestFailed( nextRequest, CONNECTION_FAILED );
        }
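    For context, ifstream only reads from the local filesystem; a URL has to be fetched with an HTTP client such as libcurl. Below is a minimal, illustrative sketch of the same byte-range read done over HTTP with libcurl; the URL and range are placeholders taken from the question, and none of this is part of the original code.

        #include <curl/curl.h>
        #include <cstdio>
        #include <string>

        // libcurl calls this for each chunk of the response body.
        static size_t writeToString( char* data, size_t size, size_t nmemb, void* userdata )
        {
            std::string* out = static_cast<std::string*>( userdata );
            out->append( data, size * nmemb );
            return size * nmemb;
        }

        int main()
        {
            const char* url   = "http://192.168.0.10/abc.xxx";   // placeholder URL from the question
            const char* range = "0-1023";                        // placeholder byte range: startIndex-endIndex

            curl_global_init( CURL_GLOBAL_DEFAULT );
            CURL* curl = curl_easy_init();
            if ( !curl ) return 1;

            std::string body;
            curl_easy_setopt( curl, CURLOPT_URL, url );
            curl_easy_setopt( curl, CURLOPT_RANGE, range );              // ask the server for just these bytes
            curl_easy_setopt( curl, CURLOPT_WRITEFUNCTION, writeToString );
            curl_easy_setopt( curl, CURLOPT_WRITEDATA, &body );

            CURLcode res = curl_easy_perform( curl );
            if ( res != CURLE_OK )
                std::fprintf( stderr, "Error while opening URL: %s\n", curl_easy_strerror( res ) );
            else
                std::printf( "Read %zu bytes\n", body.size() );

            curl_easy_cleanup( curl );
            curl_global_cleanup();
            return res == CURLE_OK ? 0 : 1;
        }

    Compile with -lcurl; the fetched bytes could then be handed to mpStreamingClientEngine just as the local-file branch does.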

    Read the article

  • How do I open a file in such a way that if the file doesn't exist it will be created and opened automatically?

    - by snakile
    Here's how I open a file for writing ("w+"):

        if( fopen_s( &f, fileName, "w+" ) != 0 )
        {
            printf("Open file failed\n");
            return;
        }
        fprintf_s(f, "content");

    If the file doesn't exist, the open operation fails. What's the right way to call fopen if I want the file to be created automatically when it doesn't already exist? EDIT: If the file does exist, I would like fprintf to overwrite the file, not append to it.
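    For reference, "w+" already creates the file when it is missing and truncates it when it exists, so a failure here usually points at an invalid path or missing permissions rather than the mode. A minimal sketch using plain fopen (fopen_s is similar apart from its error-code return and stricter sharing defaults); the file name is an illustrative placeholder:

        #include <cerrno>
        #include <cstdio>
        #include <cstring>

        int main()
        {
            const char* fileName = "output.txt";   // illustrative placeholder

            // "w+" opens for reading and writing, creates the file if it does not
            // exist, and truncates it to zero length if it does - so fprintf
            // overwrites rather than appends.
            FILE* f = std::fopen(fileName, "w+");
            if (f == nullptr)
            {
                // errno explains the real failure (missing directory, permissions, ...).
                std::printf("Open file failed: %s\n", std::strerror(errno));
                return 1;
            }

            std::fprintf(f, "content");
            std::fclose(f);
            return 0;
        }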

    Read the article

  • Problem with file upload in JavaScript

    - by Nikhil
    I have used JavaScript to upload more than one file. When the user clicks 'add more', the JavaScript appends a new file input to the existing div using innerHTML. The problem is that if I select a file and then click 'add more', the new file button appears but the previously selected file is cleared, leaving two blank file buttons. I want the old selection to be kept when a new file button is added. If anybody can help, please do! Thanks.

    Read the article

  • What is the best solution for remote desktop / visual support?

    - by SchizoDuckie
    We are currently investigating different remote-desktop support solutions to help our clients if they have any problems with our software, and I would like some input on the best solutions out there. We have the following needs / wishes:

    - Cross-platform
    - Preferably no installation on the user's end
    - Should penetrate firewalls and not be bothered by antivirus software
    - Should leave no residue behind after support

    I know of VNC, logmeinrescue.com, DameWare remote control, MSN remote desktop and many others, but which one is the best?

    Read the article

  • "Excel cannot open the file 'xxx.xlsx' because the file format is not valid" error

    - by Yavuz
    I suddenly have difficulty opening Word and Excel files. Only particular Office files give me the problem. These files were previously scanned by ComboFix and I believe they were damaged. The error I get from Office is: "Excel cannot open the file 'xxx.xlsx' because the file format is not valid. Verify that the file has not been corrupted and that the file extension matches the format of the file." This is for Excel, and a similar kind of error comes up for Word. The file looks fine, I mean size-wise... Please help me with this problem. I really appreciate your help and time.

    Read the article

  • Upon clicking on a file, Excel opens but not the file itself

    - by william
    Platform: Windows XP SP2, Excel 2007

    Problem description: upon clicking a file in Windows Explorer (either .xls or .xlsx), Excel 2007 opens but does not open the file itself. I need to either click the file again in Windows Explorer or open it manually with File/Open from Excel. Does anyone know what could cause this rather strange behaviour? Older versions of Excel worked "normally", i.e. clicking a file opened Excel along with the file. Please help!

    Read the article

  • How to write files in a specific order?

    - by Bernie
    Okay, here's a weird problem -- my wife just bought a 2014 Nissan Altima. So, I took her iTunes library and converted the .m4a files to .mp3, since the car audio system only supports .mp3 and .wma. So far so good. Then I copied the files to a DOS FAT-32 formatted USB thumb drive and connected the drive to the car's USB port, only to find all of the tracks were out of sequence. All tracks begin with a two-digit numeric prefix, i.e. 01, 02, 03, etc., so you would think they would be in order. So I called Nissan Connect support and the rep told me that there is a known problem with reading files in the correct order: the files are read in the same order they are written. So, I manually copied a few albums with the tracks in a predetermined order, and sure enough he was correct. So I copied about 6 albums for testing, then changed to the top-level directory and did a "find . > music.txt". Then I passed this file to rsync like this:

        rsync -av --files-from=music.txt . ../Marys\ Music\ Sequenced/

    The files looked like they were copied in order, but when I listed the files in order of modified time, they were in the same sequence as the original files:

        $ cd ../Marys\ Music\ Sequenced/Air\ Supply/Air\ Supply\ Greatest\ Hits
        $ ls -1rt
        01 Lost In Love.mp3
        04 Every Woman In The World.mp3
        03 Chances.mp3
        02 All Out Of Love.mp3
        06 Here I Am (Just When I Thought I Was Over You).mp3
        05 The One That You Love.mp3
        08 I Want To Give It All.mp3
        07 Sweet Dreams.mp3
        11 Young Love.mp3

    So the question is: how can I copy the files listed in music.txt to a destination and ensure the modification times are in the same sequence as the files are listed?
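    One way to attack this, sketched below with C++17 std::filesystem purely as an illustration (the paths and the one-second spacing are assumptions): copy the entries one at a time in the order they appear in music.txt, so both the on-disk write order and the modification times follow the list.

        #include <chrono>
        #include <filesystem>
        #include <fstream>
        #include <string>

        namespace fs = std::filesystem;

        int main()
        {
            const fs::path srcRoot = ".";                         // run from the top-level music directory
            const fs::path dstRoot = "../Marys Music Sequenced";  // illustrative destination

            std::ifstream list("music.txt");
            std::string line;
            auto stamp = fs::file_time_type::clock::now();

            while (std::getline(list, line))
            {
                if (line.empty())
                    continue;

                const fs::path src = srcRoot / line;
                const fs::path dst = dstRoot / line;

                // Directory entries from "find ." just get recreated, not copied.
                if (fs::is_directory(src))
                {
                    fs::create_directories(dst);
                    continue;
                }
                fs::create_directories(dst.parent_path());

                // Copying strictly in list order writes the directory entries sequentially...
                fs::copy_file(src, dst, fs::copy_options::overwrite_existing);

                // ...and stamping each file one second later than the previous one keeps
                // the modification times in the same sequence as the list.
                stamp += std::chrono::seconds(1);
                fs::last_write_time(dst, stamp);
            }
            return 0;
        }

    As the Nissan rep described, the head unit plays files in the order they were written, so copying one at a time onto an empty drive is what actually fixes the playback order; the timestamps mainly make it easy to verify.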

    Read the article

  • rename files with the same name

    - by snorpey
    Hi. I use the following function to rename thumbnails. For example, if I upload a file called "image.png" to an upload folder, and this folder already has a file named "image.png" in it, the new file automatically gets renamed to "image-copy-1.png". If there is also a file called "image-copy-1.png", it gets renamed to "image-copy-2.png", and so on. The following function returns the new filename. At least that's what it is supposed to do... The renaming doesn't seem to work correctly, though. Sometimes it produces strange results, like:

        1.png
        1-copy-1.png
        1-copy-2.png
        1-copy-2-copy-1.png
        1-copy-2-copy-3.png

    I hope you understand my problem, despite my description being somewhat complex... Can you tell me what went wrong here? (Bonus question: are regular expressions the right tool for this kind of thing?)

        <?php
        function renameDuplicates($path, $file)
        {
            $fileName = pathinfo($path . $file, PATHINFO_FILENAME);
            $fileExtension = "." . pathinfo($path . $file, PATHINFO_EXTENSION);

            if (file_exists($path . $file)) {
                $fileCopy = $fileName . "-copy-1";

                if (file_exists($path . $fileCopy . $fileExtension)) {
                    if ($contains = preg_match_all("/.*?(copy)(-)(\\d+)/is", $fileCopy, $matches)) {
                        $copyIndex = $matches[3][0];
                        $fileName = substr($fileCopy, 0, -(strlen("-copy-" . $copyIndex))) . "-copy-" . ($copyIndex + 1);
                    }
                } else {
                    $fileName .= "-copy-1";
                }
            }

            $returnValue = $fileName . $fileExtension;
            return $returnValue;
        }
        ?>
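    As a point of comparison, the intended behaviour - keep bumping the suffix until a name is free - needs a loop over the existence check rather than a single pattern match. A minimal sketch of that idea, written in C++17 only for illustration (the function and directory names are assumptions, not part of the original PHP):

        #include <filesystem>
        #include <iostream>
        #include <string>

        namespace fs = std::filesystem;

        // "image.png" stays "image.png" if it is free; otherwise try
        // "image-copy-1.png", "image-copy-2.png", ... until one is unused.
        std::string renameDuplicates(const fs::path& dir, const std::string& file)
        {
            const fs::path p(file);
            const std::string stem = p.stem().string();       // "image"
            const std::string ext  = p.extension().string();  // ".png"

            if (!fs::exists(dir / file))
                return file;

            for (int i = 1; ; ++i)
            {
                std::string candidate = stem + "-copy-" + std::to_string(i) + ext;
                if (!fs::exists(dir / candidate))
                    return candidate;
            }
        }

        int main()
        {
            std::cout << renameDuplicates("uploads", "image.png") << "\n";
        }

    The same loop works in PHP with file_exists; a regular expression is only needed if existing "-copy-N" names have to be parsed rather than simply regenerated.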

    Read the article

  • Basic question on C++ header file inclusion

    - by siva
    What are the differences between the three programs below? Is <iostream> a header file or the C++ standard library?

    1.

        #include<iostream>
        using namespace std;
        int main() { return 0; }

    2.

        #include<iostream>
        int main() { return 0; }

    3.

        #include<iostream.h>
        int main() { return 0; }

    Thanks in advance.

    Read the article

  • What Remote Desktop Solution Do You Use To Service Your Clients' PCs? [closed]

    - by Sootah
    Possible Duplicate: What's the best Remote Desktop Application?

    I am the owner of a local computer repair business that primarily services its clients on-site. On the occasions that we do service the machines in the office, we generally have one of our techs pick the computer up while they are out and about and bring it back with them. Only rarely will we require the customer to bring us the computer themselves. In order to reduce costs, be much more efficient, and potentially expand our market far beyond what would be feasible with travel required, I am looking at ways that we can service our clients remotely whenever possible.

    What we're in need of is a solid remote desktop application that will be incredibly easy for our customers to connect to, as well as robust enough that we don't need the client babysitting the computer during the entire repair. Ideally I would like to use a web-based solution so that we don't have to walk the customers through installing, connecting, and configuring it over the phone - that would be unacceptable given the level of service they are used to. Effectively we'd want them to be able to just go to a URL, enter a PIN or something, and then they are connected and ready to rumble. (Obviously the option to just email them a link that'd do all this for them would be what we'd be aiming for.) Along with the ease-of-use factor, we would need the product to not require any further intervention on the part of the client after we have connected. Nobody is going to be happy if we have to call them every 15 minutes so they can reconnect to us every time we reboot - so auto-reconnect is an absolute must.

    The only product I know of right now that does any of this is LogMeIn Rescue. It allows unattended access, the applet is lightweight and installs quickly, and the customer can either enter a PIN on the site or just click a link emailed to them in order to connect. The only real downside I see to LogMeIn Rescue is that it's $120.00/month per technician. While we'd ultimately end up saving far more than that per month just in fuel costs alone, I'd like to explore any other options out there that I may not have come across.

    Are there any equally good products out there? If so, what are they, why do you recommend them, how have you been utilizing them yourself, and what do they cost?

    Read the article

  • Best Solution for Load Balancing NFS File Access?

    - by DairyKnight
    I'm trying to find an optimal solution for accessing the NFS file share in my company. We have a central file server in North America that has 30-50 GB of updated data every day, and it's very slow for our Europe and Asia branches to access it directly. Therefore, I'm trying to set up two replica servers on those continents. I'm currently using rsync, but I wonder if there is a better solution that acts more like a distributed RAID, allowing users to transparently access a file whether it has been synced or not; a user request would be dispatched to the remote server if the file is not yet synced. I'm now looking into DRBD, but it doesn't seem to have the ability to auto-dispatch requests. Does anyone know if there's a better solution?

    Read the article

  • Corrupt file indicative of corrupt hard drive?

    - by Elipsicon
    I have noticed that two files on my (almost full) 2 TB hard drive have been corrupted. One file has 20 kB (!) corrupted, i.e. 20 consecutive kB have changed, even though the modification date of the file hasn't changed and I haven't worked with this file for over a year. This tells me that something "below" the file system level has messed with the data, and the only thing I can think of is hardware failure, most likely hard disk failure. I've already tested my RAM and it works flawlessly. I'm using ext4 on Linux, if that is of any help. Is this normal? Is it time to replace my hard drive before something worse happens? What can I do to prevent this from happening in the future? Is there some built-in feature of, or extension to, ext4 that includes additional error-correcting codes and/or watches files for changes that haven't been caused by the OS?

    Read the article

  • Puppet file transfer slow

    - by Noodles
    I have a Puppet master and slaves in different datacenters. The latency between them is ~40 ms. When I run "puppet agent --test" on a slave to apply the latest manifest, it takes ~360 seconds to finish. After doing some digging I can see the main cause of the slowdown is file transfers: it seems to take ~10 seconds to transfer each file. The files are only small (configuration files), so I can't understand why they would take so long. This is an example of a file in my manifest:

        file { "/etc/rsyncd.conf":
            owner  => "root",
            group  => "root",
            mode   => 644,
            source => "puppet:///files/rsyncd/rsyncd.conf",
        }

    Running puppet-profiler I see this:

        10.21s - File[/etc/rsyncd.conf]

    It also seems I cannot update more than one server at once using Puppet. If I run two servers at the same time, Puppet takes twice as long. I have changed the Puppet master from using WEBrick to Mongrel, but this doesn't seem to help. This is making deploying changes painful. A simple config change can take an hour to roll out to all servers.

    Read the article

  • How to create a snapshot volume on a remote server using KVM?

    - by Purres
    I want to back up a few virtual machines to a backup server. Here are the backup steps:

    1. suspend a virtual machine
    2. create a snapshot of the virtual machine using lvcreate -s
    3. resume the virtual machine
    4. dd if=/virtual_machine_path | lzop > /temp/backup.lzo
    5. rsync /temp/backup.lzo -e "ssh" 1.2.3.4:/backup_path/

    However, the hypervisor server doesn't have enough hard disk space to create the snapshot in step 2. Is there a way to create a logical volume snapshot on a remote server?

    Read the article

  • Windows File System Analysis

    - by bouvierr
    I am looking for a FREE tool to perform analyses on the NTFS file system of my Windows 7 PC. I want to easily see how data is distributed throughout the entire file system. The following applications seem very good, but they are not free and are probably overkill for my requirements:

    - FolderSizes 5
    - MailMeter Windows File System Reporting Tool

    I am aware that some applications (like Folder Size 2.5) can add a column in Windows Explorer to show the size of each folder, but I am looking for something more like a reporting tool. Thank you for your suggestions.

    Read the article

  • OSX: Selecting default application for all unknown and different file types (extensions)

    - by Leo
    I work in cluster computing and am using Mac OS X 10.6. I send off hundreds of computing jobs a day, and each one comes back with a different extension. For example, svmGeneSelect.o12345, which is the std output of my svmGeneSelect job, job number 12345. I don't control the extensions. All files are plain text. I want OS X to open any file extension it hasn't seen before with my favorite text editor when I click on it, or even better, to set up default file associations for extension patterns, i.e. TextEdit for extensions matching *.o*. I do NOT want to create file associations for individual files, since each extension will only ever exist once, and I do not want to go through the process of selecting the application to use for each file. Thanks for any help you can offer.

    Read the article

  • How do I remove the on-screen keyboard from the logon screen in Windows Remote Desktop Server 2008 R2?

    - by Gomibushi
    The on-screen keyboard (OSK) from the "Ease of Access" tools pops up on EVERY connect to the server, even if you have not activated it. I can't seem to find a control panel or registry setting to switch it off. It is VERY "in your face" for Linux users who connect at lower resolutions and do not provide all credentials, but have to type a username and password. I'm running a 2008 R2 Terminal Server/Remote Desktop Server.

    Read the article
