Search Results

Search found 77950 results on 3118 pages for 'large file upload'.

Page 130/3118 | < Previous Page | 126 127 128 129 130 131 132 133 134 135 136 137  | Next Page >

  • What are people using as Login scripts in large enterprises

    - by beakersoft
    Hi, we have recently been tasked with looking after the user login side of things in our enterprise (Windows clients in Active Directory). We have a system at the moment that uses a VBScript login/logoff script to call a couple of DLLs written in VB6. The DLLs' actions are controlled by some config files based on users/groups, which are administered from a central app. This is quite a good system, but we kind of want to move away from VB6 for the DLLs (maybe port them to C++, but then you have to make them COM+ to call them from VBScript etc.) and possibly away from VBScript for the actual login scripts themselves. Just wondered what other people are using and what people can suggest. Thanks, Luke

    Read the article

  • BitTorrent Myth

    - by Moon .
    In the BitTorrent statistics there is a field "Total Ratio", which is the ratio between total downloads and uploads. I have heard that this ratio affects BitTorrent's performance: if the ratio is good, the BitTorrent network serves you at a higher priority, and if the ratio is low (fewer uploads), it serves you at average or below-average priority. Is there anything to that?

    Read the article

  • Encrypt Data Prior to Upload

    - by TheW
    I'm looking to store some data online but I want to encrypt the files first. Since I understand that sFTP will only encrypt the transmission of the data, I'm wondering what program others use to encrypt their files prior to sFTPing them to a backup server. Thanks.
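
    If an off-the-shelf tool such as GPG or 7-Zip doesn't fit the workflow, the encryption step can also be scripted before the sFTP transfer. Below is a minimal C# sketch using the .NET Aes class; the file paths are made up for illustration, and key/IV management (the hard part) is left to you:

        using System;
        using System.IO;
        using System.Security.Cryptography;

        class EncryptBeforeUpload
        {
            // AES-encrypt a file so only the ciphertext ever reaches the backup server.
            static void EncryptFile(string inputPath, string outputPath, byte[] key, byte[] iv)
            {
                using (var aes = Aes.Create())
                {
                    aes.Key = key;
                    aes.IV = iv;
                    using (var input = File.OpenRead(inputPath))
                    using (var output = File.Create(outputPath))
                    using (var crypto = new CryptoStream(output, aes.CreateEncryptor(), CryptoStreamMode.Write))
                    {
                        input.CopyTo(crypto);
                    }
                }
            }

            static void Main()
            {
                byte[] key, iv;
                using (var aes = Aes.Create())
                {
                    // Aes.Create() generates a random key and IV; in practice you must
                    // store these safely or the backup can never be decrypted.
                    key = aes.Key;
                    iv = aes.IV;
                }

                EncryptFile(@"C:\backup\data.zip", @"C:\backup\data.zip.enc", key, iv);
            }
        }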

    Read the article

  • How do I remove partitions in the 'Places' section of the Gnome file dialog?

    - by Grumbel
    In the Gnome file dialog under 'Places' I can add and edit bookmarks to directories. That list however not only contains my bookmarks, but also a list of all partitions on the system that Gnome seems to gather automatically. I can't edit that list as the right-click-menu items for that are greyed out. How can I get rid of those automatically generated entries or limit it to just my bookmarks?

    Read the article

  • How to bring Paging File usage metric to zero?

    - by AngryHacker
    I am trying to tune a SQL Server. Per Brent Ozar's performance tuning video, PerfMon's Paging File: %Usage should be zero or ridiculously close to it. The average value on my box is around 1.341%. The box has 18 GB of RAM, SQL Server is off, and the Commit Charge Total is 1 GB, yet the PerfMon metric is not 0. The Performance tab of Task Manager states that PF Usage is 1.23 GB. What should I do to better tune the box?

    Read the article

  • How to manage large number of desktop VMs?

    - by symcbean
    I'm looking at the feasibility of providing remote access to multiple virtual machines. The VMs themselves will provide user desktops. To make best use of the available resources, I'd like the VMs to hibernate when the user disconnects. Which implies being able to start them up when a user connects. Ideally each user would 'own' a VM image - but if not then I'd require that the session was terminated. Obviously this would require the remote access protocol to be tied into the VM management. Is there anything out there to provide such functionality? (extra credit for open protocols! ;)

    Read the article

  • How to extract file paths out of drag and drop event?

    - by trismarck
    I have this application that lists files in a Windows Forms listbox (.NET Framework). The application does not support the copy operation if multiple files are selected in the listbox, but at the same time it supports the 'drag and drop' event for multiple files (it allows dragging the files out of the application). How can I extract the paths of the files dragged out of the application? (i.e. I drop the files onto some program/script that shows me the paths or saves them to a txt file.)
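
    Since the paths travel in the drop data itself, one way to capture them is a tiny drop target of your own. The sketch below is a hedged example, not part of the original application: a bare WinForms window that accepts the drop and appends the paths to a text file (the output file name is made up):

        using System;
        using System.IO;
        using System.Windows.Forms;

        class DropTargetForm : Form
        {
            public DropTargetForm()
            {
                Text = "Drop files here";
                AllowDrop = true;

                DragEnter += (s, e) =>
                {
                    // Only accept drops that carry file paths.
                    if (e.Data.GetDataPresent(DataFormats.FileDrop))
                        e.Effect = DragDropEffects.Copy;
                };

                DragDrop += (s, e) =>
                {
                    // FileDrop data is an array of full paths.
                    var paths = (string[])e.Data.GetData(DataFormats.FileDrop);
                    File.AppendAllLines("dropped-paths.txt", paths);
                };
            }

            [STAThread]
            static void Main()
            {
                Application.Run(new DropTargetForm());
            }
        }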

    Read the article

  • How large should an administration team be? [closed]

    - by Artyom
    I'm trying to find an answer to how many server administrators/technicians are required to run a server farm with 24/7 availability of, say, 10, 100, or 1000 Linux servers. Are there any studies on this? Edit: I did not expect this question to be closed. There are lots of studies in, for example, software development, where from "lines of code" you can approximate the development cost (COCOMO), so I was searching for something similar in administration. Note, I understand 100% that this is not a straightforward or easy-to-answer question, but it is a real question...

    Read the article

  • MySQL : table organisation for very large sets with high update frequency

    - by Remiz
    I'm facing a dilemma in the choice of my MySQL schema for my application. Before I start, here is an extremely simplified picture of my database (schema here: http://i43.tinypic.com/2wp5lxz.png). In one sentence: for each customer, the application harvests text data and attaches tags to each piece of data collected. As an approximation of the usage of each table, here is what I expect:

        customer: ~5000, shouldn't grow fast
        data: 5 million per customer, could double or triple for big customers
        tag: ~1000, quite fixed size
        data_tag: hundreds of millions per customer easily; each data row can be tagged a lot

    The harvesting process is permanent, meaning that around every 15 minutes new data arrives and is tagged, which requires very constant index refreshing. A lot of my queries are a SELECT COUNT of DATA between specific DATES, tagged with a specific TAG, on a specific CUSTOMER (very rarely will it involve several customers). You can imagine that with this kind of data volume I'm facing a challenge in terms of data organisation and indexing. Again, this is a very minimalistic and simplified version of my structure. My question is, is it better:

        to stick with this model and manage crazy index optimization (which potentially means billions of rows in the data_tag table), or
        to change the schema and use one data table and one data_tag table per customer (which means having 5000 tables in my database)?

    I'm running all of this on a dedicated, replicated MySQL 5.0 server (quad-core, 8 GB of RAM). I only use InnoDB; I also have another server that runs Sphinx. Knowing all of this, I can't wait to hear your opinion. Thanks.

    Read the article

  • large RAID 10 vs small RAID1

    - by user116399
    The machine will store and serve millions of small files (<15 KB each), and all those files require a total storage space of 400 GB. Considering the exact same SATA hard drive maker and model, in the exact same environment (OS, CPU, RAM, RAID controller, etc.), which one of the setups below would be faster? A) RAID 1 with 2 drives of 2 TB each, making up total storage of 2 TB. B) RAID 10 with 4 drives of 2 TB each, making up total storage of 4 TB. [EDIT]: I'm aware RAID 10 is faster than RAID 1. The larger the disk, at least in theory, the longer seeks/writes will take. So, will the performance gain of RAID 10 be outweighed by the drag caused by the larger disk area when seek/write operations happen?

    Read the article

  • How to execute a batch file each time a user logs in?

    - by user841923
    I've written a batch script which copies some files from the CommonAppData folder (C:\ProgramData) to the logged-in user's Local AppData. What I would like to do is execute this script for every user, every time they log in. I found many articles about executing batch files on startup, but I would like to know how to do the same on each login. I've written a batch file and copied it into C:\Windows\System32\GroupPolicy\User\Scripts\Logon, but it does not seem to be working.

    Read the article

  • IIS large amount of time before loading

    - by Lukes123
    I am running an ASP.NET 3.5 website on IIS 6 with Server 2003. Whenever I modify any of the ASPX files, any page on the site then takes about 2-3 minutes before it starts to load. Even the smallest modification causes this to happen. Why is this?

    Read the article

  • Combine VPN bandwidth over two or more WAN connections? Load balancing?

    - by mistrfu
    Imagine you only have DSL with 5 Mbps down and 2 Mbps up. Is it possible to have 10 of these, for example, and combine them in a way that would increase the upstream bandwidth to one server? In my head it works like this:

        intranet with one gateway/router
        router connected to a multi-WAN load balancer
        on each balancer WAN port, a router with a VPN client set up, tunneling to a server
        ?some? software on the server in the cloud joining all these connections into one interface again

    I would need this mostly for big uploads to a server; downlink to the office is not that important at all. Does it even make sense? I drew an image to clarify.

    Read the article

  • Using Dropbox as a cloud based file share - does everyone need an upgraded account?

    - by aSkywalker
    We have a file share in our small office (3-5 users). We now have the need for the files to be accessible outside of the office. I like the idea of dropbox - we have been using it for small remote sharing. If we buy the upgrade account, and move 30 to 70 gigs of files to it, will every user have to have the pro account? I have submitted this question to dropbox - but thought that the advice of users here would also be valuable

    Read the article

  • Default profile for large

    - by user63434
    Hi, I am setting up a master image to clone to all clients of the same machine type (Windows 7). I logged in as administrator, installed all the programs, and changed the desktop settings etc., but my local administrator profile is 244 MB in size, which will become the default profile of the local machine after sysprep. We have a 2003 server, and I want to use a mandatory profile for all users who log in, which means I need to copy this profile to the server so that when any user logs in to the domain they use this profile. Loading a 244 MB profile is going to be very slow, since it will be removed from the client when they log off, so the next time they log in it will take a long time again. Is there anything I can do? Can I copy just the bare minimum files from the default profile to the server? I am not sure what parts I need; I read that I must copy My Documents and My Documents/Pictures so that folder redirection will work. What else do I need to copy to the server? I also have Firefox Xmarks sync and MS Word etc. Thanks

    Read the article

  • SVN Checkout error on large repositories

    - by Brian Mitchell
    I wonder if anyone can help me. We have recently migrated our Subversion repository from a VisualSVN Server on Windows to a Subversion server on CentOS. The migration was successful, however we are getting the following error message:

        svn: REPORT of '/svn/MangoRepository/!svn/vcc/default': Could not read chunk size: connection was closed by server (http://servername)

    The workaround for this is simply to perform an update on the repo and it will continue where it left off. I'm just wondering if anyone has a permanent fix for this, as it can be quite frustrating to repeat myself to 60-70 developers.

    Read the article

  • Why is execution of batch files different between drag & drop and from command line?

    - by Dharma Leonardi
    Ok, so I've been trying to figure this out for hours with no progress. I have created a batch file to get details of a VHD. Everything runs fine and produces the expected results when run from the command line in a command prompt. However, when I use drag and drop from File Explorer (dragging a VHD file and dropping it onto the batch file), the batch file runs without errors but the output (VHD.INFO) is empty. I'm stumped. Edited to only include the behaviour:

        @echo off
        cls
        setlocal enabledelayedexpansion
        set "_PATH.THIS=%~dp0"
        echo HELP | diskpart > %_PATH.THIS%OUTPUT.TMP
        TYPE %_PATH.THIS%OUTPUT.TMP
        PAUSE

    To demonstrate the different behaviour, please run the batch file from the command line once (works) and also run the batch file by double clicking in File Explorer (failure in all piping commands).

    Read the article

  • NOTEPAD++ Need macro or typeitin for automation of large lists

    - by user2526699
    I'm sure there is a way to do this but I cannot seem to figure it out; I will try my best to explain. I have a list with 20,000 lines in Notepad++ and two tabs open. The right-side tab is the main list; the left-side tab is what needs to be added to the beginning of each line in the right tab. Here is an image of my Notepad++ to give you a better understanding. I need to do the following in an automated way, as I have over 20,000 lines to process:

        copy line 1 of tab 'new 7'
        switch to tab 'new 6'
        paste clipboard (line 1 of tab 'new 7') at the beginning of line 1 of tab 'new 6'
        switch back to tab 'new 7'
        copy line 2 of tab 'new 7'
        switch to tab 'new 6'
        paste clipboard (line 2 of tab 'new 7') at the beginning of line 2 of tab 'new 6'

    I have both PasteItIn and TypeItIn downloaded, but if I need some other program/app, or if it's built into Notepad++, that would be great. I need the program to do this by itself, or for me to only have to press a button for each step.
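
    If running a short program outside Notepad++ is acceptable, the whole task reduces to pairing line N of one tab with line N of the other. Below is a minimal C# sketch under that assumption; the file names are hypothetical and stand for the two tabs saved to disk ('new 7' holding the prefixes, 'new 6' the main list):

        using System.IO;
        using System.Linq;

        class PrefixLines
        {
            static void Main()
            {
                // Read both saved tabs lazily, line by line.
                var prefixes = File.ReadLines("new7.txt");   // text to prepend
                var mainList = File.ReadLines("new6.txt");   // main 20,000-line list

                // Pair line N of the prefix file with line N of the main list
                // and write "prefix + line" out, one result per line.
                var merged = prefixes.Zip(mainList, (prefix, line) => prefix + line);
                File.WriteAllLines("merged.txt", merged);
            }
        }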

    Read the article

  • Large recovery partitions

    - by Unsigned
    Is there any good reason why factory restore partitions are generally much larger than they need to be? Examples I have found in my own experience:

        Dell XPS laptop - Partition: 13.67 GB, Used: 6.68 GB
        Dell Inspiron laptop - Partition: 14.7 GB, Used: 7.2 GB
        Toshiba laptop - Partition: 15.3 GB, Used: 9 GB

    In all cases, shrinking the partition to only slightly more than the used space had no ill effects on future factory restorations. Why the exorbitant amount of extra space, given that none of the three computers ever writes any data to the recovery partition? Is there a good reason I'm overlooking?

    Read the article

  • How do I get SSIS Data Flow to put '0.00' in a flat file?

    - by theog
    I have an SSIS package with a Data Flow that takes an ADO.NET data source (just a small table), executes a select * query, and outputs the query results to a flat file (I've also tried just pulling the whole table and not using a SQL select). The problem is that the data source pulls a column that is a Money datatype, and if the value is not zero, it comes into the text flat file just fine (like '123.45'), but when the value is zero, it shows up in the destination flat file as '.00'. I need to know how to get the leading zero back into the flat file. I've tried various datatypes for the output (in the Flat File Connection Manager), including currency and string, but this seems to have no effect. I've tried a case statement in my select, like this:

        CASE WHEN columnValue = 0 THEN '0.00' ELSE columnValue END
        (still results in '.00')

    I've tried variations on that, like this:

        CASE WHEN columnValue = 0 THEN convert(decimal(12,2), '0.00') ELSE convert(decimal(12,2), columnValue) END
        (still results in '.00')

    and:

        CASE WHEN columnValue = 0 THEN convert(money, '0.00') ELSE convert(money, columnValue) END
        (results in '.0000000000000000000')

    This silly little issue is killin' me. Can anybody tell me how to get a zero Money datatype database value into a flat file as '0.00'?
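
    Not an SSIS-specific answer, but one workaround people use is to hand the destination a string that is already formatted, whether via a Derived Column expression or a Script Component. As a rough illustration of the formatting rule only (plain C# with hypothetical values, not the package's actual column names), .NET's "0.00" format string keeps the leading zero:

        using System;
        using System.Globalization;

        class MoneyFormat
        {
            static void Main()
            {
                decimal zero = 0m;
                decimal nonZero = 123.45m;

                // "0.00" forces at least one digit before the decimal point
                // and exactly two digits after it.
                Console.WriteLine(zero.ToString("0.00", CultureInfo.InvariantCulture));    // 0.00
                Console.WriteLine(nonZero.ToString("0.00", CultureInfo.InvariantCulture)); // 123.45
            }
        }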

    Read the article

  • xcodebuild: error: 'file' is not a workspace file

    - by Vladimir Voitekhovski
    There was no problem before, but now my job in Jenkins CI fails. I tried renaming the Xcode workspace to RacingPost.xcworkspace, with no change; deleting this option altogether makes the job fail as well. Configuration:

        XCODEPROJECTDIRECTORY=.
        TARGET_BUILD_DIR=${XCODEPROJECTDIRECTORY}/build
        XCODEWORKSPACE=RacingPost.xcodeproj
        XCODESCHEME=UnitTests
        XCODECONFIGURATION=ENTERPRISE-HD
        XCODESDK=iphoneos
        XCODEARGS="TEST_AFTER_BUILD=YES"
        XCODEBUILD_APP_NAME=RacingPost.app
        XCODETARGET=UnitTests
        XCODEPROJECT=RacingPost.xcodeproj
        IPA_PATH=${TARGET_BUILD_DIR}/RacingPost.ipa

        xcrun -sdk iphoneos PackageApplication -v "$(${XCTOOL_HOME}/xctool.sh -scheme RacingPost -project ${XCODEPROJECT} -configuration "${XCODECONFIGURATION}" -sdk "${XCODESDK}" -showBuildSettings -workspace ${WORKSPACE} | grep TARGET_BUILD_DIR | cut -d = -f 2 | cut -d . -f 1 | head -1 | sed 's/^[ ^t]*//')/${XCODEBUILD_APP_NAME}" -o "${IPA_PATH}"

    Output:

        17:33:25 ** BUILD SUCCEEDED ** (58104 ms)
        17:33:27 [RGP-ODC_RacingPost_iPad_staging] $ /bin/bash -xe /var/folders/df/575wx61n4dzdlw_48pgsjwk40000gn/T/hudson6052564280091633098.sh
        17:33:27 + cd .
        17:33:27 ++ /Users/epadmin/ci-tools/xctool/xctool.sh -scheme RacingPost -project RacingPost.xcodeproj -configuration ENTERPRISE-HD -sdk iphoneos -showBuildSettings -workspace /Users/epadmin/jenkins-slave/workspace/RGP-ODC_RacingPost_iPad_staging
        17:33:27 ++ grep TARGET_BUILD_DIR
        17:33:27 ++ cut -d = -f 2
        17:33:27 ++ cut -d . -f 1
        17:33:27 ++ head -1
        17:33:27 ++ sed 's/^[ ^t]*//'
        17:33:30 xcodebuild: error: '/Users/epadmin/jenkins-slave/workspace/RGP-ODC_RacingPost_iPad_staging' is not a workspace file.
        17:33:30 + xcrun -sdk iphoneos PackageApplication -v /RacingPost.app -o ./build/RacingPost.ipa
        17:33:30 error: Specified application doesn't exist or isn't a bundle directory : '/RacingPost.app'
        17:33:30 Build step 'Execute shell' marked build as failure

    My structure of folders on the node:

        epadmin@epclus1macp02:~/jenkins-slave/workspace/RGP-ODC_RacingPost_iPad_staging$ ls
        drwxr-xr-x 21 epadmin staff 714B May 29 10:28 ./
        drwxr-xr-x 37 epadmin staff 1.2K May 29 10:33 ../
        drwxr-xr-x  7 epadmin staff 238B May 29 10:28 .svn/
        drwxr-xr-x 10 epadmin staff 340B Apr 18 09:02 RacingPost/
        drwxr-xr-x 10 epadmin staff 340B May 29 09:27 RacingPost.xcodeproj/
        drwxr-xr-x 10 epadmin staff 340B May  7 09:04 RacingPostUtilApp/
        drwxr-xr-x  3 epadmin staff 102B May 18 02:16 Source/
        drwxr-xr-x 75 epadmin staff 2.5K Apr 16 09:03 UnitTests/
        drwxr-xr-x 16 epadmin staff 544B May 29 09:05 _certs/
        drwxr-xr-x  3 epadmin staff 102B Apr 15 09:03 _doc/
        drwxr-xr-x  4 epadmin staff 136B Apr 15 09:03 _provisioning/
        drwxr-xr-x  3 epadmin staff 102B May 29 10:31 build/
        -rw-r--r--  1 epadmin staff 9.9K May 19 04:39 build.xml

    Read the article

  • c#: how to read parts of a file? (DICOM)

    - by Xaisoft
    I would like to read a DICOM file in C#. I don't want to do anything fancy; for now I just want to know how to read in the elements, but first I would like to know how to read the header to check whether it is a valid DICOM file. It consists of binary data elements. The first 128 bytes are unused (set to zero), followed by the string 'DICM'. This is followed by header information, which is organized into groups. A sample DICOM header:

        First 128 bytes: unused, followed by the characters 'D','I','C','M'
        Followed by extra header information such as:
        0002,0000, File Meta Elements Groups Len: 132
        0002,0001, File Meta Info Version: 256
        0002,0010, Transfer Syntax UID: 1.2.840.10008.1.2.1
        0008,0000, Identifying Group Length: 152
        0008,0060, Modality: MR
        0008,0070, Manufacturer: MRIcro

    In the above example, the header is organized into groups. Group 0002 hex is the file meta information group, which contains three elements: one defines the group length, one stores the file version, and the third stores the transfer syntax. Questions: How do I read the header and verify that it is a DICOM file by checking for the 'D','I','C','M' characters after the 128-byte preamble? How do I continue to parse the file, reading the other parts of the data?
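
    For the first question, a minimal C# sketch (assuming only the layout described above: a 128-byte preamble followed by the ASCII marker 'DICM') could look like this; parsing the actual data elements afterwards is a separate job:

        using System;
        using System.IO;
        using System.Text;

        class DicomCheck
        {
            // Returns true if the file starts with a 128-byte preamble
            // followed by the ASCII marker "DICM".
            static bool LooksLikeDicom(string path)
            {
                using (var stream = File.OpenRead(path))
                using (var reader = new BinaryReader(stream))
                {
                    if (stream.Length < 132)
                        return false;

                    reader.ReadBytes(128);               // skip the unused preamble
                    byte[] magic = reader.ReadBytes(4);  // should be 'D','I','C','M'
                    return Encoding.ASCII.GetString(magic) == "DICM";
                }
            }

            static void Main(string[] args)
            {
                Console.WriteLine(LooksLikeDicom(args[0]) ? "Valid DICOM header" : "Not a DICOM file");
            }
        }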

    Read the article

  • Why are my attempts to open a file using open for writing failing? Ada 95

    - by mat_geek
    When I attempt to open a file to write to, I get an Ada.IO_Exceptions.Name_Error. The procedure call is Ada.Text_IO.Open. The file name is "C:\CC_TEST_LOG.TXT". This file does not exist. This is on Windows XP on an NTFS partition. The user has permissions to create and write to the directory. The filename is well under the WIN32 max path length.

        name_2 : String := "C:\CC_TEST_LOG.TXT";

        if name_2'last > name_2'first then
           begin
              Ada.Text_IO.Open(file, Ada.Text_IO.Out_File, name_2);
              Ada.Text_IO.Put_Line(
                 "CC_Test_Utils: LogFile: ERROR: Open, File " & name_2);
              return;
           exception
              when The_Error : others =>
                 Ada.Text_IO.Put_Line(
                    "CC_Test_Utils: LogFile: ERROR: Open Failed; " &
                    Ada.Exceptions.Exception_Name(The_Error) &
                    ", File " & name_2);
           end;
        end if;

    Read the article

  • How to generate Thumbnails with phpThumb and save it into a file?

    - by CuSS
    Hi all, I want to know how to generate thumbnails with the phpThumb class from an array of file paths I already have, and then save the result of each image to another path but with the same name. Thank you all ;) EDIT: My code currently looks something like this:

        echo "A iniciar gerador de miniaturas para a área de cliente: \n";
        include $wincli['files']['phpThumbClass'];
        $files = file_list($wincli['dirs']['logos']);
        $phpThumb = new phpThumb();
        foreach ($files as $file) {
            echo " # A converter o ficheiro '".basename($file)."' : ";
            if (is_file($file)) {
                $phpThumb->setSourceFilename($file);
                $phpThumb->setParameter('w', 880);
                $phpThumb->setParameter('h', 241);
                $phpThumb->setParameter('q', 90);
                $phpThumb->setParameter('zc', 1);
                $outputFilename = $wincli['dirs']['logosthumbs'].$file;
                if ($phpThumb->GenerateThumbnail()) {
                    if ($phpThumb->RenderToFile($outputFilename)) {
                        echo "OK \n";
                    } else {
                        echo "Falhou (Ao guardar no ficheiro)\n";
                    }
                } else {
                    echo "Falhou (Ao gerar miniatura)\n";
                }
            } else {
                echo "Falhou (Ficheiro inexistente)\n";
            }
        }

    Read the article
