Search Results

Search found 54055 results on 2163 pages for 'multiple files'.

  • multiple count Pivot table in Excel

    - by Sivakanesh
    Hi all, I'm trying to put together a pivot table from an Excel spreadsheet. The spreadsheet looks similar to the following:

        DeptHead, Emp, Increment
        x, A, 2.5%
        x, B,
        y, C, 1.5%
        y, D,
        y, E, 2.0%

    I would like to make a pivot table that looks like the following:

        DeptHead, CountOfEmp, CountOfIncrement
        x, 2, 1
        y, 3, 2

    So it provides a count of the total number of Emps and the total number of Increments for each DeptHead, ignoring the blanks. I have tried to do this in many ways in the pivot table, but the two counts only appear in rows and not in columns as above. Is there any way to achieve this, please? Thanks
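
    For illustration only, the same two counts can also be produced with plain worksheet formulas, assuming the data sits in columns A (DeptHead), B (Emp) and C (Increment) and "x" is the department head being counted:

        Count of Emps for DeptHead x:        =COUNTIF(A:A, "x")
        Count of Increments for DeptHead x:  =COUNTIFS(A:A, "x", C:C, "<>")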

    Read the article

  • Restart of Master Postgres DB with unconsumed WAL files

    - by Douglas Sellers
    We have a situation where walmanager is being used to ship WAL files between a master and a slave Postgres database. The slave machine has failed and has had to be rebuilt. This has caused a lot of unconsumed WAL files to build up on the master. If a reboot is issued to the Postgres master and there are 24 hours' worth of unconsumed WAL files hanging around, will the master be affected at all, or will it start cleanly?

    Read the article

  • Mercurial Messing Up csproj Files?

    - by alphadogg
    I am using Hg to manage and merge code with three other developers involved in a VS2008 project. We do have an .hgignore file that ignores a fair number of files not necessary to track, such as *.pdb, *.obj, etc. However, we do track .csproj files. Periodically, it seems that files go missing after a merge: we get build issues and have to relocate files which were in the project folders but not in the .csproj file. Eventually, I noted during a merge conflict that sometimes Hg seems to merge incorrectly. Here's a screenshot below. The actual conflict that requires manual intervention is lower in the file, but in this section Hg incorrectly replaces DirectoryTasks.cs with a new, different file called ReportTasks.cs, when in fact both should be added. How do people manage to avoid this?
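
    For context, an .hgignore along the lines the question describes (a sketch only; the actual file is not shown in the question) would look something like:

        syntax: glob
        *.pdb
        *.obj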

    Read the article

  • Vantec NexStar NAS Enclosure - Writing large files

    - by peter
    Hi, I have one of these 'Vantec NexStar LX - NST-475LX-BK' drive enclosures. It is a NAS drive. When I write a file to the device using eSATA or an SMB share, I cannot write files over 2GB. I think this is because the drive is formatted with FAT32. But when I access the device using FTP it doesn't matter; I can write files of any size. For example, I wrote one there last night which was 30GB. Does this make any sense? Why? I guess the most important thing for me is data integrity.

    Read the article

  • How to compare/merge between XIB files?

    - by sasayins
    Hi guys, I have two XIB files. The first one was edited by my friend to add some features, and the other one was edited by myself to add other features. My problem is: how can I merge the two files? I know that XIB files are XML based and I can use some compare tools to merge them, but I think there will be some conflicts. What is the best way to compare or merge XIB files? Thanks a lot guys. sasayins

    Read the article

  • Use a folder of XML files as a data source for NHibernate

    - by Bart Van Eyndhoven
    I'm going to start writing NUnit tests for a few classes in my project. A certain number of these classes use data gathered through NHibernate from a SQL Server 2008 database. The part of the program I'm about to test is very specific (and complicated). Therefore I have made a folder of XML files. Combined, the XML files could reproduce the database structure; I mean each XML file corresponds to a table in the database. The data in the XML files is also consistent with the database. Is there a way to use this folder of XML files as a data source for NHibernate? I mean: can I use NHibernate to gather my test data (which I have specifically chosen) instead of data from the database? In this way, I could usefully test this component without corrupting the (test) database for future tests.

    Read the article

  • How to update application files using patching?

    - by Marek
    I am not interested in any auto-update solution, such as ClickOnce or the MS Updater Block. For anyone feeling the urge to ask why not: I am already using these and there is nothing wrong with them; I would just like to learn about any efficient alternatives. I would like to publish patches: small differences that will modify existing files of the deployment with the smallest possible delta. Not only code needs to be patched, but also resource files. Patching the running code can be accomplished by maintaining two separate synchronized copies of the deployment (no on-the-fly changes to the running executable are required). The application itself can be xcopy-deployed (to avoid MSI auto-correcting the modified files or breaking ClickOnce signatures). I would like to learn how to handle different versions of patches. For example, a patch is issued that fixes one error, and later another patch fixes another error in the same file; users may have any combination of these, and then a third patch arrives. In text files this may be easy to implement, but what about executable files (native Win32 code vs. .NET; any difference)? If the first problem is too hard or impossible to solve for executables, I would like to at least learn if there is a solution that implements simple patching with serial revisions: in order to install revision 5, the user must have all previous revisions installed to ensure validity of the deployment. Are there any existing solutions to accomplish this? NOTE: There are a few questions on SO that may seem like duplicates, but none with a good answer. This question is about the Windows platform, preferably .NET.
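
    For illustration of the kind of per-file delta tooling this implies, binary-diff utilities such as bsdiff/bspatch are often used; a sketch only (the file names are hypothetical, and this says nothing about handling overlapping patches):

        bsdiff  App.v1.dll  App.v2.dll  App.v1-to-v2.patch    # produce the smallest possible delta
        bspatch App.v1.dll  App.v2.dll  App.v1-to-v2.patch    # rebuild the new file from the old one plus the delta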

    Read the article

  • MySQL replication to multiple places

    - by Frederik Nielsen
    Very tricky task to find a good title for this question, but here goes: I have a few development machines on which I develop my PHP applications and test them via a local webserver. This works out pretty well for each machine. However, I would like to replicate the DB from my machines to a central location. So, to sum up:

        DEV1 -> CENTRAL
        DEV2 -> CENTRAL
        DEV3 -> CENTRAL
        CENTRAL -> DEV1
        CENTRAL -> DEV2
        CENTRAL -> DEV3

    I hope this makes sense, as I cannot find an easy way to describe it. Basically, it is a two-way replication where all four databases contain the same info, and each of them can be updated locally and then pushed out to the others. Is this actually doable? All my dev machines are running Windows 7, and my central DB server is running CentOS 6.
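
    For reference, two-way (multi-master) MySQL replication is normally driven by per-server settings along these lines. This is only a sketch of the relevant options with illustrative values, not a recommendation for this particular four-node mesh:

        # my.cnf fragment on one of the servers
        server-id                = 2           # unique on every server
        log-bin                  = mysql-bin   # each server logs its own changes
        auto_increment_increment = 4           # total number of writable servers
        auto_increment_offset    = 2           # this server's slot, to avoid key collisions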

    Read the article

  • Handle multiple DB updates from C# in SQL Server 2008

    - by joeriks
    I'd like to find a way to handle multiple updates to a SQL db (with one single db roundtrip). I read about table-valued parameters in SQL Server 2008 (http://www.codeproject.com/KB/database/TableValueParameters.aspx), which seems really useful. But it seems I need to create both a stored procedure and a table type to use it. Is that true? Perhaps due to security? I would like to run a text query simply like this:

        var sql = "INSERT INTO Note (UserId, note) SELECT * FROM @myDataTable";
        var myDataTable = ... some System.Data.DataTable ...
        var cmd = new System.Data.SqlClient.SqlCommand(sql, conn);
        var param = cmd.Parameters.Add("@myDataTable", System.Data.SqlDbType.Structured);
        param.Value = myDataTable;
        cmd.ExecuteNonQuery();

    So A) do I have to create both a stored procedure and a table type to use TVPs? And B) what alternative method is recommended to send multiple updates (and inserts) to SQL Server?
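
    For context, a table-valued parameter does need a user-defined table type on the server, but not a stored procedure; a minimal sketch of such a type (the name is hypothetical) would be:

        -- created once on the server
        CREATE TYPE dbo.NoteTableType AS TABLE
        (
            UserId INT,
            Note   NVARCHAR(MAX)
        );

    With an ad-hoc text command like the one above, the SqlParameter would also need its TypeName set to that type (e.g. param.TypeName = "dbo.NoteTableType") so SQL Server knows how to interpret @myDataTable.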

    Read the article

  • Exchange 2013 - DNS Records for Accepting Multiple Domains

    - by William
    I have an Exchange 2013 server accepting two domains: domain1.com and domain2.com. All of the Exchange services (OWA, ECP, POP3, SMTP, etc.) can be found via the address mail.domain1.com. So, in the DNS records for domain1, I have the following entries:

        MX record:    mail.domain1.com
        A record:     mail.domain1.com - (IP address of server)
        CNAME record: autodiscover.domain1.com - mail.domain1.com

    Now, for domain2.com, how would I set up the DNS records? Would I have the autodiscover record just be a CNAME for autodiscover.domain1.com? Would this allow me to leverage the certificates that I have installed for domain1?
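
    For illustration, the layout the question is contemplating for domain2.com would look roughly like this in zone-file form (a sketch of the idea only; it does not answer the certificate question):

        domain2.com.               IN  MX     10  mail.domain1.com.
        autodiscover.domain2.com.  IN  CNAME      autodiscover.domain1.com.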

    Read the article

  • Bug: files uploaded via desktop or web client have hidden tag when listed via API

    - by Jon Webb
    Files uploaded to Google Drive sometimes incorrectly have a hidden tag when listed via the Documents List v3 REST API:

        <category scheme='http://schemas.google.com/g/2005/labels'
                  term='http://schemas.google.com/g/2005/labels#hidden' label='hidden'/>

    This happens if:

    - a subfolder is created via the Google Drive desktop client and files are copied in, or
    - a folder is uploaded via the Google Drive web client.

    The folder does not have the hidden tag, but the files that were uploaded do. The files do not have this tag if:

    - they are individually uploaded via the Google Drive web client to the subfolder, or
    - they are uploaded via the REST API to the subfolder, or
    - they are uploaded via the desktop client to the My Drive root.

    The files and folders show up in Google Drive whether they have the hidden tag or not. We're using the API with the following scopes:

        https://docs.google.com/feeds/
        https://spreadsheets.google.com/feeds/
        https://docs.googleusercontent.com/

    I have verified and can recreate this with the OAuth 2.0 playground. Google Drive desktop client version 1.3.3209.2600 on Win7 32-bit. I guess these must be bugs in the API...

    Read the article

  • NFS access from multiple networks

    - by Luke
    On my NFS server (Ubuntu) I have the following in /etc/exports:

        /share 192.168.89.1/24(rw,no_root_squash,async)

    However, I have a new machine which is not in the 192.168.89.* IP range; it's in 192.168.92.* instead. How can I make this machine access my NFS server?
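
    For reference, /etc/exports accepts several client specifications per export line, so one sketch of granting the second network the same options (assuming identical permissions are wanted, followed by an exportfs -ra to re-export) would be:

        /share 192.168.89.1/24(rw,no_root_squash,async) 192.168.92.0/24(rw,no_root_squash,async)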

    Read the article

  • Issue with multiple bridging for KVM hosts

    - by Henry-Nicolas Tourneur
    I'm using KVM and libvirt on my host (Debian Lenny) plus 2 bridges per guest (one for management, one for public traffic). That setup isn't stable at all: sometimes I can ping a management IP, sometimes not. I don't know if my bridging parameters are correct. Could you check them, or point out anything wrong? Please also note that the interface on the guest doesn't flap and that I get no logs on my host. Of course forwarding is enabled. The network configuration is:

        iface eth3 inet manual

        auto bond0
        iface bond0 inet manual
            slaves eth1 eth2
            pre-up ip link set bond0 up
            down ip link set bond0 down

        auto br0
        iface br0 inet static
            address 10.160.0.7
            netmask 255.255.255.128
            bridge_ports eth3
            bridge_fd 9
            bridge_hello 2
            bridge_maxage 12
            bridge_stp off

        auto br0:1
        iface br0:1 inet static
            address 10.160.0.9
            netmask 255.255.255.128

        auto br0:2
        iface br0:2 inet static
            address 10.160.0.10
            netmask 255.255.255.128

        auto br1
        iface br1 inet static
            address 217.4.40.242
            netmask 255.255.255.240
            gateway 217.4.40.241
            pre-up /etc/network/firewall start
            bridge_ports bond0
            bridge_fd 9
            bridge_hello 2
            bridge_maxage 12
            bridge_stp off

        auto br1:1
        iface br1:1 inet static
            address 217.4.40.252
            netmask 255.255.255.240

        auto br1:2
        iface br1:2 inet static
            address 217.4.40.253
            netmask 255.255.255.240

    Read the article

  • Problem with script that excludes large files using Duplicity and Amazon S3

    - by Jason
    I'm trying to write a backup script that will exclude files over a certain size. If I run the script, duplicity gives an error. However, if I copy and paste the same command generated by the script, everything works... Here is the script:

        #!/bin/bash
        # Export some ENV variables so you don't have to type anything
        export AWS_ACCESS_KEY_ID="accesskey"
        export AWS_SECRET_ACCESS_KEY="secretaccesskey"
        export PASSPHRASE="password"

        SOURCE=/home/
        DEST=s3+http://s3bucket
        GPG_KEY="gpgkey"

        # exclude files over 100MB
        exclude () {
            find /home/jason -size +100M \
            | while read FILE; do
                echo -n " --exclude "
                echo -n \'**${FILE##/*/}\' | sed 's/\ /\\ /g'  # Replace whitespace with "\ "
            done
        }

        echo "Using Command"
        echo "duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST"

        duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST

        # Reset the ENV variables.
        export AWS_ACCESS_KEY_ID=
        export AWS_SECRET_ACCESS_KEY=
        export PASSPHRASE=

    When the script is run I get the error:

        Command line error: Expected 2 args, got 6

    Where am I going wrong?
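
    The usual explanation for this symptom is that the quotes emitted by exclude() reach duplicity as literal characters and the unquoted command substitution gets word-split, so the argument count no longer matches. A commonly suggested rework (a sketch, untested here) collects the excludes in a bash array instead:

        # Build the exclude arguments in an array so no literal quote characters are needed
        EXCLUDES=()
        while IFS= read -r FILE; do
            EXCLUDES+=( --exclude "**${FILE##/*/}" )
        done < <(find /home/jason -size +100M)

        duplicity --encrypt-key="$GPG_KEY" --sign-key="$GPG_KEY" "${EXCLUDES[@]}" "$SOURCE" "$DEST"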

    Read the article

  • Backup script that excludes large files using Duplicity and Amazon S3

    - by Jason
    I'm trying to write a backup script that will exclude files over a certain size. My script generates the proper command, but when run within the script it outputs an error. However, if the same command is run manually, everything works...??? Here is the script, based on one easily found with Google:

        #!/bin/bash
        # Export some ENV variables so you don't have to type anything
        export AWS_ACCESS_KEY_ID="accesskey"
        export AWS_SECRET_ACCESS_KEY="secretaccesskey"
        export PASSPHRASE="password"

        SOURCE=/home/
        DEST=s3+http://s3bucket
        GPG_KEY="7743E14E"

        # exclude files over 100MB
        exclude () {
            find /home/jason -size +100M \
            | while read FILE; do
                echo -n " --exclude "
                echo -n \'**${FILE##/*/}\' | sed 's/\ /\\ /g'  # Replace whitespace with "\ "
            done
        }

        echo "Using Command"
        echo "duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST"

        duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST

        # Reset the ENV variables.
        export AWS_ACCESS_KEY_ID=
        export AWS_SECRET_ACCESS_KEY=
        export PASSPHRASE=

    If run, I receive the error:

        Command line error: Expected 2 args, got 6
        Enter 'duplicity --help' for help screen.

    Any help you could offer would be greatly appreciated.

    Read the article

  • Copy files between two Windows machines on separate domains

    - by Simon
    I need to copy several database backups between two computers. The source computer initiates the copy; it is a Windows 2000 PC and a member of domain1. The destination machine is running Windows 2000 Server and is a member of domain2. The machines are on separate networks physically connected via a firewall. The files are currently copied via SSH, with http://sshwindows.sourceforge.net/ installed on the destination machine. There is no need to encrypt the contents during the copy; however, the passwords should not be sent in the clear. I am looking for a way to copy the files without having to install a server on the destination. I specifically need help with how to set up the permissions and what ports would need to be opened on the firewall.

    Read the article

  • Multiple elements with the same name with SimpleXML and Java

    - by LouieGeetoo
    I'm trying to use SimpleXML to parse an XML document (an ItemLookupResponse for a book from the Amazon Product Advertising API) which contains the following element:

        <ItemAttributes>
            <Author>Shane Conder</Author>
            <Author>Lauren Darcey</Author>
            <Manufacturer>Pearson Educacion</Manufacturer>
            <ProductGroup>Book</ProductGroup>
            <Title>Android Wireless Application Development: Barnes & Noble Special Edition</Title>
        </ItemAttributes>

    My problem is that I don't know how to deal with the multiple possible Author elements. Here's what I have right now for the corresponding POJO (Plain Old Java Object), keeping in mind that it's not handling the case of multiple Authors:

        @Element
        public class ItemAttributes {
            @Element
            public String Author;
            @Element
            public String Manufacturer;
            @Element
            public String Title;
        }

    (I don't care about the ProductGroup, so it's not in the class -- I'm just setting SimpleXML's strict mode to off to allow for that.) I couldn't find an example in the documentation that corresponded with such a case. Using an ElementList with (inline=true) seemed along the right lines, but I didn't see how to do it for String (as opposed to a separate Author class, which I have no need for and don't see how it would even work). Here's a similar question and answer, but for PHP: "php - simpleXML how to access a specific element with the same name as others?" I don't know what the Java equivalent would be to the accepted answer. Thanks in advance.
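
    For context, the inline-list approach mentioned above is usually written roughly like this with the Simple framework (a sketch; field names are chosen here for illustration):

        import java.util.List;

        import org.simpleframework.xml.Element;
        import org.simpleframework.xml.ElementList;
        import org.simpleframework.xml.Root;

        @Root(strict = false)
        public class ItemAttributes {
            // Repeated <Author> elements are collected into a list of plain strings
            @ElementList(inline = true, entry = "Author")
            public List<String> authors;

            @Element(name = "Manufacturer")
            public String manufacturer;

            @Element(name = "Title")
            public String title;
        }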

    Read the article

  • Alias multiple DNS entries to one Amazon S3 Bucket

    - by Tristan
    I have a bucket on Amazon S3. Let's call it "webstatic.mydomain.com". I have a DNS alias set up for that bucket:

        webstatic.mydomain.com CNAME -> web-static.mydomain.com.s3.amazonaws.com

    This all works great. However, for some rather complicated reasons I now need webstatic.myOtherDomain.com to point to that same Amazon bucket, so:

        webstatic.myOtherDomain.com CNAME -> web-static.mydomain.com.s3.amazonaws.com

    fails, as the bucket is not called the same as the referring DNS name. Can anyone tell me how to have two different DNS entries pointing to the same Amazon bucket?

    Read the article

  • Checking out files simultaneously in Perforce

    - by sap
    In CVS we can't check out files simultaneously. Can anybody tell me whether in Perforce we can check out files simultaneously? And I heard about locks on the files. Before merging a file, do we need to have a lock on it in Perforce?
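
    For context, a minimal sketch of the two Perforce operations the question touches on (the file path is hypothetical):

        p4 edit src/foo.c    # open for edit; other users can open the same file at the same time
        p4 lock src/foo.c    # optionally take an exclusive lock so nobody else can submit the file first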

    Read the article

  • Low performance on HPC cluster (SGE) when running multiple jobs

    - by Yotam
    I know this is a long shot, but I'm clueless here. I'm running several computer simulations on a High Performance Computing (HPC) cluster running Oracle Grid Engine (SGE). A single job runs at a certain speed (roughly 80 steps per second); when I add jobs to the machine, at a certain threshold the speed is cut in half. On one machine (I don't know the CPU kind) the threshold is 11 jobs for 16 CPUs. On another one with the same number and kind of CPUs, the threshold is 8. I thought at first that this was a memory issue, but each job takes about 60MB - 100MB and I have 16GB of RAM on each of those machines. Did any of you encounter such a problem? Is there any way to analyze this? Thanks.

    Read the article

  • Multiple "My Documents" folders?

    - by Flasimbufasa
    I am having an issue where I have 2 folders called "My Documents". I recently edited my registry to make the "Documents" link in the Windows 7 start menu be a FOLDER link and not a LIBRARY link. Here is the registry key information for the Documents key:

        Key Name:  HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\FolderDescriptions\{7b0db17d-9cd2-4a93-9733-46cc89022e7c}
        Class Name:       <NO CLASS>
        Last Write Time:  3/2/2011 - 2:33 AM
        Value 0:  Name: Attributes         Type: REG_DWORD      Data: 0x1
        Value 1:  Name: Category           Type: REG_DWORD      Data: 0x4
        Value 2:  Name: Icon               Type: REG_EXPAND_SZ  Data: %SystemRoot%\system32\imageres.dll,-1002
        Value 3:  Name: LocalizedName      Type: REG_EXPAND_SZ  Data: @%SystemRoot%\system32\shell32.dll,-34575
        Value 4:  Name: Name               Type: REG_SZ         Data: Documents
        Value 5:  Name: PublishExpandPath  Type: REG_DWORD      Data: 0x1
        Value 6:  Name: PrecCreate         Type: REG_DWORD      Data: 0x1
        Value 7:  Name: RelativePath       Type: REG_SZ         Data: Documents
        Value 8:  Name: Roamable           Type: REG_DWORD      Data: 0x1

    Also, navigating through Computer to "C:\Users\Flasimbufasa\" only shows one folder called "Documents". However, whenever I navigate to the user profile from "Desktop\Flasimbufasa" I get 2 Documents folders. Any help?

    Read the article

  • Pin same app multiple times in Windows 7

    - by Mr. Shiny and New
    I use some programs with command line arguments and like to have shortcuts for launching those programs with those arguments. For example, I keep several Firefox profiles around and like to specify the profile name on the command line. Similarly I have several Eclipse shortcuts with a command line argument specifying the workspace to open. I would like to be able to pin these shortcuts to the start menu or taskbar in Windows 7. The problem I have is that once I've pinned one of these, no other shortcuts which launch the same exe can be launched. I'm also open to suggestions such as a suitable desktop gadget which can contain a bunch of arbitrary shortcuts, yet remain in a fixed position on my desktop somewhere, or some way of adding a secondary taskbar (this was possible in XP).
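
    For context, the kind of shortcut targets being described look something like the following (the paths, profile name and workspace name are illustrative):

        "C:\Program Files (x86)\Mozilla Firefox\firefox.exe" -P "work" -no-remote
        "C:\eclipse\eclipse.exe" -data "C:\workspaces\projectA"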

    Read the article

  • Ctrl-M chars when transferring files over SFTP

    - by eve
    Hi, I am sending files from a Windows system to a Unix SFTP server using the JSCAPE FTP client. However, I am experiencing the following issue: when uploading a text file from Windows to Unix, each line of the transferred text files contains Control-M characters. I did some searching and found that using the "ASCII" transfer mode should solve the issue, but the Ctrl-M is still appearing in the files. Can anyone shed some light on this issue? Thanks in advance.
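
    For reference, when the carriage returns have already landed on the Unix side, they are commonly stripped with something like the following (the file name is illustrative):

        dos2unix report.txt
        # or, where dos2unix is not installed:
        sed -i 's/\r$//' report.txt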

    Read the article

  • Which web browser supports running multiple sessions simultaneously?

    - by leonbnu
    Basically, I want to group my tabs by category and then split them into different sessions (windows). For example, I can have one window which contains all the tabs that are work related, and some other windows that are for news, everyday use, hobbies, entertainment, etc., so I can load these sessions/windows independently whenever I want. More importantly, I need the browser to be able to manage these sessions separately; for example, when I close one window (session), it should be able to autosave it without affecting any other sessions. I think this requirement is quite simple, but somehow neither Firefox (with the Session Manager addon) nor Opera supports this. So which browser actually supports this?

    Read the article
