Search Results

Search found 4461 results on 179 pages for 'duplicate removal'.

Page 23 of 179

  • Xcode - duplicate Target - new Target fails to build

    - by SirRatty
    Hi all, using Xcode 3.2.5 on 10.6.6 (10J521). I have an Xcode project containing 1 Target: "MyApp". It builds and runs successfully. As well as source and resource files, the Target contains a "Copy Files" build phase which copies Sparkle.framework in. The framework is in the same directory as the project. I want to duplicate this Target. Steps taken:
    1. Did "Clean all Targets".
    2. Right-clicked on the "MyApp" Target within Xcode, and then chose "Duplicate".
    3. Renamed the duplicated target to "MyAppTarget2".
    4. Selected "MyAppTarget2" as the Active Target from the popup menu in the top-left.
    5. Did "Build".
    The problem: error: Sparkle/Sparkle.h: No such file or directory. This is puzzling! Each build phase appears to have been replicated in the duplicated Target, including the "Copy Files" phase. Sparkle.framework exists at the path indicated by [Get Info on the Copy Phase item]. If I right-click on the Sparkle.framework file within the "Copy Files" build phase of the duplicated Target and select "Reveal in Finder", the correct Sparkle.framework file is shown. The required file exists at Sparkle.framework/Headers/Sparkle.h. If I switch back to the original "MyApp" target, it builds and runs successfully. Am I doing something obviously wrong here? Thanks.

    Read the article

  • Upgraded to Xcode 4 -- Endless stream of duplicate symbol errors causing build errors

    - by D-Nice
    Everything was working perfectly fine in Xcode 3 yesterday before I upgraded. So I completed the upgrade, restarted my computer, and opened my old project. I had to reconfigure a few settings like the header paths so that I could begin to compile. I'm using AdWhirl for ad mediation, and at this point my errors begin to read something like:
    duplicate symbol _OBJC_METACLASS_$_SBJSON in /Users/Admin/Desktop/TMapLiteAdwhirl/AdWhirl/MMSDK/libMMSDK.a(SBJSON.o) and /Users/Admin/Library/Developer/Xcode/DerivedData/TruxMapLite-bgpylibztethnlhkfkdumpvrjvgy/Build/Intermediates/TruxMapLite.build/Debug-iphoneos/TruxMapLite.build/Objects-normal/armv6/SBJSON.o for architecture armv6
    The library it's referring to is the SDK for one of the ad networks I'm including in AdWhirl. Both of the 'duplicate symbols' refer to the SAME FILE, but they use different paths. If I still had Xcode 3, I would simply try excluding these libraries from the build path, but I have no idea how that can be done in Xcode 4. I've tried everything, all the way down to deleting the library and all associated files from my project, but when I do this I simply get the same type of error for a different library in the AdWhirl directory. This is incredibly frustrating because before my upgrade everything was working smoothly and I was prepared to submit my binary. If anyone has any advice, I'd be more than happy to give it a try. Thanks!

    Read the article

  • Use MySQL trigger to update another table when duplicate key found

    - by Jason
    Been scratching my head on this one, hoping one of you kind people can direct me towards solving this problem. I have a MySQL table of customers. It contains a lot of data, but for the purpose of this question we only need to worry about 4 columns: 'ID', 'Firstname', 'Lastname', 'Postcode'. Problem is, the table contains a lot of duplicate customers. A new table is being created where each customer is unique; for us, a unique customer is based on 'Firstname', 'Lastname' and 'Postcode'. However (this is the important bit), each new "unique" customer record also needs to be matched back to the multiple entries for that customer in the original table. I believe the best way to do this is to have a third table that has 'NewUniqueID', 'OldCustomerID', so we can search this table for 'NewUniqueID' = '123' and it would return multiple 'OldCustomerID' values where appropriate. I am hoping to make this work using a trigger and the ON DUPLICATE KEY syntax. What would happen is as follows:
    1. A query is run taking the old customer table and inserting it into the new unique table (a standard INSERT ... SELECT query).
    2. On duplicate key, continue adding records, but add one entry to the third table noting the 'NewUniqueID' that duped along with the 'OldCustomerID' of the record we were trying to insert.
    Hope this makes sense; my apologies if it isn't clear. I welcome and appreciate any thoughts on this one! Many thanks, Jason
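
    One way to get the old-to-new mapping without a trigger at all (not from the original thread, just a sketch with hypothetical table and column names, assuming customers_unique has an auto-increment id and a unique key on firstname, lastname, postcode): insert the unique rows first, then rebuild the mapping with a join on that natural key.

        # A sketch with made-up names; run it against a test copy of the database first.
        mysql my_database -e "
            INSERT IGNORE INTO customers_unique (firstname, lastname, postcode)
            SELECT firstname, lastname, postcode FROM customers_old;"

        # Every old row is matched to its new unique row via the natural key,
        # giving the NewUniqueID -> OldCustomerID lookup table described above.
        mysql my_database -e "
            INSERT INTO customer_id_map (new_id, old_id)
            SELECT u.id, o.id
            FROM customers_old AS o
            JOIN customers_unique AS u
              ON  u.firstname = o.firstname
              AND u.lastname  = o.lastname
              AND u.postcode  = o.postcode;"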

    Read the article

  • Removing duplicate images (deduplication) - calculating "overlap" of images

    - by jotango
    Hello, I have a ton of product images on our file system. Our code removes 100% identical images (or does not allow them to be uploaded). However, our sellers often upload item pictures which are very similar, but not exactly identical. They could have more whitespace, lower quality (heavier compression), a different size, etc. Is there any way I can calculate the degree of overlap between two images, to flag some for deletion? Kind of like a Levenshtein distance between two images... Any pointers would be very cool. Thanks!
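
    One low-tech starting point (a sketch, not from the original thread): normalize both images to the same small greyscale size with ImageMagick, then compare them with a pixel-difference metric; near-duplicates score close to zero even when the originals differ in size or compression. Assumes ImageMagick is installed and that the threshold would be tuned on your own catalogue:

        #!/bin/bash
        # Usage: ./similar.sh a.jpg b.jpg   -> prints an RMSE distance (0 = identical)
        # Assumes ImageMagick's convert/compare tools are installed.
        a=$1; b=$2
        tmp=$(mktemp -d)

        # Cancel out size, colour and orientation differences before comparing.
        convert "$a" -auto-orient -resize '256x256!' -colorspace Gray "$tmp/a.png"
        convert "$b" -auto-orient -resize '256x256!' -colorspace Gray "$tmp/b.png"

        # compare writes the metric to stderr; "null:" discards the visual diff image.
        compare -metric RMSE "$tmp/a.png" "$tmp/b.png" null: 2>&1
        echo
        rm -rf "$tmp"

    Scores near zero flag likely duplicates for human review; for a large catalogue, a perceptual hash (pHash/dHash style) scales better than pairwise comparison of every image.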

    Read the article

  • Configuring RAID1 on HP Proliant Microserver N54L / Ubuntu 14.04.1 LTS [duplicate]

    - by Chris Beach
    This question already has an answer here:
    Cant find my harddrives in ubuntu installation? 2 answers
    I've bought an N54L and fitted two 3GB drives, keen to get set up with RAID1.
    BIOS: SATA controller mode set to "RAID".
    RAID Option ROM utility: both physical drives set up as one logical drive.
    When I came to install Ubuntu (14.04.1), both drives appeared during the setup process. I was only expecting to see the logical drive, although I'm a complete novice with RAID. I've read that the HP Proliant Microservers don't have "proper" RAID support and require some kind of driver to be installed. I've tried a few HP utilities from the following apt repo:
    deb http://downloads.linux.hp.com/SDR/repo/mcp wheezy/current non-free
    On installation, most say "server not supported". Would appreciate your advice.
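
    The onboard controller on these boxes is "fake RAID" (firmware plus an OS driver), so a common route under Linux (not necessarily the one the linked answer takes) is to set the SATA controller back to AHCI and let the kernel mirror the disks with mdadm. A rough sketch, assuming the two disks really are /dev/sda and /dev/sdb and that their contents can be destroyed:

        # From the installer's shell or a live session; double-check the device names first!
        sudo apt-get install mdadm

        # Build a RAID1 mirror from the two whole disks (this wipes any existing data).
        sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda /dev/sdb

        # Watch the initial sync, then point the Ubuntu installer at /dev/md0.
        cat /proc/mdstat

        # After installing: record the array so it assembles at every boot.
        sudo mdadm --detail --scan | sudo tee -a /etc/mdadm/mdadm.conf
        sudo update-initramfs -u

    The Ubuntu server installer's partitioner can also build the same mirror interactively, which avoids the manual mdadm step.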

    Read the article

  • Linux accessibility: Slow Keys causing duplicate key strokes

    - by skypanther
    I'm exploring the accessibility features within GNOME and having trouble with Slow Keys. My input is always doubled. Press a key briefly and I get nothing, as you'd expect. Press just a bit longer and whichever key I'm pressing is input twice: Hello becomes HHeelllloo. I'm running Debian Lenny 5.0.6, kernel 2.6.26-2-686, GNOME Desktop 2.22.3, running within a VirtualBox session. I did some googling and didn't find others having similar troubles. Maybe it's a VirtualBox thing? Any ideas how to fix this so I don't get the duplicates? It makes it impossible to log back in when the screen lock kicks in!

    Read the article

  • Duplicate of Certificate Templates does not appear in Certificate Template to Issue

    - by Sean
    I'm following what should be simple instructions to enable LDAP over SSL on our domain controller (instructions here). Duplicating the Kerberos certificate template is successful; however, when attempting to select "Certificate Template to Issue", the newly created template does not appear. What gives? A long time ago I actually completed this step on a now-decommissioned DC with no problem. Our environment is Windows Server 2008 Standard, and we have two domain controllers; only one has the certificate authority role. I look forward to any help here; thank you ahead of time.

    Read the article

  • Is there a diff utility that allows you to exclude columns?

    - by user39160
    For example, I have a text file where each line is a long string. I want to exclude 2 "segments" of this string, say columns 1-7 and 20-22, so the bottom 2 lines below would be a match:
    123456789012345678901234567890
    ------------------------------
    xxxxxxxAAAAAAAAAAAAxxxBBBBBBBB
    yyyyyyyAAAAAAAAAAAAyyyBBBBBBBB
    I know WinMerge has an "IgnoreColumns" plugin but I have never got this working. In this example I would rename it IgnoreColumns_1-7, 20-22.dll, select it in the plugins menu, and choose "Pre-Differ", but it has never worked. I am going to be comparing huge files that I don't want to modify. I'm not opposed to stream editing them within the comparison with sed or something like that, but I would prefer not to modify the actual files. I have not tried feeding sed output to diff yet, just because I was hoping for a more visual view of the data.
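
    On the command line, process substitution gets the same effect without modifying the files: strip the ignored columns on the fly and diff the stripped streams. A sketch (file1.txt and file2.txt are placeholders; cut keeps columns 8-19 and 23 to end, i.e. it drops 1-7 and 20-22):

        # bash process substitution; the files on disk are never touched
        diff <(cut -c8-19,23- file1.txt) <(cut -c8-19,23- file2.txt)

        # the sed variant mentioned above: blank out the two segments instead of removing them,
        # so the compared lines keep their original width
        diff <(sed -E 's/^.{7}/#######/; s/^(.{19}).{3}/\1###/' file1.txt) \
             <(sed -E 's/^.{7}/#######/; s/^(.{19}).{3}/\1###/' file2.txt)

    Tools like vimdiff generally accept the same process substitutions if a side-by-side, more visual view is wanted.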

    Read the article

  • How to prevent Outlook from receiving multiple copies of the same email

    - by martani_net
    This question might have been asked already, but it's different from this one. My boss has Outlook 2003. When he synchronizes his emails while connecting through the local server (using Exchange, I guess), he gets his emails normally. Once he is outside (not connected to our LAN), he gets each email duplicated 3 or 4 times. Has anyone experienced this before, and how can we fix it? Please, no links to FAQ pages. For info, we are using Kaspersky antivirus and Windows Server 2003, with Windows XP clients.
    [Update] Actually we have a bunch of 5 or 6 email accounts in Outlook, and only one of them receives duplicated copies of the same email; all the others are fine. Furthermore, all these accounts use the same service, Gmail for example.
    [Update 2] I just found out that Outlook is already configured to remove emails from the server; also, some emails exceed 5 copies. Thank you.

    Read the article

  • Removing Duplicate entries in grub2 Ubuntu 9.10

    - by Anders
    I have made a custom grub2 menu; however, both the default and the custom entries show together. So my grub menu looks like the list below (the bolded entries are my custom ones):
    ubuntu,linux ...
    ubuntu,linux recovery
    memtest
    memtest
    windows7
    windows7
    ubuntu linux
    ubuntu linux recover
    How do I get rid of the duplicates? I have tried apt-get remove, and I have tried marking and removing older kernels. I am a bit lost. Thanks in advance!
    This is how I made my custom grub, by the way: I copied and pasted the grub.cfg menuentry code into the custom file and just renamed the titles, so it would be perfectly clear for a user who doesn't want to know what version number it is.
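
    If the hand-made entries live in /etc/grub.d/40_custom (the usual place for them), the duplicates are most likely the stock entries still being generated by the other scripts in /etc/grub.d. A sketch of one way to suppress the stock ones, assuming that layout; keep a backup of /etc/grub.d and grub.cfg first:

        # assumes the custom entries live in /etc/grub.d/40_custom;
        # stop the stock generators from emitting menu entries
        sudo chmod -x /etc/grub.d/10_linux /etc/grub.d/20_memtest86+ /etc/grub.d/30_os-prober

        # rebuild grub.cfg from what is left (40_custom and friends)
        sudo update-grub

        # to undo later, restore the execute bits and run update-grub again:
        # sudo chmod +x /etc/grub.d/10_linux /etc/grub.d/20_memtest86+ /etc/grub.d/30_os-prober

    The trade-off is that new kernels will no longer appear in the menu automatically; the custom titles have to be kept up to date by hand.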

    Read the article

  • Point dns server to root dns servers [duplicate]

    - by Dhaksh
    This question already has an answer here:
    What is a glue record? 3 answers
    Why does DNS work the way it does? 4 answers
    I have set up a custom authoritative-only DNS server using BIND9, in a master-and-slave arrangement. Assume the DNS servers are:
    ns1.customdnsserver.com [192.168.91.129] ==> Master
    ns2.customdnsserver.com [192.168.91.130] ==> Slave
    Now I will host a few shared-hosting websites on my own web server, where I will link the above nameservers to my domains in shared hosting. My question is: how do I tell the root DNS servers about my own authoritative-only DNS server? So that when someone queries for the domain www.example.com, and that domain's website is hosted on my shared hosting, I want the root servers to point the query to my own DNS server so that www.example.com gets resolved to an IP address.
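
    You don't tell the root servers anything directly: the delegation (NS records, plus glue records when the nameservers sit inside the domain they serve) is registered through the domain's registrar, and the TLD servers then refer resolvers to ns1/ns2. As an illustrative check once that is in place, assuming dig is available:

        # follow the referral chain from the roots down to the authoritative servers
        dig +trace www.example.com

        # confirm the authoritative server answers for the zone without recursion
        dig @ns1.customdnsserver.com example.com SOA +norecurse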

    Read the article

  • Subnet mask and how to resolve IP range [duplicate]

    - by user2789433
    This question already has an answer here:
    How does IPv4 Subnetting Work? 5 answers
    If you click on this link (WHOIS query), you will see the results of a WHOIS query for a random IP address. On the line inetnum: "122.4.0.0/14", what does it mean for the prefix to be 14? I am using this Subnet Mask Cheatsheet as a reference. While resolving it, I get a very wide range, something like 122.7.0.0-122.4.0.0. I am not able to understand how the IP range is calculated from the subnet mask, and I am only able to find calculators online, not a method to solve it.
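
    Worked through for this prefix: /14 means the first 14 bits are network bits, so the mask is 255.252.0.0 (8 bits in the first octet, 6 in the second). With 6 mask bits in the second octet, networks step in blocks of 256 - 252 = 4 there, so 122.4.0.0/14 runs from 122.4.0.0 to 122.7.255.255 (second octet 4, 5, 6, 7). To check the arithmetic, assuming the ipcalc package is installed:

        # prints the netmask, network address, broadcast and host range for the prefix
        # (Debian/Ubuntu: apt-get install ipcalc)
        ipcalc 122.4.0.0/14

        # mask derivation by hand: 14 ones followed by 18 zeros
        # 11111111.11111100.00000000.00000000  =  255.252.0.0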

    Read the article

  • Squirrelmail receiving duplicate emails

    - by Austin
    A client of mine is experiencing issues with his email: it appears that whenever he receives email from a certain domain, it arrives in duplicate. Not only are they duplicates, but the duplicated items have a (+) sign next to them, which usually indicates an attachment. Could this be because of a forwarding issue? Here are the headers:
    Return-Path: <[email protected]>
    Received: from bigcat.centralmasswebdesign.com (root@localhost) by tarbellconstruction.com (8.13.1/8.13.1) with ESMTP id o4OFnO23003379 for <[email protected]>; Mon, 24 May 2010 11:49:24 -0400
    X-ClientAddr: 72.249.26.200
    Received: from mf3.spamfiltering.com (mf3.spamfiltering.com [72.249.26.200]) by bigcat.centralmasswebdesign.com (8.13.1/8.13.1) with ESMTP id o4OFnOjF005520 for <[email protected]>; Mon, 24 May 2010 11:49:24 -0400
    X-Envelope-From: [email protected]
    X-Envelope-To: [email protected]
    Received: From 67-132-16-226.dia.static.qwest.net (67.132.16.226) by mf3.spamfiltering.com (MAILFOUNDRY) id 6lzIAmdLEd+oFQAw for [email protected]; Mon, 24 May 2010 15:49:23 -0000 (GMT)
    Received: from mail pickup service by WMA2-EXCH1.NELCO-USA.net with Microsoft SMTPSVC; Mon, 24 May 2010 11:49:18 -0400
    Content-Transfer-Encoding: 7bit
    Importance: normal
    Priority: normal
    X-MimeOLE: Produced By Microsoft MimeOLE V6.00.3790.4325
    Content-Class: urn:content-classes:message
    MIME-Version: 1.0
    Content-Type: multipart/mixed; boundary="----_=_NextPart_001_01CAFB58.AAB268D0"
    Subject: weekly activity report for week ending May 22, 2010
    Date: Mon, 24 May 2010 11:49:16 -0400
    Message-ID: <15BCC4D99E8CBF48A2FA37A318CFF5C801209CCC@wma2-exch1.NELCO-USA.net>
    X-MS-Has-Attach: yes
    X-MS-TNEF-Correlator:
    Thread-Topic: weekly activity report for week ending May 22, 2010
    thread-index: Acr7WKpdCelRCiocT1eBY2YN5Ma8DA==
    From: "Mike LeBlanc" <[email protected]>
    To: "Keith Berube" <[email protected]>, "Ken Tarbell" <[email protected]>
    X-OriginalArrivalTime: 24 May 2010 15:49:18.0361 (UTC) FILETIME=[AB546890:01CAFB58]

    Read the article

  • Eliminating Duplicate Printers on Print Server 2008 R2

    - by user123247
    I added the print server role to our new 2008 R2 server and started adding printers to it that will be available to Remote Desktop sessions. When I added the Remote Desktop Services role, I specified printer redirection, thinking that would be a good thing. On the PCs where I am testing all this, I added the network printers locally so that they would have the printers available for local use. When I log on to the 2008 R2 server, I notice that the printers I added are there twice: once on the 2008 R2 server and an additional time redirected from my PC. Is there some way to eliminate this duplication without eliminating redirection?

    Read the article

  • Duplicate incoming TCP traffic on Debian Squeeze

    - by Erwan Queffélec
    I have to test a homebrew server that accepts a lot of incoming TCP traffic on a single port. The protocol is homebrew as well. For testing purposes, I'd like to send this traffic both:
    - to the production server (say, listening on port 12345)
    - to the test server (say, listening on port 23456)
    My client apps are "dumb": they never read data back, and the server never replies anyway; my server only accepts connections, does statistical computations, and stores/forwards/serves both raw and computed data. Actually, the client apps and hardware are so simple that there is no way I can tell the clients to send their stream to both servers... and using "fake" clients is not good enough. What could be the simplest solution? I can of course write an intermediary app that just copies incoming data and sends it on to the testing server, pretending to be the client. I have a single server running Squeeze and have total control over it. Thanks in advance for your replies.
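
    Because the stream is one-way, the intermediary you describe can be a one-liner rather than a full app: accept connections on a front port and tee the bytes to both servers. A sketch using ncat (from the nmap package), with made-up addresses and port; the clients would be pointed at port 13579 on this box:

        # 192.0.2.10/192.0.2.20 and port 13579 are placeholders.
        # For each incoming connection, copy the client's byte stream to both servers;
        # nothing is ever written back to the client, matching the one-way protocol.
        ncat -l -k 13579 --sh-exec \
          "bash -c 'tee >(ncat 192.0.2.10 12345 >/dev/null) | ncat 192.0.2.20 23456 >/dev/null'"

    Kernel-level duplication with iptables' TEE target is the other common route, but on a stock Squeeze kernel it means pulling in xtables-addons, and the copied packets still carry the production server's address, so the userspace relay is easier to reason about.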

    Read the article

  • Linux (DUP!) ping packets

    - by Darkmage
    I can't seem to figure out what is going on here. The Linux machine I am using is running as a VM on a Win7 machine, using VirtualBox running as a service. If I ping the Win7 host I get an OK result:
    root@Virtual-Box:/home/glennwiz# ping -c 100000 -s 10 -i 0.02 192.168.1.100
    PING 192.168.1.100 (192.168.1.100) 10(38) bytes of data.
    18 bytes from 192.168.1.100: icmp_seq=1 ttl=128 time=1.78 ms
    18 bytes from 192.168.1.100: icmp_seq=2 ttl=128 time=1.68 ms
    If I ping localhost I'm OK:
    root@Virtual-Box:/home/glennwiz# ping -c 100000 -s 10 -i 0.02 localhost
    PING localhost (127.0.0.1) 10(38) bytes of data.
    18 bytes from localhost (127.0.0.1): icmp_seq=1 ttl=64 time=0.255 ms
    18 bytes from localhost (127.0.0.1): icmp_seq=2 ttl=64 time=0.221 ms
    But if I ping the gateway I get DUP packets:
    root@Virtual-Box:/home/glennwiz# ping -c 100000 -s 10 -i 0.02 192.168.1.1
    PING 192.168.1.1 (192.168.1.1) 10(38) bytes of data.
    18 bytes from 192.168.1.1: icmp_seq=1 ttl=64 time=1.27 ms
    18 bytes from 192.168.1.1: icmp_seq=1 ttl=64 time=1.46 ms (DUP!)
    18 bytes from 192.168.1.1: icmp_seq=2 ttl=64 time=22.1 ms
    18 bytes from 192.168.1.1: icmp_seq=2 ttl=64 time=22.4 ms (DUP!)
    If I ping another machine on the same LAN I still get dups. Pinging remote hosts also gives (DUP!) results:
    root@Virtual-Box:/home/glennwiz# ping -c 100000 -s 10 -i 0.02 www.vg.no
    PING www.vg.no (195.88.55.16) 10(38) bytes of data.
    18 bytes from www.vg.no (195.88.55.16): icmp_seq=1 ttl=245 time=10.0 ms
    18 bytes from www.vg.no (195.88.55.16): icmp_seq=1 ttl=245 time=10.3 ms (DUP!)
    18 bytes from www.vg.no (195.88.55.16): icmp_seq=2 ttl=245 time=10.3 ms
    18 bytes from www.vg.no (195.88.55.16): icmp_seq=2 ttl=245 time=10.6 ms (DUP!)

    Read the article

  • Best way to Duplicate a Laptop's Hard Drive One-to-One

    - by Urda
    I have a Lenovo X61 Tablet computer with a plain SATA drive inside. I have Windows 7 and Ubuntu 9.10 dual-booting on the computer. I want to back up both of these OSes and their special partitions (Windows 7 has one, and of course the Linux swap). I want a one-to-one backup. All of my mission-critical data is already backed up, but I would like to get a snapshot and store it on a larger file server at home for quick recovery. What is the best approach to do this?
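
    One straightforward approach (a sketch, not necessarily the "best"): boot the laptop from a live USB so nothing on the disk is mounted, then image the whole drive, partition table and all, straight to the file server. The hostname and paths below are made up, and it assumes the internal disk is /dev/sda:

        # user@fileserver and the destination path are placeholders.
        # From a live session on the laptop: compress the raw disk and stream it to the server.
        sudo dd if=/dev/sda bs=4M | gzip -c | ssh user@fileserver 'cat > /srv/backups/x61-full.img.gz'

        # Restore later (onto a disk of the same size or larger), again from a live session.
        ssh user@fileserver 'cat /srv/backups/x61-full.img.gz' | gunzip -c | sudo dd of=/dev/sda bs=4M

    Clonezilla Live automates the same idea and skips unused blocks on filesystems it understands, which keeps the image considerably smaller.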

    Read the article

  • Rewrite rule Mod_Proxy truncate file name [duplicate]

    - by Valerio Cicero
    This question already has an answer here:
    Redirect, Change URLs or Redirect HTTP to HTTPS in Apache - Everything You Ever Wanted to Know about Mod_Rewrite Rules but Were Afraid to Ask 5 answers
    I searched online for the solution, but found nothing :(. I wrote this simple rule:
    RewriteRule ^(.*)$ http://www.mysite.com/$1 [P,NE,QSA,L]
    On mysite.it I have an .htaccess with this rule and it's OK, but if I have a link to "http://www.mysite.it/public/file name.html", the server points to "http://www.mysite.it/public/file". I have tried many solutions but I can't solve it. I tried this, and many shades of it:
    RewriteRule ^(.*)(%20)(.*)$ "http://www.mysite.com/$1$3" [P,NE,QSA,L]
    Thanks!

    Read the article

  • How to Track Duplicate Downloads

    - by user1089173
    I have a product and we are running a campaign for it that allows users to download a trial version. My question is: is there any way to detect multiple downloads from the same computer made from behind a proxy? For example, if a person uses a proxy server, changes his/her IP and downloads our trial version multiple times from different IPs, is there a way to detect that it is the same person who's downloading the software? We only require a name and email address to be entered before downloading our software, and those can easily be made up. We cannot add any other fields to the form (like a phone number).

    Read the article

  • What are some good Server Name Themes/Categories [duplicate]

    - by Arian
    This question already has an answer here:
    What are the most manageable and interesting server naming schemes being used? [closed] 17 answers
    I need to create a naming scheme for my servers, but I am having a hard time coming up with a good category list to go by. I want something with an abundance of names to use, so that as I scale my server count I won't run out. Some that I have heard being used are:
    Greek philosophers (Plato)
    planet names (Saturn, Mercury, Venus, Mars)
    Mario characters (Mario, Luigi, Yoshi, Toad)
    I feel like the above categories are kind of limited. What are some good naming schemes that you use?

    Read the article

  • One bigger Virtual Machine distributed across many OpenStack nodes [duplicate]

    - by flyer
    This question already has an answer here:
    Can a virtualized machine have the CPU and RAM resources of multiple underlying physical machines? 2 answers
    I just set up virtual machines on one piece of hardware with Vagrant. I want to use Puppet to configure them and next try to set up OpenStack. I am not sure I understand how this should look in the end. Is it possible to have the architecture below with OpenStack, where I would run one virtual machine with Linux?
    -------------------------------
    |          VM with OS         |
    -------------------------------
    |  NOVA  |  NOVA  |  NOVA    |
    -------------------------------
    |          OpenStack          |
    -------------------------------
    |  Node  |  Node  |  Node    |
    -------------------------------
    More details: in my environment the nodes are just virtual machines, but my question concerns separate hardware nodes. If we imagine these nodes (novas) are placed on separate machines (e.g. each has 4 cores), can I run one virtual machine across many OpenStack nodes? Is it possible to aggregate the computation power of OpenStack in one virtual distributed operating system?

    Read the article

  • Duplicate pseudo terminals in linux

    - by bobtheowl2
    On a Red Hat box [Red Hat Enterprise Linux AS release 4 (Nahant Update 3)] we frequently notice two people being assigned to the same pseudo terminal. For example:
    $ who am i
    user1 pts/4 Dec 29 08:38 (localhost:13.0)
    user2 pts/4 Dec 29 09:43 (199.xxx.xxx.xxx)
    $ who -m
    user1 pts/4 Dec 29 08:38 (localhost:13.0)
    user2 pts/4 Dec 29 09:43 (199.xxx.xxx.xxx)
    $ whoami
    user2
    This causes problems in a script because "who am i" returns two rows. I know there are differences between the commands, and obviously we can change the script to fix the problem. But it still bothers me that two users are being returned with the same terminal. We suspect it may be related to dead sessions. Can anyone explain why two (non-unique) pts numbers are being assigned, and/or how that can be prevented in the future?
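
    The usual culprit is a stale utmp record left behind by a session that died without cleaning up, so the pty name gets reused while the old entry is still listed. As a script-side workaround (just a sketch), filter the utmp lookup down to the current user and terminal instead of relying on "who am i" returning a single row:

        # print only the utmp line that matches this shell's own user and tty
        who | awk -v tty="$(tty | sed 's|^/dev/||')" -v user="$(id -un)" \
            '$1 == user && $2 == tty'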

    Read the article

  • Find out ESXi host from VM? [duplicate]

    - by Kyle Brandt
    This question already has an answer here:
    Determining the name of the VMware host of a VM guest - from the guest 3 answers
    I have a monitor that runs on all guests, Windows and Linux. I'd like this monitor to report back the parent host of the guest. How can I find out the VM host from within an ESXi VM? Ideally I'm looking for a universal way to do this from both Windows and Linux, but separate paths are okay. Using something from VMware Tools would be fine, but I'm not currently using anything that calls the vSphere API.

    Read the article

  • Apache server doesn't create directory or file under www-data user [duplicate]

    - by Harkonnen
    This question already has an answer here:
    What permissions should my website files/folders have on a Linux webserver? 4 answers
    Very new to Apache here. I installed Apache 2.4 on my Arch server, where I installed newznab (a newsgroups indexer). I have noticed that all the files newznab needs to create are created under my login user, and not Apache's default user (www-data). I read here that it's bad security practice to allow www-data to write files, and I agree. But as an Apache newbie, I would like to know where (in httpd.conf, I suppose?) the user allowed to write files can be configured, because I want another account to be allowed to write files instead of my main account.
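
    The account Apache itself runs as is set by the User and Group directives in httpd.conf (scripts executed through the web server inherit it); files created by newznab's command-line jobs, on the other hand, belong to whoever launches those jobs. A sketch of checking the former and handing the writable data to a dedicated account instead of your login user; the path and user name are made up:

        # which account do the Apache processes run as? (Arch keeps the config here)
        grep -E '^[[:space:]]*(User|Group)[[:space:]]' /etc/httpd/conf/httpd.conf

        # give a dedicated system user ownership of the directories newznab writes to,
        # and run the indexer's cron/CLI jobs as that user rather than your own account
        # (/path/to/newznab is a placeholder for the actual install location)
        sudo useradd -r -s /usr/bin/nologin newznab
        sudo chown -R newznab: /path/to/newznab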

    Read the article

  • Check if all files in a directory exists elsewhere

    - by aioobe
    I'm about to remove an old backup directory, but before doing so I'd like to make sure that all its files exist in a newer directory. Is there a tool for this? Or am I best off doing this "manually" using find, md5sum, sorting, comparing, etc.? Clarification: if I have the following directory listings:
    /path/to/old_backup/dir1/fileA
    /path/to/old_backup/dir1/fileB
    /path/to/old_backup/dir2/fileC
    and
    /path/to/new_backup/dir1/fileA
    /path/to/new_backup/dir2/fileB
    /path/to/new_backup/dir2/fileD
    then fileA and fileB exist in new_backup (fileA in its original directory, and fileB has moved from dir1 to dir2). fileC, on the other hand, is missing in new_backup, and fileD has been created. In this situation I'd like the output to be something like:
    fileC exists in old_backup, but not in new_backup.
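
    If doing it by hand, hashing by content is enough for the behaviour described above, since a hash match finds a file wherever it has moved to. A rough sketch along those lines, using the paths from the question:

        # collect the content hashes of everything in the new backup
        find /path/to/new_backup -type f -exec md5sum {} + | awk '{print $1}' | sort -u > /tmp/new.md5

        # report every old file whose content does not appear anywhere in the new backup
        find /path/to/old_backup -type f -exec md5sum {} + | \
        while read -r sum path; do
            grep -qx "$sum" /tmp/new.md5 || echo "$path exists in old_backup, but not in new_backup"
        done

    The inner grep makes this quadratic in the worst case, which is fine for a one-off check; for huge trees, sort both hash lists and compare them with comm instead.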

    Read the article
