Search Results

Search found 4640 results on 186 pages for 'unique'.


  • How to de-dupe identical photos that have a slightly different file size?

    - by GJ.
    I imported many photos using the new "camera import" feature of Dropbox. Many of them were duplicates of photos previously imported by copying directly from the camera. Strangely, the Dropbox import appears to slightly reduce the file size. Comparing the two files using pdiff returns "Images are binary identical", but tools such as fdupes, or even Picasa's "show duplicate files" feature, consider them unique. What can be the cause of this file size change? Is there any way to undo it? Most importantly: how can I de-dupe efficiently without relying on file size comparison? (Running a pdiff comparison over all photo pairs in my library is obviously impractical...) A solution for either OS X or Windows would do.
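
    A sketch of one way to de-dupe by content rather than by file bytes, assuming ImageMagick is installed (it runs on both OS X and Windows): identify -format '%#' prints a signature computed from the decoded pixel data, so metadata or container differences don't affect it:

        # Group photos by pixel-content signature; identical signatures = duplicate images.
        for f in *.jpg; do
            printf '%s  %s\n' "$(identify -format '%#' "$f")" "$f"
        done | sort | awk '{ if ($1 == prev) print "duplicate:", $2; prev = $1 }'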


  • Setting up 2 external monitors on a laptop with VGA splitter

    - by mike
    I have a laptop with a graphics card that supports 2 displays. I would like to know the easiest way to set it up so I can close the laptop lid and use 2 external monitors as unique (extended, not mirrored) displays. I use it primarily for office applications and video, and want a quality, clear picture. The laptop has 1 VGA port, and I have 2 24" 1920x1200 monitors that have both VGA and DVI ports. So a few questions:
    1. Can I just use a VGA splitter? (I've seen mixed feedback on this.)
    2. Would a VGA to 2x DVI splitter give better picture quality? (if such a thing exists)
    3. Would I be better off upgrading to a laptop with 2 digital ports? (I mostly see VGA plus HDMI, though.)
    Specs:
    Model: Toshiba Satellite C675-S106 (Windows 7)
    Graphics card: Intel HD Graphics 3000 (supports 2 displays)
    Processor: Intel Core i3-2350M


  • How can I pass an environment variable through an ssh command?

    - by Ross Rogers
    How can I pass a value into an ssh command, such that the environment started on the host machine has a certain environment variable set to my choosing? EDIT: The goal is to pass the current KDE desktop (from dcop kwin KWinInterface currentDesktop) to the new shell, so that I can pass back an NFS location to my JEdit instance on the original server, which is unique for each KDE desktop (using a mechanism like emacsserver/emacsclient). The reason multiple ssh instances can be in flight at one time is that when I'm setting up my environment, I open a bunch of different ssh instances to different machines.
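
    Two hedged sketches, assuming OpenSSH; KDE_DESKTOP is a made-up variable name. The first requires the server's sshd_config to whitelist the variable with AcceptEnv; the second needs no server-side change:

        # Option 1: forward the variable (server must have "AcceptEnv KDE_DESKTOP" in sshd_config)
        export KDE_DESKTOP=$(dcop kwin KWinInterface currentDesktop)
        ssh -o SendEnv=KDE_DESKTOP user@host

        # Option 2: set the variable as part of the remote command itself
        ssh -t user@host "KDE_DESKTOP=$KDE_DESKTOP bash -l"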


  • Double Click to open Office docs is slow, File -> Open is fast.

    - by Keith
    I have 2 unique networks. They both share a similar architecture:
    - Windows 2003 SBS SP2
    - Running Symantec Endpoint
    - Running Symantec Information Foundation
    - Shared drives off a data partition
    - Clients running Office 2003 or 2007
    - Connecting to the file server through mapped drives
    When users open a file from their local PC by double-clicking, it takes 30-60 seconds to open. When they use File -> Open, those same documents open almost immediately. So far I've tried the following:
    - CCleaner to clear outdated mapped drives out of the registry
    - Disabled "using DDE"
    - Disabled A/V
    - Reboot
    Any ideas beyond that? Figured this question belongs here instead of SU since it's the same issue on different networks.


  • root directory - www or public_html

    - by Phil Jackson
    Is the root directory where all the files are kept (as seen when accessing directly over FTP) always "www" or "public_html", depending on the OS? Or is it possible to rename this folder? And if so, what would be unique about this folder that would let me identify it? For example, currently I just wrote this:

        my $root;
        my $ftp = Net::FTP->new($DB_ftpserver, Debug => 0)
            or die "Cannot connect to some.host.name: $@";
        $ftp->login($DB_ftpuser, $DB_ftppass)
            or die "Cannot login ", $ftp->message;
        my @list = $ftp->dir;
        if( scalar @list != 0 ) {
            foreach( @list ) {
                if( $_ =~ m/www$/ ) {
                    $root = "www";
                    last;
                } elsif( $_ =~ m/public_html$/ ) {
                    $root = "public_html";
                    last;
                }
            }
        }

    but this would not work if the folder has a different name. Any help much appreciated.


  • rsnapshot intervals in configuration file...

    - by Patrick
    A simple question about rsnapshot. In order to perform daily backups I'm going to add lines to cron on my Ubuntu box. Why, then, do I also have these lines in rsnapshot.conf?

        #########################################
        # BACKUP INTERVALS                      #
        # Must be unique and in ascending order #
        # i.e. hourly, daily, weekly, etc.      #
        #########################################
        interval        hourly  6
        interval        daily   7
        interval        weekly  4
        #interval       monthly 3

    If I use cron, should I disable them? Thanks.
    PS. I've just realized that my crontab still has both "hourly" and "daily". Should I then keep only the one I use in the crontab? And what's the point of specifying hourly if it is already specified in cron? I'm a bit confused.

        # crontab -e
        0 */4 * * * /usr/local/bin/rsnapshot hourly
        30 23 * * * /usr/local/bin/rsnapshot daily
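
    For what it's worth, a sketch of how the two halves are usually read (hedged, but consistent with rsnapshot's documented behavior): the interval lines control only retention counts, cron controls only timing, so both are needed and should be kept in sync:

        # rsnapshot.conf: "interval hourly 6" = KEEP the 6 newest hourly snapshots;
        # it schedules nothing by itself.
        # crontab: decides WHEN each interval actually runs, e.g.:
        #   0 */4 * * *  /usr/local/bin/rsnapshot hourly   # 6 x 4h = one day of history
        #   30 23 * * *  /usr/local/bin/rsnapshot daily    # 7 dailies = one week
        # Only comment out an interval line if nothing in cron invokes that interval.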


  • Preventing SSH RSA host key warnings for change of key vs IP address

    - by Adam M-W
    I have a network with DHCP enabled, and also a computer that dual-boots operating systems and has different SSH host keys on each (and yes, I would like to keep different keys on each rather than copying the same identity/private key to both). Because the MAC address is the same, the IP address does not change between operating systems, so when connecting via ssh, even when using the hostname via DNS/mDNS rather than the IP address, I get the warning:

        Warning: the RSA host key for 'hostname' differs from the key for the IP address '192.168.1.172'
        Offending key for IP in /Users/user/.ssh/known_hosts:37
        Matching host key in /Users/user/.ssh/known_hosts:38
        Are you sure you want to continue connecting (yes/no)?

    How can I suppress the warning when the host key differs from the key recorded for that IP address, but retain the ability to check host keys per hostname? (Each OS has a unique hostname.)
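
    A minimal sketch of one common approach, assuming OpenSSH: CheckHostIP no tells ssh to stop cross-checking the host key against the known_hosts entry for the IP address, while per-hostname key verification stays fully in force. The hostnames here are placeholders:

        # ~/.ssh/config - scope it to the dual-boot machine's hostnames only
        Host hostname-os1 hostname-os2
            CheckHostIP no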


  • What is a good partitioning design/scheme for a multi-boot *nix system?

    - by static
    I'm planning to install Debian on my server. I would like to design the partitioning scheme in such a way that I could later install one or more other *nix distributions alongside it. Reading many articles, I think this scheme could be a good starting point for multi-boot:

        /grub
        /swap
        LVM VG1 (for OS1) -> /boot (LV1)
                             /     (LV2)
                             /tmp  (LV3)
                             /var
                             /var/log
                             /home
        LVM VG2 (for OS2) -> /boot
                             /
                             /tmp
                             /var
                             /var/log
                             /home
        ... (other distros)
        LVM VG0 (for data) -> /data (LV1)

    But I'm a little confused now: what should the labels for these partitions be (unique or not), and what should the mount points look like (/home (OS1) mounted to /home, as well as /home (OS2))?
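
    On the label question, a sketch of one convention (names and sizes are made up): prefix every LV name and filesystem label with the OS it belongs to, so labels stay unique system-wide and each distro's fstab can mount by label without ambiguity:

        # OS1's volumes inside VG1, labeled with an os1- prefix
        lvcreate -L 20G -n os1_root vg1
        mkfs.ext4 -L os1-root /dev/vg1/os1_root
        lvcreate -L 10G -n os1_home vg1
        mkfs.ext4 -L os1-home /dev/vg1/os1_home
        # OS1's /etc/fstab then refers only to its own labels:
        #   LABEL=os1-root  /      ext4  defaults  0 1
        #   LABEL=os1-home  /home  ext4  defaults  0 2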


  • VMWare - Windows XP guest licensing

    - by jcooper
    Hi. If I have a VMware ESXi server with 4 Windows XP guests running on it, I understand that I need a separate license for each guest. Is there a way to simplify license compliance on these VMs? For example, I want to create a master VM image and boot it 4 times. By default those 4 VMs will have the same Windows activation key installed. Is there a simple solution for this? Or is it OK to do what I've described above, provided I have 4 unique license keys on hand in case of an audit? Thanks!


  • Multiple users writing to one Samba mount point in OSX

    - by Sam
    I have an OS X box containing a script which writes a unique file to a Samba share. The first part of the script mounts the share. On the machine are 2 users, UserA and UserB. Each needs to be able to run this script at any given time, however only the user who mounted the share is able to write to it. I really need both users to have rwx access. Here is what I have tried:
    - Mounting, then chmod'ing the mountpoint (no effect - overruled by the Samba server?)
    - chmod'ing the mountpoint, then mounting (same as above)
    - sudo mount_smbfs
    Both users have admin privileges. Ideally a solution would be executable by one of the users (contained in the script) and not rely on mounting at machine boot time. Any ideas appreciated, thanks!
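
    A sketch of one possibility, assuming the OS X mount_smbfs supports the -f (file mode) and -d (directory mode) flags documented in its man page; mounting with permissive modes lets both users write regardless of who ran the mount. Share and mountpoint names are hypothetical:

        # run by whichever user executes the script
        mkdir -p /Volumes/share
        mount_smbfs -f 0777 -d 0777 //user@server/share /Volumes/share
        # (0775 would do if both users share a common group)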


  • Windows Proxy Server advice

    - by Scott
    I have a webserver that currently has about 10 IP addresses. I have various clients that require a proxy server to route their internal traffic through. The load is not that great, so I'd like to have this ONE server act as a proxy server for 10 different clients, each client having their own unique IP on the server. The hardware is already set up, but I'm wondering what software solutions you recommend? I've looked at WinGate, the Squid proxy, etc., but am pretty green with this. Maybe there's even a way to have Windows do this natively? I'm running Windows Server 2008, 32-bit.


  • Hourly CRON task running more frequently than once an hour

    - by Justin
    I have a cron task that calls a special PHP script via wget. Here is the crontab entry:

        0 * * * * wget http://www....

    It will work perfectly for several days, running on the hour. However, after a few days the cron job will start to be called several times an hour. I have never seen cron drift like this, so I imagine it can't really be a cron issue. However, the logs of the script that is called clearly show it running several times an hour. Server details:
    - Ubuntu Lucid
    - Apache
    - MySQL
    - PHP 5
    - Time is showing correct at the command line
    - Server is set up to sync with an NTP server
    In order for the script to run, it must be passed a unique 50-character hash key in the URL, so this script isn't being called from any other source accidentally. What might cause cron to drift like this?
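
    Not a root cause, but a hedged diagnostic sketch: log a timestamp on every invocation so the logs show exactly when cron fires, and guard against overlapping runs with flock (util-linux). The URL is a placeholder for the real hash-keyed one:

        0 * * * * date >> /var/log/hourly-job.log && flock -n /tmp/hourly-job.lock wget -q -O /dev/null "http://www.example.invalid/script?key=HASH"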


  • HTTPS version of page throws 404, regular HTTP appears fine?

    - by Ryan
    I'm having a strange issue with a website in IIS on Windows Server 2003. It has a valid wildcard certificate on it, however when I request a page over HTTPS I get a 404 Not Found; over plain HTTP it shows up fine. Also, if I go to the domain root of the site using HTTP, the homepage shows up, but with HTTPS it REDIRECTS ME to a totally different website installed on the same IIS server. I am quite confused. I tried giving each site a unique IP address but it didn't change anything; I also tried changing the SSL ports, no luck. This IIS server is also set up to run PHP. What could I check to resolve this?


  • Microsoft CALs for Domain Controller

    - by Damo
    I am designing a network and I've come to the point of specifying the number of CALs required for it. Microsoft licensing has always confused me; it's just never entirely clear. I plan to have one Windows Server 2008 Standard domain controller, another 2008 server (not a domain controller), and 200 Windows 7 devices connected to the domain for domain services. The 200 Windows 7 devices will all authenticate to the domain controller with the same domain account (this is a special type of network, not a user workstation network). Therefore, do I need to purchase 200 CALs for the 200 devices, or can I purchase, say, 10 user CALs, since the number of unique user accounts is very low? Many thanks for looking.


  • Broadband Traffic Question

    - by rutherford
    I have a broadband ADSL line with plus.net in the UK. Having checked the modem, there is no firewall or any weird features enabled. But since I arrived at the apartment (the broadband was already installed), I cannot log into Twitter nor update any of my WordPress blogs (I can browse them and log in, but cannot save any edits or new posts). It only seems to affect these two sites, each in its own way. If I take the netbook I use in this place out to, say, a McDonald's or some other wifi access point, then these sites work fine again. Anyone know what could possibly be preventing access to the pages in question? The only thing common to these pages is the POST they are expecting. But POST form submission works fine on other sites...


  • Multiple SSH private keys for the same host

    - by Sencha
    How can I store 2 different private SSH keys for the same host? I have tried 2 entries in /etc/ssh/ssh_config for the same host with different keys, and I've also tried putting both keys in the same file and referencing it from one Host setting; neither works. More detail: I'm running Ubuntu Server (12.04) and I want to connect to GitHub via SSH to download the latest source for my projects. There are multiple projects running on the same server and each project has a GitHub repo with its own unique deployment key pair. So the host is always the same (github.com) but the keys need to differ depending on which repo I'm using. Different /etc/ssh/ssh_config versions I have tried:

        Host github.com
            IdentityFile /etc/ssh/my_project_1_github_deploy_key
            StrictHostKeyChecking no

        Host github.com
            IdentityFile /etc/ssh/my_project_2_github_deploy_key
            StrictHostKeyChecking no

    and this, with both keys in the same file:

        Host github.com
            IdentityFile /etc/ssh/my_project_github_deploy_keys
            StrictHostKeyChecking no

    I've had no luck with either. Any help would be greatly appreciated!
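
    A sketch of the usual workaround (the alias names are made up): give each repository its own Host alias that points at github.com, then use the alias in the git remote URL so ssh picks the matching key:

        # /etc/ssh/ssh_config (or ~/.ssh/config)
        Host github-project1
            HostName github.com
            IdentityFile /etc/ssh/my_project_1_github_deploy_key

        Host github-project2
            HostName github.com
            IdentityFile /etc/ssh/my_project_2_github_deploy_key

        # then, per project:
        #   git remote set-url origin git@github-project1:user/repo.git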


  • Managing service passwords with Puppet

    - by Jeff Ferland
    I'm setting up my Bacula configuration in Puppet. One thing I want to do is ensure that each password field is different. My current thought is to hash the hostname with a secret value, which would ensure each file daemon has a unique password, and that password can be written to both the director configuration and the file server. I definitely don't want to use one universal password, as that would permit anybody who compromises one machine to get access to any machine through Bacula. Is there another way to do this, other than using a hash function to generate the passwords? Clarification: this is NOT about user accounts for services. This is about the authentication tokens (to use another term) in the client/server files. Example snippet:

        Director {                # define myself
          Name = <%= hostname %>-dir
          QueryFile = "/etc/bacula/scripts/query.sql"
          WorkingDirectory = "/var/lib/bacula"
          PidDirectory = "/var/run/bacula"
          Maximum Concurrent Jobs = 3
          Password = "<%= somePasswordFunction %>"    # Console password
          Messages = Daemon
        }
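
    A sketch of the hash idea itself, with everything here hypothetical (helper name, secret handling): derive each daemon's password deterministically from hostname plus a site secret kept only on the puppetmaster, so the director and file daemon configs can both be templated from the same value:

        # bacula_password.sh <hostname> - hypothetical helper run on the puppetmaster
        SECRET='long-random-site-secret'    # assumption: never shipped to clients
        printf '%s' "${1}${SECRET}" | sha256sum | awk '{print $1}'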


  • @OneToMany association joining on the wrong field

    - by april26
    I have 2 tables: devices, which contains a list of devices, and dev_tags, which contains a list of asset tags for those devices. The tables join on dev_serial_num, which is the primary key of neither table. The devices are unique on their ip_address field and have a primary key identified by dev_id. Devices "age out" after 2 weeks, therefore the same piece of hardware can show up more than once in devices. I mention that to explain why there is a OneToMany relationship between dev_tags and devices where it seems this should be a OneToOne relationship. So I have my 2 entities:

        @Entity
        @Table(name = "dev_tags")
        public class DevTags implements Serializable {
            private Integer tagId;
            private String devTagId;
            private String devSerialNum;
            private List<Devices> devices;

            @Id
            @GeneratedValue
            @Column(name = "tag_id")
            public Integer getTagId() { return tagId; }
            public void setTagId(Integer tagId) { this.tagId = tagId; }

            @Column(name = "dev_tag_id")
            public String getDevTagId() { return devTagId; }
            public void setDevTagId(String devTagId) { this.devTagId = devTagId; }

            @Column(name = "dev_serial_num")
            public String getDevSerialNum() { return devSerialNum; }
            public void setDevSerialNum(String devSerialNum) { this.devSerialNum = devSerialNum; }

            @OneToMany(mappedBy = "devSerialNum")
            public List<Devices> getDevices() { return devices; }
            public void setDevices(List<Devices> devices) { this.devices = devices; }
        }

    and this one:

        public class Devices implements java.io.Serializable {
            private Integer devId;
            private Integer officeId;
            private String devSerialNum;
            private String devPlatform;
            private String devName;
            private OfficeView officeView;
            private DevTags devTag;

            public Devices() { }

            @Id
            @GeneratedValue(strategy = IDENTITY)
            @Column(name = "dev_id", unique = true, nullable = false)
            public Integer getDevId() { return this.devId; }
            public void setDevId(Integer devId) { this.devId = devId; }

            @Column(name = "office_id", nullable = false, insertable = false, updatable = false)
            public Integer getOfficeId() { return this.officeId; }
            public void setOfficeId(Integer officeId) { this.officeId = officeId; }

            @Column(name = "dev_serial_num", nullable = false, length = 64, insertable = false, updatable = false)
            @NotNull
            @Length(max = 64)
            public String getDevSerialNum() { return this.devSerialNum; }
            public void setDevSerialNum(String devSerialNum) { this.devSerialNum = devSerialNum; }

            @Column(name = "dev_platform", nullable = false, length = 64)
            @NotNull
            @Length(max = 64)
            public String getDevPlatform() { return this.devPlatform; }
            public void setDevPlatform(String devPlatform) { this.devPlatform = devPlatform; }

            @Column(name = "dev_name")
            public String getDevName() { return devName; }
            public void setDevName(String devName) { this.devName = devName; }

            @ManyToOne(fetch = FetchType.LAZY)
            @JoinColumn(name = "office_id")
            public OfficeView getOfficeView() { return officeView; }
            public void setOfficeView(OfficeView officeView) { this.officeView = officeView; }

            @ManyToOne()
            @JoinColumn(name = "dev_serial_num")
            public DevTags getDevTag() { return devTag; }
            public void setDevTag(DevTags devTag) { this.devTag = devTag; }
        }

    I messed around a lot with @JoinColumn(name=) and the mappedBy attribute of @OneToMany and I just cannot get this right. I finally got the darn thing to compile, but the query is still trying to join devices.dev_serial_num to dev_tags.tag_id, the @Id for that entity. Here is the transcript from the console:

        13:12:16,970 INFO  [STDOUT] Hibernate: select devices0_.office_id as office5_2_, devices0_.dev_id as dev1_2_, devices0_.dev_id as dev1_156_1_, devices0_.dev_name as dev2_156_1_, devices0_.dev_platform as dev3_156_1_, devices0_.dev_serial_num as dev4_156_1_, devices0_.office_id as office5_156_1_, devtags1_.tag_id as tag1_157_0_, devtags1_.comment as comment157_0_, devtags1_.dev_serial_num as dev3_157_0_, devtags1_.dev_tag_id as dev4_157_0_ from ond.devices devices0_ left outer join ond.dev_tags devtags1_ on devices0_.dev_serial_num=devtags1_.tag_id where devices0_.office_id=?
        13:12:16,970 INFO  [IntegerType] could not read column value from result set: dev4_156_1_; Invalid value for getInt() - 'FDO1129Y2U4'
        13:12:16,970 WARN  [JDBCExceptionReporter] SQL Error: 0, SQLState: S1009
        13:12:16,970 ERROR [JDBCExceptionReporter] Invalid value for getInt() - 'FDO1129Y2U4'

    That value for getInt(), 'FDO1129Y2U4', is a serial number, definitely not an int! What am I missing/misunderstanding here? Can I join 2 tables on any fields I want, or does at least one have to be a primary key?


  • How to change subversion working copy UUID?

    - by Ioan
    I've recently updated Subversion repositories from an old 1.2.3 version to 1.6.0 via svnadmin dump/load. The old repositories all used the same UUID (the repositories were created by copying a template repository). I've changed the UUID on a couple of the new repositories via svnadmin setuuid so they are unique. I can't just relocate my existing working copies of those repositories because the UUIDs are different. I know about exporting the working copy and checking out from the new repository, but I was wondering whether there is a way to just change the UUID of the working copy in place, like what svnadmin setuuid does for repositories.
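
    A heavily hedged sketch of the in-place hack sometimes suggested for pre-1.7 working copies, where the repository UUID is recorded in every .svn/entries file; this is not a supported svn operation, so back up the working copy first. The UUIDs below are placeholders:

        find /path/to/wc -path '*/.svn/entries' \
            -exec sed -i 's/OLD-UUID-HERE/NEW-UUID-HERE/g' {} +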


  • IPv4 NameVirtualHost, IPv6 VirtualHost

    - by MadHatter
    Like many of us, I have an apache server (2.2.15, plus patches) with a lot of virtual hosts on it. More than I have IPv4 addresses, to be sure, which is why I use NameVirtualHost to run lots of them on the same IPv4 address. I'm busily trying to get everything I do IPv6-enabled. This server now has a routed /64, which gives me an awful lot of v6 addresses to throw around. What I'm trying to find is a simple way to tell each v4-NameVirtualHost that it should also function as a VirtualHost on a unique ipv6 address. I really, really don't want to have to define each virtual host twice. Does anyone know of an elegant way to do this? Or to do something comparable, in case I've embedded any dangerously-ignorant assumptions in my question?
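
    A sketch of one pattern that avoids duplicating definitions (the addresses are documentation examples, not yours): a single <VirtualHost> stanza can list several address:port pairs, so each name-based v4 vhost can also claim its own dedicated v6 address:

        NameVirtualHost 192.0.2.10:80

        <VirtualHost 192.0.2.10:80 [2001:db8::a]:80>
            ServerName site1.example.com
            DocumentRoot /var/www/site1
        </VirtualHost>

        <VirtualHost 192.0.2.10:80 [2001:db8::b]:80>
            ServerName site2.example.com
            DocumentRoot /var/www/site2
        </VirtualHost>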


  • Checking version of Applications installed in ~/Applications with unknown username

    - by ridogi
    I'd like to check the version of Firefox on all managed computers through Apple Remote Desktop. I have written this, but it only checks for Firefox in /Applications:

        /bin/cat /Applications/Firefox.app/Contents/Info.plist | grep -A 1 CFBundleShortVersionString | grep string | sed 's/[/]//' | sed 's/<string>//g'

    For standard users, Firefox auto-update breaks if it is in /Applications, so I instead have it installed in ~/Applications. I'd like to check that copy (if it exists), but I can't hard-code the path in the command since it is unique to each computer. For example:

        /Users/jon/Applications/Firefox.app
        /Users/arya/Applications/Firefox.app

    Presumably I want to use find and pipe the result to my command. This should work for 10.6 through 10.8.
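
    A sketch along those lines, assuming each user-local copy sits directly under ~/Applications; defaults read is used instead of grepping the plist by hand:

        #!/bin/sh
        # Print the Firefox version for the system copy and every per-user copy
        for app in /Applications/Firefox.app /Users/*/Applications/Firefox.app; do
            [ -d "$app" ] || continue
            ver=$(defaults read "$app/Contents/Info" CFBundleShortVersionString 2>/dev/null)
            echo "$app: ${ver:-unknown}"
        done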


  • How to identify which website on my instance is receiving lots of traffic?

    - by Bob Flemming
    I am new to server administration and have just set up a new quad-core instance which hosts around 15 websites. Over the past couple of days my server load has been averaging around 15.00. I believe it is because one (or maybe more) of the websites is getting spammed by spambots. Typing 'top' at the command line shows many processes from user 'www-data', which indicates lots of web traffic. Is there an easy way to identify which one of my sites is taking a hammering? Reading the Apache error logs is a very difficult task, as most of the websites receive daily traffic of 10,000+ unique users. Any help would be appreciated!
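
    A quick sketch, assuming one access log per vhost under /var/log/apache2/ (adjust the glob to your layout): request counts per log usually point straight at the hammered site:

        # requests per vhost log, busiest first
        for log in /var/log/apache2/*access*.log; do
            printf '%10d  %s\n' "$(wc -l < "$log")" "$log"
        done | sort -rn | head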


  • HP Wireless Printer not working

    - by Omri Spector
    I have installed an HP DeskJet 4620 driver on a Windows 7 machine. All works perfectly for several days, and then printing is no longer possible. Instead I get the message: "Unable to communicate with printer". This happened on every Windows 7 PC I tried, and none of the HP/MS sites contain any relevant info... (Posting this so that the answer appears online, as I did solve it after much work.)
    Solution: it appears that the HP installation sets up a unique "port" called "HP Network re-discovery". It stops working after some time (possibly after the first time the printer/PC enters sleep mode), BUT the standard MS TCP port works just fine. So:
    1. Go to "Printers".
    2. Right-click the printer.
    3. Click "Printer properties" and then "Printer" or "Fax" (for both - do all this twice).
    4. Click "Add Port...".
    5. Select "Standard TCP Port".
    6. Fill in the details.
    7. Move the printer to the new port by unchecking the old one and checking the new one.
    Happy printing.


  • Forums that compel one to read? [closed]

    - by Alex
    I want to create a forum for a very specific, somewhat fight-club-like area of information. The participants need to be unique, not people I know. To be precise, it needs to be an entirely random set of participants. To do this I am going on the assumption that whatever a potential viewer sees when he visits this particular domain he will be COMPELLED to go further. He feels an intense urge within him to read more. I need forum software that will evoke this feeling regardless of the content. Minimal and aesthetically fluid and functional. It must have.. zen.. What are my options?


  • Dealing with SMTP invalid command attack

    - by mark
    One of our semi-busy mail servers (sendmail) has had a lot of inbound connections over the past few days from hosts that are issuing garbage commands. In the past two days:
    - incoming SMTP connections with invalid commands from 39,000 unique IPs
    - the IPs come from various ranges all over the world, not just a few networks that I can block
    - the mail server serves users throughout North America, so I can't just block connections from unknown IPs
    - sample bad commands: http://pastebin.com/4QUsaTXT
    I am not sure what someone is trying to accomplish with this attack, besides annoying me. Any ideas what this is about, or how to effectively deal with it?

