Search Results

Search found 14837 results on 594 pages for 'duplicate ip'.

Page 104/594 | < Previous Page | 100 101 102 103 104 105 106 107 108 109 110 111  | Next Page >

  • Remove identical files in UNIX

    - by Thrawn
    Hi all, I'm dealing with a large number (30,000) of files of about 10 MB each. Some of them (I estimate about 2%) are actual duplicates, and I need to keep only one copy of each duplicated pair (or triplet). Could you suggest an efficient way to do that? I'm working on UNIX. Thank you :-)
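
    A common approach (a sketch of mine, not part of the original question) is to group the files by size first and hash only those that share a size; the directory argument and the choice to keep the first file of each group are assumptions:

      import hashlib
      import os
      import sys
      from collections import defaultdict

      def sha256(path, chunk=1 << 20):
          """Hash a file in chunks so 10 MB files never sit fully in memory."""
          h = hashlib.sha256()
          with open(path, 'rb') as f:
              for block in iter(lambda: f.read(chunk), b''):
                  h.update(block)
          return h.hexdigest()

      def find_duplicates(root):
          by_size = defaultdict(list)
          for dirpath, _, names in os.walk(root):
              for name in names:
                  p = os.path.join(dirpath, name)
                  by_size[os.path.getsize(p)].append(p)
          by_hash = defaultdict(list)
          for size, paths in by_size.items():
              if len(paths) < 2:
                  continue                  # unique size: cannot be a duplicate
              for p in paths:
                  by_hash[sha256(p)].append(p)
          return [g for g in by_hash.values() if len(g) > 1]

      if __name__ == '__main__':
          for group in find_duplicates(sys.argv[1]):
              # keep the first copy; the rest are candidates for removal
              print('keep:', group[0], '| duplicates:', ', '.join(group[1:]))

    With only about 2% duplicates, the size filter means the vast majority of files are never read at all.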

    Read the article

  • Duplicate Items Using Join in NHibernate Map

    - by Colin Bowern
    I am trying to retrieve the individual detail rows without having to create an object for the parent. I have a map which joins a parent table with the detail to achieve this:

      Table("UdfTemplate");
      Id(x => x.Id, "Template_Id");
      Map(x => x.FieldCode, "Field_Code");
      Map(x => x.ClientId, "Client_Id");
      Join("UdfFields", join =>
      {
          join.KeyColumn("Template_Id");
          join.Map(x => x.Name, "COLUMN_NAME");
          join.Map(x => x.Label, "DISPLAY_NAME");
          join.Map(x => x.IsRequired, "MANDATORY_FLAG").CustomType<YesNoType>();
          join.Map(x => x.MaxLength, "DATA_LENGTH");
          join.Map(x => x.Scale, "DATA_SCALE");
          join.Map(x => x.Precision, "DATA_PRECISION");
          join.Map(x => x.MinValue, "MIN_VALUE");
          join.Map(x => x.MaxValue, "MAX_VALUE");
      });

    When I run the query in NH using:

      Session.CreateCriteria(typeof(UserDefinedField))
          .Add(Restrictions.Eq("FieldCode", code))
          .List<UserDefinedField>();

    I get back the first row three times as opposed to the three individual rows it should return. Looking at the SQL trace in NH Profiler the query appears to be correct. The problem feels like it is in the mapping but I am unsure how to troubleshoot that process. I am about to turn on logging to see what I can find but I thought I would post here in case someone with experience mapping joins knows where I am going wrong.

    Read the article

  • Swingworker producing duplicate output/output out of order?

    - by Stefan Kendall
    What is the proper way to guarantee delivery when using a SwingWorker? I'm trying to route data from an InputStream to a JTextArea, and I'm running my SwingWorker with the execute method. I think I'm following the example here, but I'm getting out of order results, duplicates, and general nonsense. Here is my non-working SwingWorker:

      class InputStreamOutputWorker extends SwingWorker<List<String>,String> {
          private InputStream is;
          private JTextArea output;

          public InputStreamOutputWorker(InputStream is, JTextArea output) {
              this.is = is;
              this.output = output;
          }

          @Override
          protected List<String> doInBackground() throws Exception {
              byte[] data = new byte[4 * 1024];
              int len = 0;
              while ((len = is.read(data)) > 0) {
                  String line = new String(data).trim();
                  publish(line);
              }
              return null;
          }

          @Override
          protected void process( List<String> chunks ) {
              for( String s : chunks ) {
                  output.append(s + "\n");
              }
          }
      }
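
    A likely culprit (my note, not part of the question): new String(data) converts the whole 4 KB buffer on every iteration, so bytes left over from earlier reads reappear and line boundaries fall wherever read() happened to stop. A minimal sketch of doInBackground that publishes only what was actually read, assuming the stream carries line-oriented text (java.io.BufferedReader and InputStreamReader imports assumed):

      @Override
      protected List<String> doInBackground() throws Exception {
          // A reader splits the stream into lines regardless of how many
          // bytes each underlying read() returns, so nothing is duplicated
          // or reordered before publish().
          BufferedReader reader = new BufferedReader(new InputStreamReader(is));
          try {
              String line;
              while ((line = reader.readLine()) != null) {
                  publish(line);   // process() appends each line exactly once
              }
          } finally {
              reader.close();
          }
          return null;
      }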

    Read the article

  • MySQL - Skip Duplicate WordPress Entries

    - by 55skidoo
    I'm writing a script to display the 10 most recently "active" WordPress blog posts (i.e. those with the most recent comments). Problem is, the list has a lot of duplicates. I'd like to weed out the duplicates. Is there an easy way to do this by changing the MySQL query (like IGNORE, WHERE) or some other means? Here's what I have so far:

      <?php
      function cd_recently_active() {
          global $wpdb, $comments, $comment;
          $number = 10; //how many recently active posts to display? enter here
          if ( !$comments = wp_cache_get( 'recent_comments', 'widget' ) ) {
              $comments = $wpdb->get_results("SELECT comment_date, comment_author,
                  comment_author_url, comment_ID, comment_post_ID, comment_content
                  FROM $wpdb->comments
                  WHERE comment_approved = '1'
                  ORDER BY comment_date_gmt DESC
                  LIMIT $number");
              wp_cache_add( 'recent_comments', $comments, 'widget' );
          }
      ?>
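
    One way to avoid the duplicates (a sketch, not part of the question) is to have MySQL return only the newest approved comment per post, so each post appears once; the wp_ table prefix and the LIMIT of 10 mirror the snippet above but are assumptions:

      -- One row per post: its most recent approved comment
      SELECT c.comment_post_ID, c.comment_ID, c.comment_author, c.comment_date
      FROM wp_comments c
      JOIN (
          SELECT comment_post_ID, MAX(comment_date_gmt) AS latest
          FROM wp_comments
          WHERE comment_approved = '1'
          GROUP BY comment_post_ID
      ) last ON last.comment_post_ID = c.comment_post_ID
            AND last.latest = c.comment_date_gmt
      ORDER BY c.comment_date_gmt DESC
      LIMIT 10;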

    Read the article

  • Suggestions for duplicate file finder algorithm (using C)

    - by Andrei Ciobanu
    Hello, I wanted to write a program that tests whether two files are duplicates (i.e. have exactly the same content). First I check whether the files have the same size, and if they do, I start to compare their contents. My first idea was to "split" the files into fixed-size blocks, then start a thread for every block, fseek to the starting character of each block and run the comparisons in parallel. When a comparison in one thread fails, the other working threads are cancelled and the program exits the thread-spawning loop. The code looks like this:

    dupf.h

      #ifndef __NM__DUPF__H__
      #define __NM__DUPF__H__

      #define NUM_THREADS 15
      #define BLOCK_SIZE 8192

      /* Thread argument structure */
      struct thread_arg_s {
          const char *name_f1;    /* First file name */
          const char *name_f2;    /* Second file name */
          int cursor;             /* Where to seek in the file */
      };
      typedef struct thread_arg_s thread_arg;

      /**
       * 'arg' is of type thread_arg.
       * Checks if the specified file blocks are duplicates.
       */
      void *check_block_dup(void *arg);

      /**
       * Checks if two files are duplicates
       */
      int check_dup(const char *name_f1, const char *name_f2);

      /**
       * Returns a valid pointer to a file.
       * If the file (given by the path/name 'fname') cannot be opened
       * in 'mode', the program is interrupted and an error message is shown.
       **/
      FILE *safe_fopen(const char *name, const char *mode);

      #endif

    dupf.c

      #include <errno.h>
      #include <pthread.h>
      #include <stdio.h>
      #include <stdlib.h>
      #include <string.h>
      #include <sys/types.h>
      #include <sys/stat.h>
      #include <unistd.h>
      #include "dupf.h"

      FILE *safe_fopen(const char *fname, const char *mode)
      {
          FILE *f = NULL;
          f = fopen(fname, mode);
          if (f == NULL) {
              char emsg[255];
              sprintf(emsg, "FOPEN() %s\t", fname);
              perror(emsg);
              exit(-1);
          }
          return (f);
      }

      void *check_block_dup(void *arg)
      {
          const char *name_f1 = NULL, *name_f2 = NULL;    /* File names */
          FILE *f1 = NULL, *f2 = NULL;                    /* Streams */
          int cursor = 0;                                 /* Reading cursor */
          char buff_f1[BLOCK_SIZE], buff_f2[BLOCK_SIZE];  /* Character buffers */
          int rchars_1, rchars_2;                         /* Characters read */

          /* Initializing variables from 'arg' */
          name_f1 = ((thread_arg*)arg)->name_f1;
          name_f2 = ((thread_arg*)arg)->name_f2;
          cursor = ((thread_arg*)arg)->cursor;

          /* Opening files */
          f1 = safe_fopen(name_f1, "r");
          f2 = safe_fopen(name_f2, "r");

          /* Setup cursor in files */
          fseek(f1, cursor, SEEK_SET);
          fseek(f2, cursor, SEEK_SET);

          /* Initialize buffers */
          rchars_1 = fread(buff_f1, 1, BLOCK_SIZE, f1);
          rchars_2 = fread(buff_f2, 1, BLOCK_SIZE, f2);
          if (rchars_1 != rchars_2) {
              /* fread failed to read the same portion.
               * program cannot continue */
              perror("ERROR WHEN READING BLOCK");
              exit(-1);
          }
          while (rchars_1-->0) {
              if (buff_f1[rchars_1] != buff_f2[rchars_1]) {
                  /* Different characters */
                  fclose(f1);
                  fclose(f2);
                  pthread_exit("notdup");
              }
          }

          /* Close streams */
          fclose(f1);
          fclose(f2);
          pthread_exit("dup");
      }

      int check_dup(const char *name_f1, const char *name_f2)
      {
          int num_blocks = 0;      /* Number of 'blocks' to check */
          int num_tsp = 0;         /* Number of thread spawns */
          int tsp_iter = 0;        /* Iterator for thread spawns */
          pthread_t *tsp_threads = NULL;
          thread_arg *tsp_threads_args = NULL;
          int tsp_threads_iter = 0;
          int thread_c_res = 0;    /* Thread creation result */
          int thread_j_res = 0;    /* Thread join result */
          int loop_res = 0;        /* Function result */
          int cursor;
          struct stat buf_f1;
          struct stat buf_f2;

          if (name_f1 == NULL || name_f2 == NULL) {
              /* Invalid input parameters */
              perror("INVALID FNAMES\t");
              return (-1);
          }
          if (stat(name_f1, &buf_f1) != 0 || stat(name_f2, &buf_f2) != 0) {
              /* Stat fails */
              char emsg[255];
              sprintf(emsg, "STAT() ERROR: %s %s\t", name_f1, name_f2);
              perror(emsg);
              return (-1);
          }
          if (buf_f1.st_size != buf_f2.st_size) {
              /* Files have different sizes */
              return (1);
          }

          /* Files have the same size, function exec. is continued */
          num_blocks = (buf_f1.st_size / BLOCK_SIZE) + 1;
          num_tsp = (num_blocks / NUM_THREADS) + 1;
          cursor = 0;
          for (tsp_iter = 0; tsp_iter < num_tsp; tsp_iter++) {
              loop_res = 0;
              /* Create threads array for this spawn */
              tsp_threads = malloc(NUM_THREADS * sizeof(*tsp_threads));
              if (tsp_threads == NULL) {
                  perror("TSP_THREADS ALLOC FAILURE\t");
                  return (-1);
              }
              /* Create arguments for every thread in the current spawn */
              tsp_threads_args = malloc(NUM_THREADS * sizeof(*tsp_threads_args));
              if (tsp_threads_args == NULL) {
                  perror("TSP THREADS ARGS ALLOC FAILURE\t");
                  return (-1);
              }
              /* Initialize arguments and create threads */
              for (tsp_threads_iter = 0; tsp_threads_iter < NUM_THREADS; tsp_threads_iter++) {
                  if (cursor >= buf_f1.st_size) {
                      break;
                  }
                  tsp_threads_args[tsp_threads_iter].name_f1 = name_f1;
                  tsp_threads_args[tsp_threads_iter].name_f2 = name_f2;
                  tsp_threads_args[tsp_threads_iter].cursor = cursor;
                  thread_c_res = pthread_create(
                      &tsp_threads[tsp_threads_iter],
                      NULL,
                      check_block_dup,
                      (void*)&tsp_threads_args[tsp_threads_iter]);
                  if (thread_c_res != 0) {
                      perror("THREAD CREATION FAILURE");
                      return (-1);
                  }
                  cursor += BLOCK_SIZE;
              }
              /* Join last threads and get their status */
              while (tsp_threads_iter-->0) {
                  void *thread_res = NULL;
                  thread_j_res = pthread_join(tsp_threads[tsp_threads_iter], &thread_res);
                  if (thread_j_res != 0) {
                      perror("THREAD JOIN FAILURE");
                      return (-1);
                  }
                  if (strcmp((char*)thread_res, "notdup") == 0) {
                      loop_res++;
                      /* Closing other threads and exiting by condition
                       * from loop. */
                      while (tsp_threads_iter-->0) {
                          pthread_cancel(tsp_threads[tsp_threads_iter]);
                      }
                  }
              }
              free(tsp_threads);
              free(tsp_threads_args);
              if (loop_res > 0) {
                  break;
              }
          }
          return (loop_res > 0) ? 1 : 0;
      }

    The function works fine (at least for what I've tested). Still, some guys from #C (freenode) suggested that the solution is overly complicated and that it may perform poorly because of the parallel reads from the hard disk. What I want to know: Is the threaded approach flawed by design? Is fseek() really that slow? Is there a way to somehow map the files to memory and then compare them?
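
    On the last question, a minimal sketch (mine, not from the post) that compares two same-size files by mapping both with mmap() and running a single memcmp(); error handling is deliberately thin:

      #include <fcntl.h>
      #include <stdio.h>
      #include <string.h>
      #include <sys/mman.h>
      #include <sys/stat.h>
      #include <unistd.h>

      /* Returns 1 if the files have identical content, 0 if not, -1 on error. */
      int files_identical(const char *name_f1, const char *name_f2)
      {
          int fd1 = open(name_f1, O_RDONLY);
          int fd2 = open(name_f2, O_RDONLY);
          void *m1 = MAP_FAILED, *m2 = MAP_FAILED;
          struct stat st1, st2;
          int result = -1;

          if (fd1 < 0 || fd2 < 0 || fstat(fd1, &st1) < 0 || fstat(fd2, &st2) < 0)
              goto out;
          if (st1.st_size != st2.st_size) { result = 0; goto out; }  /* sizes differ */
          if (st1.st_size == 0) { result = 1; goto out; }            /* both empty */

          m1 = mmap(NULL, st1.st_size, PROT_READ, MAP_PRIVATE, fd1, 0);
          m2 = mmap(NULL, st2.st_size, PROT_READ, MAP_PRIVATE, fd2, 0);
          if (m1 != MAP_FAILED && m2 != MAP_FAILED)
              result = (memcmp(m1, m2, st1.st_size) == 0);

      out:
          if (m1 != MAP_FAILED) munmap(m1, st1.st_size);
          if (m2 != MAP_FAILED) munmap(m2, st2.st_size);
          if (fd1 >= 0) close(fd1);
          if (fd2 >= 0) close(fd2);
          return result;
      }

      int main(int argc, char **argv)
      {
          if (argc != 3) {
              fprintf(stderr, "usage: %s file1 file2\n", argv[0]);
              return 2;
          }
          int same = files_identical(argv[1], argv[2]);
          printf("%s\n", same == 1 ? "dup" : same == 0 ? "notdup" : "error");
          return same == 1 ? 0 : 1;
      }

    A sequential compare like this is normally disk-bound, so the kernel's readahead does most of the work; parallel threads reading the same spinning disk tend to add seeks rather than speed.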

    Read the article

  • Replace duplicate spaces with single space in TSQL

    - by Chris
    I need to ensure that a given field does not have more than one space (not concerned about all white space, just spaces) between characters. So 'single    spaces   only' needs to turn into 'single spaces only'. The below will not work, because replacing each pair of spaces with a single space can still leave runs of spaces behind:

      select replace('single    spaces   only', '  ', ' ')

    I would really prefer to stick with native TSQL rather than a CLR-based solution. Thoughts?
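
    A widely used native T-SQL trick (a sketch, not part of the question) collapses any run of spaces with three nested REPLACE calls; it assumes the data never contains the '<' and '>' placeholder characters (pick rarer ones if it might):

      DECLARE @s varchar(100);
      SET @s = 'single    spaces   only';

      -- 1) mark every space, 2) delete markers that mean "another space follows",
      -- 3) turn each surviving marker back into a single space
      SELECT REPLACE(REPLACE(REPLACE(@s, ' ', '<>'), '><', ''), '<>', ' ') AS collapsed;
      -- returns: 'single spaces only'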

    Read the article

  • Where to find great proxy servers for testing GeoIP services?

    - by Andreas
    We would like to test a GeoIP service. To do that, we need to visit the site with an IP address from another country. There are a lot of free proxy lists like http://nntime.com/proxy-country/ The problem with them is that only the CoDeeN proxies are working, but with CoDeeN you can't select your country of origin (the same as with Tor); you get redirected to a random proxy in the network. Where can we find good proxy servers for testing GeoIP services? Free proxy servers would be great, but if they cost something small that doesn't matter.

    Read the article

  • PHP Associative Array Duplicate Key?

    - by Steven
    Hello, I have an associative array; however, when I add values to it using the function below, it seems to overwrite entries that share the same key. Is there a way to have multiple entries with the same key but different values? Or is there another form of array that has the same format? I want to have:

      42=56
      42=86
      42=97
      51=64
      51=52

    and so on.

      function array_push_associative(&$arr) {
          $args = func_get_args();
          foreach ($args as $arg) {
              if (is_array($arg)) {
                  foreach ($arg as $key => $value) {
                      $arr[$key] = $value;
                      $ret++;
                  }
              } else {
                  $arr[$arg] = "";
              }
          }
          return $ret;
      }
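
    PHP array keys must be unique, so the usual workaround (a sketch, not part of the question; the function name is mine) is to store an array of values under each key:

      <?php
      // Append a value under a key; the sub-array is created on first use.
      function array_add_multi(&$arr, $key, $value)
      {
          $arr[$key][] = $value;
      }

      $arr = array();
      array_add_multi($arr, 42, 56);
      array_add_multi($arr, 42, 86);
      array_add_multi($arr, 42, 97);
      array_add_multi($arr, 51, 64);
      array_add_multi($arr, 51, 52);

      print_r($arr);   // 42 => [56, 86, 97], 51 => [64, 52]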

    Read the article

  • Duplicate a Drupal installation from one server to another

    - by irot
    Hello. I have been developing a Drupal 6 site on my PC using XAMPP. I'm done now, and everything looks peachy. The problem is, I need to put all my content (including custom modules and themes) up onto a staging server which only has a fresh Drupal 6 install on it. I can't imagine having to set up all my custom content types and whatnot all over again on the staging server. So I ask: how does one go about duplicating a Drupal installation from my PC to the staging server? The staging server is running Linux, and I develop on a Windows PC, if that helps. Thanks in advance.
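
    The usual route (an outline of mine, not from the question) is to copy the custom code and uploaded files and move the database, since content types, views and most other configuration live in the database; the database names and paths below are placeholders:

      # on the development PC (XAMPP)
      mysqldump -u root -p local_drupal_db > drupal.sql

      # copy custom code, uploaded files and the dump to the staging server
      scp -r sites/all/modules sites/all/themes sites/default/files drupal.sql \
          user@staging:/var/www/drupal/

      # on the staging server
      mysql -u root -p staging_drupal_db < drupal.sql
      # then point sites/default/settings.php at the staging database and clear caches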

    Read the article

  • snmp, how to retrieve ip connected to the router with MIB-II

    - by dany
    Hi all, I want to create a program that acts as an SNMP manager and queries the router (or sets a trap) to obtain the list of IP addresses connected to it. My router has these capabilities: SNMP v1, v2c, built-in MIB-I, MIB-II agent. Is it possible to retrieve this information by querying the router's MIB-II agent in a standard way (not vendor-dependent)? Bye
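
    A standard, vendor-neutral place to look (my note, not from the question) is MIB-II's ipNetToMediaTable (OID 1.3.6.1.2.1.4.22), i.e. the router's ARP cache; it only lists hosts the router has talked to recently, and the community string and address below are placeholders:

      # IP addresses currently in the router's ARP cache (ipNetToMediaNetAddress)
      snmpwalk -v 2c -c public 192.168.1.1 1.3.6.1.2.1.4.22.1.3

      # matching MAC addresses (ipNetToMediaPhysAddress)
      snmpwalk -v 2c -c public 192.168.1.1 1.3.6.1.2.1.4.22.1.2

    The same walk can be issued programmatically from any SNMP library once the OIDs are confirmed with snmpwalk.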

    Read the article

  • arp protocol, mac and ip

    - by lolalola
    Hello, I'm interested in ARP and wanted to check: the ARP protocol is used to find the MAC address that belongs to an IP address, yes? How is it different from this: IPHostEntry iphostentry = Dns.GetHostByName(strHostName);

    Read the article

  • Visual Studio bug ~ can anyone duplicate?

    - by drachenstern
    In Visual Studio 2010 Pro (Version 10.0.30319.1 RTMRel), I noticed tonight that for some reason I kept getting a wider window for Quick Find, but thought I was losing my mind. So I exited, restarted, etc. to verify. Here are my repro steps:

      1. Open an existing project (I don't think it matters which one)
      2. Press Ctrl+F and give it something to search for (?)
      3. Press Enter
      4. Press Ctrl+F
      5. Press Enter
      6. Go to step 4

    Can you reproduce a slowly expanding Quick Find window? Do I have some sort of wacky bugged-out system? I'd obviously like to submit a bug report to MS if this is indeed a viable repro.

    Read the article

  • GCJ Creates duplicate dummy symbol

    - by vickirk
    Hi, I'm trying to build a Java application with gcj but I'm getting the error below:

      multiple definition of `java resource .dummy'

    The gcj versions are 4.4.1 on Ubuntu and 4.3.4 on Cygwin/Windows XP, and I'm building it with:

      gcj --main=my.MainClass --classpath=my my/*java

    Has anyone seen this, or does anyone know a workaround that doesn't involve installing an earlier version of gcj? If that is the way to do it, does anyone know how to do that on Cygwin, or will I have to build it myself? Here is a minimal test case that gives this error:

      public class A {
          public static void main(String[] args) {
              System.out.println(new B());
          }
      }

      public class B {
          public String toString() {
              return "Hello";
          }
      }

      gcj --main=A src/A.java src/B.java

    Read the article

  • migrating from Prototype to jQuery in Rails, having trouble with duplicate get request

    - by aressidi
    I'm in the process of migrating from Prototype to jQuery and moving all JS outside of the view files. All is going fairly well with one exception. Here's what I'm trying to do, and the problem I'm having. I have a diary where users can update records in-line in the page like so:

      1. user clicks the 'edit' link to edit an entry in the diary
      2. a GET request is performed via jQuery and an edit form is displayed, allowing the user to modify the record
      3. user updates the record, the form disappears and the updated record is shown in place of the form

    All of that works so far. The problem arises when:

      1. user updates a record
      2. user clicks 'edit' to update another record
      3. in this case, the edit form is shown twice!

    In Firebug I get a status code 200 when the form shows, and then moments later another edit form shows again with a status code of 304. I only want the form to show once, not twice. The form shows twice only after I update a record, otherwise everything works fine. Here's the code, any ideas? I think this might have to do with the fact that in food_item_update.js I call editDiaryEntry() after a record is updated, but if I don't call that function and try to update the record after it's been modified, then it just spits up the .js.erb response on the screen. That's also why I have editDiaryEntry() in the add_food.js.erb file. Any help would be greatly appreciated.

    diary.js

      jQuery(document).ready(function() {
          postFoodEntry();
          editDiaryEntry();
          initDatePicker();
      });

      function postFoodEntry() {
          jQuery('form#add_entry').submit(function(e) {
              e.preventDefault();
              jQuery.post(this.action, jQuery(this).serialize(), null, "script"); // return this
          });
      }

      function editDiaryEntry() {
          jQuery('.edit_link').click(function(e) {
              e.preventDefault();
              // This should look to see if one version of this is open...
              if (jQuery('#edit_container_' + this.id).length == 0 ) {
                  jQuery.get('/diary/entry/edit', {id: this.id}, null, "script");
              }
          });
      }

      function closeEdit () {
          jQuery('.close_edit').click(function(e) {
              e.preventDefault();
              jQuery('.entry_edit_container').remove();
              jQuery("#entry_" + this.id).show();
          });
      }

      function updateDiaryEntry() {
          jQuery('.edit_entry_form').submit(function(e) {
              e.preventDefault();
              jQuery.post(this.action, $(this).serialize(), null, "script");
          });
      }

      function initDatePicker() {
          jQuery("#date, #edit_date").datepicker();
      };

    add_food.js.erb

      jQuery("#entry_alert").show();
      jQuery('#add_entry')[ 0 ].reset();
      jQuery('#diary_entries').html("<%= escape_javascript(render :partial => 'members/diary/diary_entries', :object => @diary, :locals => {:record_counter => 0, :date_header => 0, :edit_mode => @diary_edit}, :layout => false ) %>");
      jQuery('#entry_alert').html("<%= escape_javascript(render :partial => 'members/diary/entry_alert', :locals => {:type => @type, :message => @alert_message}) %>");
      jQuery('#entry_alert').show();
      setTimeout(function() { jQuery('#entry_alert').fadeOut('slow'); }, 5000);
      editDiaryEntry();

    food_item_edit.js.erb

      jQuery("#entry_<%= @entry.id %>").hide();
      jQuery("#entry_<%= @entry.id %>").after("<%= escape_javascript(render :partial => 'members/diary/food_item_edit', :locals => {:user_food_profile => @entry}) %>");
      closeEdit();
      updateDiaryEntry();
      initDatePicker();

    food_item_update.js

      jQuery("#entry_<%= @entry.id %>").replaceWith("<%= escape_javascript(render :partial => 'members/diary/food_item', :locals => {:entry => @entry, :total_calories => 0}) %>");
      jQuery('.entry_edit_container').remove();
      editDiaryEntry();
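
    A likely cause (my note, not part of the question): editDiaryEntry() is called again after every AJAX update, so each .edit_link accumulates one more click handler per update, and every handler fires its own GET. A minimal sketch (jQuery 1.x syntax) that unbinds before rebinding so a link never carries more than one handler:

      function editDiaryEntry() {
          // drop any handler bound by an earlier call before binding a fresh one
          jQuery('.edit_link').unbind('click').click(function(e) {
              e.preventDefault();
              if (jQuery('#edit_container_' + this.id).length == 0) {
                  jQuery.get('/diary/entry/edit', { id: this.id }, null, "script");
              }
          });
      }

    Event delegation (binding once on a parent that survives the AJAX replacements) achieves the same thing without re-running the binder after each update.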

    Read the article

  • SQL Query Returning Duplicate Results

    - by Jesse Bunch
    Hi, I've been working out this query now for a while and I thought I had it where I wanted it, but apparently not. There are two records in the database (orders). The query should return two different rows, but instead returns two rows that have exactly the same values. I think it may be something to do with the GROUP BY or derived tables I'm using but my eyes are tired and not seeing the problem. Can any of you help? Thanks in advance.

      SELECT orders.billerID, orders.invoiceDate, orders.txnID, orders.bName,
             orders.bStreet1, orders.bStreet2, orders.bCity, orders.bState,
             orders.bZip, orders.bCountry, orders.sName, orders.sStreet1,
             orders.sStreet2, orders.sCity, orders.sState, orders.sZip,
             orders.sCountry, orders.paymentType, orders.invoiceNotes,
             orders.pFee, orders.shipping, orders.tax, orders.reasonCode,
             orders.txnType, orders.customerID,
             customers.firstName AS firstName,
             customers.lastName AS lastName,
             customers.businessName AS businessName,
             orderStatus.statusName AS orderStatus,
             IFNULL(orderItems.itemTotal, 0.00) + orders.shipping + orders.tax AS orderTotal,
             IFNULL(orderItems.itemTotal, 0.00) + orders.shipping + orders.tax
                 - IFNULL(payments.totalPayments, 0.00) AS orderBalance
      FROM orders
      LEFT JOIN customers ON orders.customerID = customers.id
      LEFT JOIN orderStatus ON orders.orderStatus = orderStatus.id
      LEFT JOIN (
          SELECT orderItems.orderID,
                 SUM(orderItems.itemPrice * orderItems.itemQuantity) as itemTotal
          FROM orderItems
          GROUP BY orderItems.orderID
      ) orderItems ON orderItems.orderID = orders.id
      LEFT JOIN (
          SELECT payments.orderID, SUM(payments.amount) as totalPayments
          FROM payments
          GROUP BY payments.orderID
      ) payments ON payments.orderID = orders.id

    Read the article

  • T4MVC and duplicate controller names in different areas

    - by artvolk
    In my application I have a controller named Snippets both in the default area (in the application root) and in an area called Manage. I use T4MVC and custom routes, like this:

      routes.MapRoute(
          "Feed",
          "feed/",
          MVC.Snippets.Rss()
      );

    And I get this error:

      Multiple types were found that match the controller named 'snippets'. This can happen if the route that services this request ('{controller}/{action}/{id}/') does not specify namespaces to search for a controller that matches the request. If this is the case, register this route by calling an overload of the 'MapRoute' method that takes a 'namespaces' parameter. The request for 'snippets' has found the following matching controllers:
      Snippets.Controllers.SnippetsController
      Snippets.Areas.Manage.Controllers.SnippetsController

    I know that there are overloads of MapRoute that take a namespaces argument, but there are no such overloads with T4MVC support. Maybe I'm missing something? The possible syntax could be:

      routes.MapRoute(
          "Feed",
          "feed/",
          MVC.Snippets.Rss(),
          new string[] {"Snippets.Controllers"}
      );

    or, it seems quite good to me to have the namespace as a T4MVC property:

      routes.MapRoute(
          "Feed",
          "feed/",
          MVC.Snippets.Rss(),
          new string[] {MVC.Snippets.Namespace}
      );

    Thanks in advance!

    Read the article

  • Get list of duplicate rows in MySql

    - by user347033
    Hi, I have a table like this:

      ID  nachname  vorname
      1   john      doe
      2   john      doe
      3   jim       doe
      4   Michael   Knight

    I need a query that will return all the fields (SELECT *) from the records that have the same nachname and vorname (in this case, records 1 and 2). Can anyone help me with this? Thanks
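
    A common pattern (a sketch, not part of the question; the table name my_table is a placeholder) joins the table against the name pairs that occur more than once:

      SELECT t.*
      FROM my_table t
      JOIN (
          SELECT nachname, vorname
          FROM my_table
          GROUP BY nachname, vorname
          HAVING COUNT(*) > 1
      ) d ON d.nachname = t.nachname
         AND d.vorname = t.vorname;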

    Read the article

  • php: getting IP address

    - by Syom
    I want to get the IP address of visitors. Could you tell me which element of $_SERVER[] I should use?

      $_SERVER['HTTP_CLIENT_IP'];
      $_SERVER['HTTP_X_FORWARDED_FOR'];

    or

      $_SERVER['REMOTE_ADDR'];

    thanks
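
    For what it's worth (a sketch, not part of the question): REMOTE_ADDR is the only value set by the web server itself; the other two are client-supplied headers and easy to spoof, so they are only useful as hints behind a proxy you trust:

      <?php
      function visitor_ip()
      {
          if (!empty($_SERVER['HTTP_CLIENT_IP'])) {
              return $_SERVER['HTTP_CLIENT_IP'];
          }
          if (!empty($_SERVER['HTTP_X_FORWARDED_FOR'])) {
              // may hold a comma-separated chain; the first entry is the client
              $parts = explode(',', $_SERVER['HTTP_X_FORWARDED_FOR']);
              return trim($parts[0]);
          }
          return $_SERVER['REMOTE_ADDR'];
      }

      echo visitor_ip();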

    Read the article

  • remove duplicate values from a multidimensional arrays in php

    - by haseeb
    Hi, this is my array; I need to remove the duplicated values. Please help me out.

      Array
      (
          [0] => Array
              (
                  [0] => * garfield calicut Address: tanil nadu chennai0696955666 About Company: re stored.
                  [1] => 0.0004
                  [2] => 0
              )
          [1] => Array
              (
                  [0] => * gamut Address: ashok puram calicut9865326921 About Company: You've come.
                         * garfield calicut Address: tanil nadu chennai0696955666 About Company: re stored.
                         * Hyva It Solutions Address: 697 / 75,30th Cross About Company: Hyva IT Solutions.
                         * streem pvt Ltd Address: onden road kannur9845672062 About Company: Go to the website
                         * Advanced It Wave Address: Ayyappankavu, Ernakulam (North), About Company: Website Developement Services
                         * Viral Industry Address: vettiyar, kodam p o tramp8469666663 About Company: # for discussion
                  [1] => 0.0008
                  [2] => 0
              )
          [2] => Array
              (
                  [0] => company
                  [1] => 0.0007
                  [2] => 1
              )
      )

    Here, 'garfield calicut Address: tanil nadu chennai0696955666 About Company: re stored.' is repeated; I need to display it only once.
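
    A generic way to drop exact duplicates from a multidimensional array (a sketch, not part of the question; the function names are mine) is to serialize the sub-arrays before array_unique(). Note that in the data above the repeated 'garfield calicut ...' text is only a fragment inside a longer string, so it would first need to be split on its '*' markers:

      <?php
      // array_unique() compares scalars only, so serialize each sub-array,
      // de-duplicate the serialized strings, then restore the survivors.
      function array_unique_multi(array $arr)
      {
          $serialized = array_map('serialize', $arr);
          $unique     = array_unique($serialized);
          return array_map('unserialize', $unique);   // original keys are kept
      }

      // Fragment-level clean-up: split a combined listing on its '*' markers,
      // drop repeated fragments, and glue it back together.
      function unique_fragments($text)
      {
          $parts = array_filter(array_map('trim', explode('*', $text)));
          return '* ' . implode(' * ', array_unique($parts));
      }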

    Read the article

< Previous Page | 100 101 102 103 104 105 106 107 108 109 110 111  | Next Page >