Search Results

Search found 4167 results on 167 pages for 'rman 11gr2 duplicate'.

Page 27/167 | < Previous Page | 23 24 25 26 27 28 29 30 31 32 33 34  | Next Page >

  • Bash loop command until file contains n duplicate entries (lines)

    - by Andrew
    Hello, I'm writing a script and I need to create a loop that will execute the same commands until a file contains a specified number of duplicate entries (lines). For example, on each iteration I will echo a random string to a results file, and I want the loop to stop once 10 lines contain the same string. I thought of something like:

        while [ `some command here (maybe using uniq)` -lt 10 ]
        do
            command1
            command2
            command3
        done

    Do you have any idea how this problem can be solved? It can't be done with grep alone, since I don't know in advance which string to look for. Thank you for your suggestions.

    Read the article

  • Duplicate ID/indexes and looping

    - by Justin Alexander
    I realize having two elements in the same HTML doc with the same ID is wrong, bad, immoral, and will lead to global warming. But... I'm trying to write an XSS widget, so I really have no control over the quality of the parent web page. I loop through document.images to retrieve a list of images on the page and perform an action on each one:

        for (img in document.images) { ... }

    I've also tried:

        for (var i = 0; i < document.images.length; i++) { ... }

    In both cases it lets me loop through all of the elements, BUT when trying to reference an object with a duplicate ID, I always get the first one (in order of the HTML). Using the debugger in IE8 I can see that both elements ARE listed, but that they both have the same index (in IE the index in document.images is either sequential or matches the image ID). Does anyone have a better solution?

    Read the article

  • NHibernate Linq - Duplicate Records

    - by adegiamb
    I am having a problem with duplicate blog posts coming back when I run the LINQ statement below. The issue is that a blog post can have the same tag more than once, and that's causing the problem. I know that when you use criteria you can do the following:

        criteria.SetResultTransformer(new DistinctRootEntityResultTransformer());

    How can I do the same thing with LINQ?

        List<BlogPost> result = (from blogPost in _session.Linq<BlogPost>()
                                 from tags in blogPost.Tags
                                 where tags.Tag == tag
                                       && blogPost.IsPublished
                                       && blogPost.Slug != slugToExclude
                                 orderby blogPost.DateCreated descending
                                 select blogPost).Distinct()
                                 .Skip(recordsToSkip).Take(pageSize).ToList();

    Read the article

  • How to search for duplicate values in a huge text file having around half a million records

    - by Shibu
    I have an input txt file which has data in the form of records (each row is a record and represents more or less a row of a DB table) and I need to find duplicate values. For example:

        Rec1: ACCOUNT_NBR_1*NAME_1*VALUE_1
        Rec2: ACCOUNT_NBR_2*NAME_2*VALUE_2
        Rec3: ACCOUNT_NBR_1*NAME_3*VALUE_3

    In the above set, Rec1 and Rec3 are considered to be duplicates because the ACCOUNT NUMBERS are the same (ACCOUNT_NBR_1). Note: the input file shown above is a delimited file (the delimiter being *); however, the file can also be a fixed-length file where each column starts and ends at specified positions. I am currently doing this with the following logic:

        Loop thru each ACCOUNT NUMBER
            Loop thru each line of the txt file and check if it is repeated.
            If repeated, record it in a hashtable.
        End
        End

    And I am using the 'Pattern' and 'BufferedReader' Java APIs to perform the above task. But since it is taking a long time, I would like to know a better way of handling it. Thanks, Shibu
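    A rough sketch of one faster approach (not from the original post; it assumes the file is '*'-delimited with the account number as the first field, and the file name is made up): read the file once and track account numbers in a HashSet, so each line is examined only a single time.

        import java.io.BufferedReader;
        import java.io.FileReader;
        import java.io.IOException;
        import java.util.HashSet;
        import java.util.Set;

        public class DuplicateAccountFinder {
            public static void main(String[] args) throws IOException {
                Set<String> seen = new HashSet<String>();       // account numbers seen so far
                Set<String> duplicates = new HashSet<String>(); // account numbers seen more than once

                BufferedReader reader = new BufferedReader(new FileReader("records.txt"));
                try {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        String account = line.split("\\*", 2)[0]; // first '*'-delimited field
                        if (!seen.add(account)) {                 // add() returns false if already present
                            duplicates.add(account);
                        }
                    }
                } finally {
                    reader.close();
                }
                System.out.println("Duplicate account numbers: " + duplicates);
            }
        }

    For a fixed-length file, the only change would be extracting the account number with substring(start, end) instead of split.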

    Read the article

  • postgresql duplicate table names best practice

    - by veilig
    My company has a handful of apps that we deploy in the websites we build. Recently a very old app needed to be included alongside a newer app, and there was a conflict: a duplicate table name needed to be used by both apps. We are now in the process of updating an old app and there will be some DB updates. I'm curious what people consider best practice (or how you do it) to help ensure these name collisions don't happen. I've looked at schemas, but I'm not sure if that's the right path to take. As the documentation prescribes, I don't want to "wire" a particular schema name into an application, and if I add schemas to the user's search path, how would it know which table I was referring to if two schemas have a table with the same name? Although maybe I'm reading too much into this. Any insights or words of wisdom would be greatly appreciated!

    Read the article

  • Code Golf: Duplicate Character Removal in String

    - by Alex
    The challenge: The shortest code, by character count, that detects and removes duplicate characters in a String. Removal includes ALL instances of the duplicated character (so if you find 3 n's, all three have to go), and original character order needs to be preserved.

        Example Input 1:  nbHHkRvrXbvkn
        Example Output 1: RrX

        Example Input 2:  nbHHkRbvnrXbvkn
        Example Output 2: RrX

    (The second example removes letters that occur three times; some solutions have failed to account for this.) (This is based on my other question where I needed the fastest way to do this in C#, but I think it makes good Code Golf across languages.)
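    As a point of reference only (not a golfed answer, and not from the original post), a plain Java sketch of the required behaviour, i.e. drop every character that occurs more than once while preserving order:

        import java.util.HashMap;
        import java.util.Map;

        public class RemoveRepeatedChars {
            static String removeRepeated(String s) {
                // count how many times each character occurs
                Map<Character, Integer> counts = new HashMap<Character, Integer>();
                for (char c : s.toCharArray()) {
                    counts.put(c, counts.containsKey(c) ? counts.get(c) + 1 : 1);
                }
                // keep only the characters that occur exactly once, in original order
                StringBuilder out = new StringBuilder();
                for (char c : s.toCharArray()) {
                    if (counts.get(c) == 1) {
                        out.append(c);
                    }
                }
                return out.toString();
            }

            public static void main(String[] args) {
                System.out.println(removeRepeated("nbHHkRvrXbvkn"));   // prints RrX
                System.out.println(removeRepeated("nbHHkRbvnrXbvkn")); // prints RrX
            }
        }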

    Read the article

  • Create "duplicate copy" of github repository

    - by user1483934
    I see answers to similar questions, and I'm sure I may just not be familiar enough with Git and GitHub terminology to know whether they apply to my question. What I need to do is clone an existing GitHub remote repository (a private repo under another person's username that I have contributor access to) and create a new private remote repo under my account. The existing repo's owner is going to make significant alterations to the repo, delete it, and re-push; before they do that, they want me to clone it and create a duplicate so we can continue working from the repo under my user. I want to preserve the commit history of the repo if possible. I've cloned locally but can't seem to figure out how to push it to a new remote whose origin doesn't point at the original user's repo.

    Read the article

  • Duplicate file descriptor after popen

    - by alaamh
    I am using popen to execute a command under Linux, and then 4 processes will use the same output. I am trying to duplicate the file descriptor so I can pass it to each process. Here is my code:

        FILE* file_source = (FILE*) popen(source_command, "r");
        int fd = fileno(file_source);
        fdatasync(fd);

        int dest_fd[4], y, total = 4;
        for (y = 0; y < total; y++) {
            dest_fd[y] = dup(fd);
        }

    Actually, if total is set to 1 it works fine; after changing total to 4 it does not work anymore.

    Read the article

  • c#.net id duplicate reported untruthfully

    - by tracer tong
    I am getting an error in Visual Studio that appears to be flat out wrong. Here it is:

        The tag contains duplicate 'ID' attributes.

    It is reported on the following line:

        Rica.Yoodul.MiniSites.BaseMiniSiteUserControl ucMaster =
            (Rica.Yoodul.MiniSites.BaseMiniSiteUserControl)Page.LoadControl("~/MiniSites/" + templateName + "/Master.ascx");

    I have checked through both the master and child pages for repeats of the ID I last added (the only one added since the last successful test), and the ID is used once across both files. Other than controls (e.g. ), is there anything else this error could refer to? Or is there a workaround? I have checked that I am editing the right files. On the face of it this appears to be a Visual Studio bug, but I want to be sure.

    Read the article

  • Do duplicate IDs screw up jQuery selectors?

    - by Matt
    If I had two divs, both with id="myDiv", would $("#myDiv").fadeOut(); fade both divs out? Or would it fade only the first (or the second)? Or none at all? How do I change which one it fades out? Note: I know duplicate IDs are against standards, but I'm using the Fancybox modal popup and it duplicates specified content on your page for the content of the popup. If anyone knows a way around this (maybe I'm using Fancybox wrong), please let me know.

    Read the article

  • function to remove duplicate characters in a string

    - by Codenotguru
    The following code is trying to remove any duplicate characters in a string. I am not sure if the code is right. Can anybody help me with the working of the code, i.e. what actually happens when two characters match?

        public static void removeDuplicates(char[] str) {
            if (str == null) return;
            int len = str.length;
            if (len < 2) return;
            int tail = 1;                          // str[0..tail) holds the characters kept so far
            for (int i = 1; i < len; ++i) {
                int j;
                for (j = 0; j < tail; ++j) {
                    if (str[i] == str[j]) break;   // str[i] is already in the kept region
                }
                if (j == tail) {                   // no match found, so keep str[i]
                    str[tail] = str[i];
                    ++tail;
                }
            }
            str[tail] = 0;                         // terminate the kept region with a null character
        }
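    To see the effect, here is a small hypothetical driver (not from the original question; it assumes the main method is added to the class containing removeDuplicates). Only the first occurrence of each character survives, and everything after the terminating null character is leftover data:

        public static void main(String[] args) {
            char[] str = "banana".toCharArray();
            removeDuplicates(str);
            // str is now ['b', 'a', 'n', '\0', 'n', 'a']
            System.out.println(new String(str, 0, 3)); // prints "ban"
        }

    Note that when the input already contains no duplicates, tail ends up equal to str.length, so str[tail] = 0 writes one past the end of the array and throws an ArrayIndexOutOfBoundsException.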

    Read the article

  • Duplicate array but maintain pointer links

    - by St. John Johnson
    Suppose I have an array of nodes (objects). I need to create a duplicate of this array that I can modify without affecting the source array. But changing the nodes will affect the source nodes. Basically maintaining pointers to the objects instead of duplicating their values.

        // node(x, y)
        $array[0] = new node(15, 10);
        $array[1] = new node(30, -10);
        $array[2] = new node(-2, 49);

        // Some sort of copy system
        $array2 = $array;

        // Just to show modification to the array doesn't affect the source array
        array_pop($array2);
        if (count($array) == count($array2)) echo "Fail";

        // Changing the node value should affect the source array
        $array2[0]->x = 30;
        if ($array2[0]->x == $array[0]->x) echo "Goal";

    What would be the best way to do this?

    Read the article

  • javascript: what are immediate functions used for [duplicate]

    - by tkoomzaaskz
    This question already has an answer here: Why using self executing function in JavaScript? [duplicate] (4 answers)

    I've been programming in JS for some time, but I have never come across a need for immediate functions, for example:

        (function(){
            console.log('hello, I am an immediate function');
        }())

    What would be the difference if I just wrote:

        console.log('hello, I am an immediate function');

    ? I don't have any access to this function anyway (it is not assigned anywhere). I think (but I'm not sure) that I can implement everything without immediate functions, so why do people use them?

    Read the article

  • Insert statement with CASE to avoid duplicate record insertion

    - by rama
    I have written the SP below to pre-check for duplicate records before inserting into the table, but it does not allow me to write an INSERT statement inside the CASE. How can I write the stored procedure so that it first checks for the value of @OrderName in the table and then, if it is not present, inserts the row into the database?

        CREATE PROCEDURE [Test Procedure ]
        (
            @section varchar(70),
            @mark varchar(70),
            @qty decimal(18,2),
            @Weight decimal(18,2),
            @dateupdateremark int,
            @OrderName varchar(70)
        )
        AS
        BEGIN
            SET NOCOUNT ON;
            select case (@OrderName)
                when (select OrderName from dbo.tbl_insertxmldetails
                      where (@OrderName) not in (select OrderName from tbl_insertxmldetails))
                then
                    insert into dbo.tbl_insertxmldetails
                        (Section, Mark, QTY, Weight, Dateupdateremark, OrderName, SystemDate)
                    values
                        (@Section, @Mark, @QTY, @Weight, @Dateupdateremark, @OrderName, GETDATE())
                else 'File already Exists'
            end

    Read the article

  • Socket.io Duplicate clients on a namespace

    - by Servernumber
    Hi, I'm trying to use dynamic namespaces, creating them on demand. It's working, except that I get duplicate (or more) clients for some reason.

    Server side:

        io.of("/" + group).on("connection", function (socket_group) {
            socket_group.groupId = group;
            socket_group.on("infos", function () {
                console.log("on the group !");
            })
            socket_group.on('list', function () {
                Urls.find({'group': '...'}).sort({'_id': -1}).limit(10).exec(function (err, data) {
                    socket_group.emit('links', data);
                });
            })
            [...]
        })

    Client side:

        socket.emit('list', { ... });

    On the client side only one command is sent, but the server always responds with 2 or more responses. Every time I close/open my app the number of responses is incremented. Thanks if you can find out why.

    Read the article

  • SVN moving duplicate directory structures onto one another

    - by Nash0
    I have duplicate directory structures in two locations that I need to merge together in an SVN repository. By "merge" I mean I want all files and folders that are unique to structure b to be moved into structure a. When I try to do this using svn move I get the error:

        svn: Path 'com' already exists

    The folders look like this:

        src
         -> com  -> (many more files and directories)
         -> java -> com -> (some files and folders; some folders overlap, but all files are unique)

    src\com is structure a and src\java\com is structure b.

    Read the article

  • C#: Using regular expression (Regex) to duplicate a specific character in a string

    - by user3703944
    Anyone know how to use regex to duplicate a specific character in a string? I have a path that is entered like this: C:/Example/example. I would like to use regex (or any other method) to display it like this: C://Example//example. Is it possible? This is where I'm getting the file path:

        private void btnSearchImage_Click_1(object sender, EventArgs e)
        {
            OpenFileDialog ofd = new OpenFileDialog();
            ofd.Filter = "Image Files(*.jpg; *.jpeg; *.gif; *.bmp)|*.jpg; *.jpeg; *.gif; *.bmp";
            if (ofd.ShowDialog() == System.Windows.Forms.DialogResult.OK)
            {
                string filenName = ofd.FileName;
                pictureBox1.Image = new Bitmap(filenName);
                string path = filenName;
                txtimgPath.Text = path;
            }
        }

    Thanks

    Read the article

  • list boxes(A,B) with some duplicate values

    - by Dev
    I have two list boxes (A, B) with some duplicate values, and I can send values from A to B or B to A using a send button; I also have one save button. Initially, if I click the save button without sending anything between the list boxes, I show a message like "No changes or Done". But if I send an item from A to B and then send that same item back from B to A, that also means no changes were made, so here too I want to show the same "No changes or Done" message. However, I am unable to detect that state. Can anyone please give me code or tips for finding the default status of the list boxes in JavaScript? Thanks

    Read the article

  • Traversing ORM relationships returns duplicate results

    - by NKing253
    I have 4 tables -- store, catalog_galleries, catalog_images, and catalog_financials. When I traverse the relationship from store --> catalog_galleries --> catalog_images, in other words store.getCatalogGallery().getCatalogImages(), I get duplicate records. Does anyone know what could be the cause of this? Any suggestions on where to look? The store table has a OneToOne relationship with catalog_galleries, which in turn has a OneToMany relationship with catalog_images and an eager fetch type. The store table also has a OneToMany relationship with catalog_financials.
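    One common cause of this symptom (offered here as a hedged guess, not taken from the original post) is that several collections are eagerly join-fetched in the same query, so the joined rows multiply and each parent's List repeats its children. A minimal sketch of one mitigation, assuming standard JPA annotations and entity names modelled on the question, is to map the collection as a Set so the duplicates collapse:

        import java.util.HashSet;
        import java.util.Set;
        import javax.persistence.*;

        @Entity
        @Table(name = "catalog_galleries")
        public class CatalogGallery {

            @Id
            @GeneratedValue
            private Long id;

            // A Set collapses the duplicate child rows produced when
            // multiple collections are join-fetched in one query.
            @OneToMany(mappedBy = "gallery", fetch = FetchType.EAGER)
            private Set<CatalogImage> catalogImages = new HashSet<CatalogImage>();

            public Set<CatalogImage> getCatalogImages() {
                return catalogImages;
            }
        }

        @Entity
        @Table(name = "catalog_images")
        class CatalogImage {

            @Id
            @GeneratedValue
            private Long id;

            @ManyToOne
            private CatalogGallery gallery;
        }

    Alternatively, adding DISTINCT to the query (or a distinct-root-entity result transformer in Hibernate) achieves the same effect without changing the mapping.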

    Read the article

  • ASP.NET MVC2 Radio Button generates duplicate HTML id-s

    - by Dmitriy Nagirnyak
    Hi, it seems that the default ASP.NET MVC2 HTML helper generates duplicate HTML IDs when using code like this (EditorTemplates/UserType.ascx):

        <%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl<UserType>" %>
        <%: Html.RadioButton("", UserType.Primary, Model == UserType.Primary) %>
        <%: Html.RadioButton("", UserType.Standard, Model == UserType.Standard) %>
        <%: Html.RadioButton("", UserType.ReadOnly, Model == UserType.ReadOnly) %>

    The HTML it produces is:

        <input checked="checked" id="UserType" name="UserType" type="radio" value="Primary" />
        <input id="UserType" name="UserType" type="radio" value="Standard" />
        <input id="UserType" name="UserType" type="radio" value="ReadOnly" />

    That clearly shows the problem, so I must be misusing the helper or something. I can manually specify the id as an HTML attribute, but then I cannot guarantee it will be unique. So the question is: how do I make sure that the IDs generated by the RadioButton helper are unique for each value, while still preserving the conventions for generating those IDs (so nested models are respected)? Preferably without generating IDs manually. Thanks, Dmitriy

    Read the article

  • How to prevent duplicate records being inserted with SqlBulkCopy when there is no primary key

    - by kscott
    I receive a daily XML file that contains thousands of records, each being a business transaction that I need to store in an internal database for use in reporting and billing. I was under the impression that each day's file contained only unique records, but have discovered that my definition of unique is not exactly the same as the provider's. The current application that imports this data is a C# .NET 3.5 console application; it uses SqlBulkCopy into a MS SQL Server 2008 database table where the columns exactly match the structure of the XML records. Each record has just over 100 fields, and there is no natural key in the data -- or rather, the fields I can come up with that make sense as a composite key also end up having to allow nulls. Currently the table has several indexes, but no primary key. Basically the entire row needs to be unique: if one field is different, it is valid enough to be inserted. I looked at creating an MD5 hash of the entire row, inserting that into the database, and using a constraint to prevent SqlBulkCopy from inserting the row, but I don't see how to get the MD5 hash into the bulk copy operation, and I'm not sure whether the whole operation would fail and roll back if any one record failed, or whether it would continue. The file contains a very large number of records, so going row by row through the XML, querying the database for a record that matches all fields, and then deciding whether to insert is really the only way I can see of doing this. I was just hoping not to have to rewrite the application entirely, and the bulk copy operation is so much faster. Does anyone know of a way to use SqlBulkCopy while preventing duplicate rows, without a primary key? Or any suggestion for a different way to do this?

    Read the article

  • JPA merge fails due to duplicate key

    - by wobblycogs
    I have a simple entity, Code, that I need to persist to a MySQL database.

        public class Code implements Serializable {
            @Id
            private String key;
            private String description;
            // ...getters and setters...
        }

    The user supplies a file full of key/description pairs which I read, convert to Code objects, and then insert in a single transaction using em.merge(code). The file will generally have duplicate entries, which I deal with by first adding them to a map keyed on the key field as I read them in. A problem arises, though, when keys differ only by case (for example XYZ and XyZ). My map will, of course, contain both entries, but during the merge process MySQL sees the two keys as being the same and the call to merge fails with a MySQLIntegrityConstraintViolationException. I could easily fix this by uppercasing the keys as I read them in, but I'd like to understand exactly what is going wrong. The conclusion I have come to is that JPA considers XYZ and XyZ to be different keys but MySQL considers them to be the same. As such, when JPA checks its list of known keys (or does whatever it does to determine whether it needs to perform an insert or an update) it fails to find the previous insert and issues another, which then fails. Is this correct? Is there any way round this other than filtering the client data better? I haven't defined .equals or .hashCode on the Code class, so perhaps this is the problem.
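    A minimal sketch of the uppercasing workaround mentioned above (my own illustration, not code from the post; it assumes Code exposes the getter implied by "...getters and setters..."): normalizing the key before de-duplicating makes the Java-side map agree with MySQL's case-insensitive default collation, so merge never sees two entries that collide only by case.

        import java.util.LinkedHashMap;
        import java.util.List;
        import java.util.Map;

        public class CodeImporter {

            // Collapse keys that differ only by case before calling em.merge(),
            // mirroring MySQL's case-insensitive comparison of the primary key.
            public static Map<String, Code> dedupe(List<Code> parsedCodes) {
                Map<String, Code> byKey = new LinkedHashMap<String, Code>();
                for (Code code : parsedCodes) {
                    byKey.put(code.getKey().toUpperCase(), code);
                }
                return byKey;
            }
        }

    The alternative is to make the key column's comparison case-sensitive on the MySQL side (a binary collation), but that is a schema decision rather than a code fix.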

    Read the article

  • NHibernate returning duplicate object in child collections when using Fetch

    - by UpTheCreek
    When doing a query like this (using NHibernate 2.1.2):

        ICriteria criteria = session.CreateCriteria<MyRootType>()
            .SetFetchMode("ChildCollection1", FetchMode.Eager)
            .SetFetchMode("ChildCollection2", FetchMode.Eager)
            .Add(Restrictions.IdEq(id));

    I am getting multiple duplicate objects in some cartesian fashion. E.g. if ChildCollection1 has 3 elements and ChildCollection2 has 2 elements, then I get results with each element in ChildCollection1 duplicated and each element in ChildCollection2 triplicated! This was a bit of a WTF moment for me... So how do I do this correctly? Is using SetFetchMode like this only supported when specifying one collection? Am I just using it wrong? (I've seen some references to result transformers, but imagined this would be simpler.) Is this something that's different in NH3?

    Update: As per Felice's suggestion, I tried using the DistinctRootEntity transformer, but it is still returning duplicates. Code:

        ICriteria criteria = session.CreateCriteria<MyRootType>()
            .SetFetchMode("ChildCollection1", FetchMode.Eager)
            .SetFetchMode("ChildCollection2", FetchMode.Eager)
            .Add(Restrictions.IdEq(id));
        criteria.SetResultTransformer(Transformers.DistinctRootEntity);
        return criteria.UniqueResult<MyRootType>();

    Read the article

  • Duplicate method 'ProcessRequest' in ASPX

    - by Mauricio Scheffer
    I'm trying to code ASP.NET MVC views (WebForms view engine) in F#. I can already write regular ASP.NET WebForms ASPX and it works OK, e.g.

        <%@ Page Language="F#" %>
        <% for i in 1..2 do %>
        <%=sprintf "%d" i %>

    so I assume I have everything in my web.config correctly set up. However, when I make the page inherit from ViewPage:

        <%@ Page Language="F#" Inherits="System.Web.Mvc.ViewPage" %>

    I get this error:

        Compiler Error Message: FS0442: Duplicate method. The abstract method 'ProcessRequest' has the same name and signature as an abstract method in an inherited type.

    The problem seems to be this piece of code generated by the F# CodeDom provider:

        [<System.Diagnostics.DebuggerNonUserCodeAttribute>]
        abstract ProcessRequest : System.Web.HttpContext -> unit
        [<System.Diagnostics.DebuggerNonUserCodeAttribute>]
        default this.ProcessRequest (context:System.Web.HttpContext) =
            let mutable context = context
            base.ProcessRequest(context) |> ignore

    When I change the Page directive to use C# instead, the generated code is:

        [System.Diagnostics.DebuggerNonUserCodeAttribute()]
        public new virtual void ProcessRequest(System.Web.HttpContext context)
        {
            base.ProcessRequest(context);
        }

    which of course works fine and AFAIK is not semantically the same as the generated F# code. I'm using .NET 4.0.30319.1 (RTM) and MVC 2 RTM.

    Read the article

  • How can I remove duplicate nodes in XQuery?

    - by Brabster
    I have an XML document I generate on the fly, and I need a function to eliminate any duplicate nodes from it. My function looks like:

        declare function local:start2() {
            let $data := local:scan_books()
            return <books>{$data}</books>
        };

    Sample output is:

        <books>
            <book>
                <title>XML in 24 hours</title>
                <author>Some Guy</author>
            </book>
            <book>
                <title>XML in 24 hours</title>
                <author>Some Guy</author>
            </book>
        </books>

    I want just the one entry in my books root tag, and there are other tags, like say pamphlet, in there too that need to have duplicates removed. Any ideas? Updated following comments: by duplicate nodes I mean multiple occurrences of nodes that have the exact same content and structure; those extra occurrences should be removed.

    Read the article
