Search Results

Search found 4451 results on 179 pages for 'split brain'.

Page 150/179 | < Previous Page | 146 147 148 149 150 151 152 153 154 155 156 157  | Next Page >

  • C# string syntax error

    - by Mesa
    I'm reading in data from a file and trying to write only the word immediately before 'back' in red text. For some reason it is displaying the word and then the word again backwards. Please help. Thank you. private void Form1_Load(object sender, EventArgs e) { Regex r = new Regex(" "); StreamReader sr = new StreamReader("KeyLogger.txt"); string[] tokens = r.Split(sr.ReadToEnd()); int index = 0; for(int i = 0; i <= tokens.Length; i++) { if (tokens[i].Equals("back")) { //richTextBox1.Text+="TRUE"; richTextBox1.SelectionColor = Color.Red; string myText; if (tokens[i - 1].Equals("back")) myText = ""; else myText = tokens[i - 1]; richTextBox1.SelectedText = myText; richTextBox1.Text += myText; } else { //richTextBox1.Text += "NOOOO"; } //index++; //richTextBox1.Text += index; } }
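
    A hedged sketch of one possible fix (assuming the goal is to append the word before each "back" once, in red): the loop overruns the array with i <= tokens.Length, and the word shows up twice because it is appended via SelectionColor/SelectedText and then again via Text +=. Something like:

        // Minimal sketch, not the asker's final code; the KeyLogger.txt format is taken from the question.
        using System;
        using System.Drawing;
        using System.IO;
        using System.Text.RegularExpressions;
        using System.Windows.Forms;

        public static class BackWordHighlighter
        {
            public static void AppendWordsBeforeBack(RichTextBox box, string path)
            {
                string[] tokens = Regex.Split(File.ReadAllText(path), @"\s+");
                for (int i = 1; i < tokens.Length; i++)   // < not <=, and start at 1 so i - 1 is safe
                {
                    if (!tokens[i].Equals("back")) continue;
                    string previous = tokens[i - 1].Equals("back") ? "" : tokens[i - 1];

                    box.SelectionStart = box.TextLength;  // move the caret to the end
                    box.SelectionLength = 0;
                    box.SelectionColor = Color.Red;
                    box.AppendText(previous + " ");       // append once; do not also do Text += ...
                }
            }
        }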

    Read the article

  • How exactly do MbUnit's [Parallelizable] and DegreeOfParallelism work?

    - by BenA
    I thought I understood how MbUnit's parallel test execution worked, but the behaviour I'm seeing differs enough from my expectation that I suspect I'm missing something! I have a set of UI tests that I wish to run concurrently. All of the tests are in the same assembly, split across three different namespaces. All of the tests are completely independent of one another, so I'd like all of them to be eligible for parallel execution. To that end, I put the following in the AssemblyInfo.cs: [assembly: DegreeOfParallelism(8)] [assembly: Parallelizable(TestScope.All)] My understanding was that this combination of assembly attributes should cause all of the tests to be considered [Parallelizable], and that the test runner should use 8 threads during execution. My individual tests are marked with the [Test] attribute, and nothing else. None of them are data-driven. However, what I actually see is at most 5-6 threads being used, meaning that my test runs are taking longer than they should be. Am I missing something? Do I need to do anything else to ensure that all of my 8 threads are being used by the runner? N.B. The behaviour is the same irrespective of which runner I use. The GUI, command line and TD.Net runners all behave the same as described above, again leading me to think I've missed something. EDIT: As pointed out in the comments, I'm running v3.1 of MbUnit (update 2 build 397). The documentation suggests that the assembly-level [Parallelizable] attribute is available, but it does also seem to reference v3.2 of the framework despite that not yet being available. EDIT 2: To further clarify, the structure of my assembly is as follows:

        assembly
          namespace
            fixture
              tests (each carrying only the [Test] attribute)
            fixture
              tests (each carrying only the [Test] attribute)
          namespace
            fixture
              tests (each carrying only the [Test] attribute)
            fixture
              tests (each carrying only the [Test] attribute)
          namespace
            fixture
              tests (each carrying only the [Test] attribute)
            fixture
              tests (each carrying only the [Test] attribute)
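
    A hedged workaround worth trying (sketch only; I have not verified how the 3.1 build treats the assembly-level attribute): mark the fixtures and tests [Parallelizable] explicitly and keep the assembly-level DegreeOfParallelism, e.g.:

        // Illustrative only -- fixture/test names are made up; the attributes are the ones the question
        // already uses, plus [TestFixture].
        using MbUnit.Framework;

        [assembly: DegreeOfParallelism(8)]

        [TestFixture]
        [Parallelizable]
        public class UiSmokeTests
        {
            [Test, Parallelizable]
            public void LoginPageLoads() { /* ... */ }

            [Test, Parallelizable]
            public void SearchReturnsResults() { /* ... */ }
        }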

    Read the article

  • An online PHP debugger/code editor

    - by Zirak
    It's a simple deal: I'm sometimes in places where I don't have my laptop, and find myself with spare time and an idea for a project. But unfortunately, I can't do anything about it. I tried a variety of solutions, which include running IDEs (like phpstorm or Aptana) on a disc-on-key or cd (very slow and unappealing), trying several online solutions (like http://phpanywhere.net) and found that all of them are either buggy, overloaded or underloaded with features, just difficult to use, require FTP etc etc. All that is required here is syntax highlighting and debugging alerts; no actual running of code. So the question is split into two: 1) Do you know of a good online PHP editor that you've used and enjoyed? 2) If not, how would you go about making one? The second one seems a bit general, so I'll try and expand... It might be a good idea; if you can't find one, make one. The question is about the concept of making a syntax highlighter (shouldn't be too difficult), and the difficult part of catching PHP errors WITHOUT executing any PHP code. Thank you in advance.
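
    On the "catching PHP errors WITHOUT executing any PHP code" part, a minimal server-side sketch (assuming PHP is available on the host): php -l only parses the file, it never runs it.

        <?php
        // Minimal sketch: lint user-submitted code with `php -l` (parse only, no execution).
        $code = isset($_POST['code']) ? $_POST['code'] : '';
        $tmp  = tempnam(sys_get_temp_dir(), 'lint_');
        file_put_contents($tmp, $code);

        exec('php -l ' . escapeshellarg($tmp) . ' 2>&1', $output, $status);
        unlink($tmp);

        header('Content-Type: application/json');
        echo json_encode(array(
            'ok'       => $status === 0,   // 0 means "No syntax errors detected"
            'messages' => $output,         // parse errors with line numbers, if any
        ));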

    Read the article

  • Moving to an arbitrary position in a file in Python

    - by B Rivera
    Let's say that I routinely have to work with files with an unknown, but large, number of lines. Each line contains a set of integers (space, comma, semicolon, or some non-numeric character is the delimiter) in the closed interval [0, R], where R can be arbitrarily large. The number of integers on each line can be variable. Often times I get the same number of integers on each line, but occasionally I have lines with unequal sets of numbers. Suppose I want to go to Nth line in the file and retrieve the Kth number on that line (and assume that the inputs N and K are valid --- that is, I am not worried about bad inputs). How do I go about doing this efficiently in Python 3.1.2 for Windows? I do not want to traverse the file line by line. I tried using mmap, but while poking around here on SO, I learned that that's probably not the best solution on a 32-bit build because of the 4GB limit. And in truth, I couldn't really figure out how to simply move N lines away from my current position. If I can at least just "jump" to the Nth line then I can use .split() and grab the Kth integer that way. The nuance here is that I don't just need to grab one line from the file. I will need to grab several lines: they are not necessarily all near each other, the order in which I get them matters, and the order is not always based on some deterministic function. Any ideas? I hope this is enough information. Thanks!
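
    One common approach, sketched below (an assumption on my part, not the only answer): make a single pass to record the byte offset where every line starts, then seek() straight to line N for each request. Binary mode keeps the offsets honest on Windows.

        import re

        def build_line_index(path):
            """One pass over the file, recording the byte offset of each line start."""
            offsets = []
            with open(path, 'rb') as f:
                pos = 0
                for line in f:
                    offsets.append(pos)
                    pos += len(line)
            return offsets

        def get_number(path, offsets, n, k):
            """Return the K-th integer (1-based) on the N-th line (1-based)."""
            with open(path, 'rb') as f:
                f.seek(offsets[n - 1])
                line = f.readline().decode('ascii')
            tokens = [t for t in re.split(r'[^0-9]+', line) if t]
            return int(tokens[k - 1])

        # usage sketch:
        # offsets = build_line_index('data.txt')
        # value = get_number('data.txt', offsets, 12345, 3)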

    Read the article

  • How to define a batching routing service in WCF

    - by mattx
    I have designed a custom silverlight wcf channel that I want to leverage to selectively batch calls from the client to the server and possibly to cache on the client and short circuit calls. So far I'm just using this channel as a transport and sending the generic WCF messages that result to the WCF router service example here http://msdn.microsoft.com/en-us/magazine/cc500646.aspx?pr=blog to prototype this on the server side. So my scenario looks like this: IFooClient-MyTransportChannel-IRouterService-IFooService-Return I now need to be able to send more than one message per call through the router and carve them up and service them on the server side. Since this is just an experiment and I'm taking baby steps I will dispatch and service all the messages right away on the server side and return the batch of results. Immediately I noticed that simply making the router interface take Message[] instead of Message doesn't work due to serialization problems. I guess this makes sense. I'm not sure soap envelopes can contain other soap envelopes etc. Is there a simple way to take a collection of WCF Message objects and send them to a single method on a service where they can be split up and forwarded as appropriate? If not I'd love suggestions on how I should approach this. I want to have minimal work to do on the router service side so the goal should be to get as close to being able to "slice and forward" as possible.
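
    One way to sidestep the Message[] serialization problem (a sketch under the assumption that each batched call can travel as an opaque payload): define an explicit batch contract whose items carry the serialized per-call bodies, and let the router split it and forward each item to IFooService.

        // Hypothetical batch contract -- the names are illustrative, not from the MSDN router sample.
        using System.Collections.Generic;
        using System.Runtime.Serialization;
        using System.ServiceModel;

        [DataContract]
        public class BatchItem
        {
            [DataMember] public string Action { get; set; }   // which IFooService operation to invoke
            [DataMember] public string Body { get; set; }     // serialized request for that operation
        }

        [DataContract]
        public class BatchRequest
        {
            [DataMember] public List<BatchItem> Items { get; set; }
        }

        [ServiceContract]
        public interface IBatchRouter
        {
            [OperationContract]
            List<string> Submit(BatchRequest batch);   // one serialized result per item, same order
        }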

    Read the article

  • OO model for NSXMLParser when the delegate is not self

    - by richard
    Hi, I am struggling with the correct design for the delegates of NSXMLParser. In order to build my table of Foos, I need to make two types of webservice calls; one for the whole table and one for each row. It's essentially a master-query then detail-query, except the master-query-result-xml doesn't return enough information so I then need to query the detail for each row. I'm not dealing with enormous amounts of data. Anyway - previously I've just used NSXMLParser *parser = [[NSXMLParser alloc]init]; [parser setDelegate:self]; [parser parse]; and implemented all the appropriate delegate methods in whatever class I'm in. In an attempt at cleanliness, I've now created two separate delegate classes and done something like: NSXMLParser *xp = [[NSXMLParser alloc]init]; MyMasterXMLParserDelegate *masterParserDelegate = [[MyMasterXMLParserDelegate alloc] init]; [xp setDelegate:masterParserDelegate]; [xp parse]; In addition to being cleaner (in my opinion, at least), it also means each of the -parser:didStartElement implementations don't spend most of the time trying to figure out which xml they're parsing. So now the real crux of the problem. Before I split out the delegates, I had, in the main class that was also implementing the delegate methods, a class-level NSMutableArray that I would just put my objects-created-from-xml in when -parser:didEndElement found the 'end' of each record. Now that the delegates are in separate classes, I can't figure out how to have the -parser:didEndElement in the 'detail' delegate class "return" the created object to the calling class. At least, not in a clean OO way. I'm sure I could do it with all sorts of nasty class methods. Does the question make sense? Thanks.
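
    A common pattern for handing the parsed objects back out of a separate delegate class (a sketch; the protocol and class names are invented for illustration): give the detail delegate its own "finished" callback and fire it in parserDidEndDocument:.

        // Illustrative only -- row building is elided; property attributes assume ARC.
        @protocol MyDetailParserCompletionDelegate <NSObject>
        - (void)detailParserFinished:(NSArray *)parsedRows;
        @end

        @interface MyDetailXMLParserDelegate : NSObject <NSXMLParserDelegate>
        @property (nonatomic, weak) id<MyDetailParserCompletionDelegate> delegate;
        @property (nonatomic, strong) NSMutableArray *rows;
        @end

        @implementation MyDetailXMLParserDelegate
        - (void)parser:(NSXMLParser *)parser didEndElement:(NSString *)elementName
          namespaceURI:(NSString *)namespaceURI qualifiedName:(NSString *)qName
        {
            // build the row object for each record here and [self.rows addObject:row];
        }

        - (void)parserDidEndDocument:(NSXMLParser *)parser
        {
            [self.delegate detailParserFinished:self.rows];   // hand the results back to whoever owns us
        }
        @end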

    Read the article

  • How can my Facebook application post a message to a wall?

    - by Thomas Dekiere
    I already found out how to post something to a wall with the Graph API on behalf of the Facebook user. But now I want to post something in the name of my application. Here is how I'm trying to do this: protected void btn_submit_Click(object sender, EventArgs e) { Dictionary<string, string> data = new Dictionary<string, string>(); data.Add("message", "Testing"); // I'll add more data later here (picture, link, ...) data.Add("access_token", FbGraphApi.getAppToken()); FbGraphApi.postOnWall(ConfigSettings.getFbPageId(), data); } FbGraphApi.getAppToken() // ... private static string graphUrl = "https://graph.facebook.com"; //... public static string getAppToken() { MyWebRequest req = new MyWebRequest(graphUrl + "/" + "oauth/access_token?type=client_cred&client_id=" + ConfigSettings.getAppID() + "&client_secret=" + ConfigSettings.getAppSecret(), "GET"); return req.GetResponse().Split('=')[1]; } FbGraphApi.postOnWall() public static void postOnWall(string id, Dictionary<string,string> args) { call(id, "feed", args); } FbGraphApi.call() private static void call(string id, string method, Dictionary<string,string> args ) { string data = ""; foreach (KeyValuePair<string, string> arg in args) { data += arg.Key + "=" + arg.Value + "&"; } MyWebRequest req = new MyWebRequest(graphUrl +"/" + id + "/" + method, "POST", data.Substring(0, data.Length - 1)); req.GetResponse(); // here I get: "The remote server returned an error: (403) Forbidden." } Does anyone see where this is going wrong? I'm really stuck on this. Thanks!

    Read the article

  • PHP regex for password validation

    - by Fabio Anselmo
    I'm not getting the desired effect from a script. I want the password to contain A-Z, a-z, 0-9, and special chars: A-Z; a-z; 0-9 x2; special chars x2; string length = 8. So I want to force the user to use at least 2 digits and at least 2 special chars. OK, my script works but forces me to use the digits or chars back to back. I don't want that. e.g. the password testABC55$$ is valid - but I don't want that. Instead I want test$ABC5#8 to be valid. So basically the digits/special char can be the same or diff - but must be split up in the string. PHP CODE: $uppercase = preg_match('#[A-Z]#', $password); $lowercase = preg_match('#[a-z]#', $password); $number = preg_match('#[0-9]#', $password); $special = preg_match('#[\W]{2,}#', $password); $length = strlen($password) >= 8; if(!$uppercase || !$lowercase || !$number || !$special || !$length) { $errorpw = 'Bad Password';
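
    A sketch of one way to require "at least 2 digits and at least 2 special characters anywhere in the string": count matches with preg_match_all instead of using {2,}, which only matches consecutive characters.

        <?php
        // Minimal sketch; $password comes from the existing validation code above.
        $uppercase = preg_match('#[A-Z]#', $password);
        $lowercase = preg_match('#[a-z]#', $password);
        $digits    = preg_match_all('#[0-9]#', $password, $m1) >= 2;        // 2+ digits, anywhere
        $special   = preg_match_all('#[^A-Za-z0-9]#', $password, $m2) >= 2; // 2+ specials, anywhere
        $length    = strlen($password) >= 8;

        if (!$uppercase || !$lowercase || !$digits || !$special || !$length) {
            $errorpw = 'Bad Password';
        }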

    Read the article

  • Run command with space characters in bash script

    - by ??iu
    I have a file that contains a list of files: 02 of Clubs.eps 02 of Diamonds.eps 02 of Hearts.eps 02 of Spades.eps ... I am attempting to mass-convert these to png format in several sizes. The script I am using to do this is: while read -r line do for i in 80 35 200 do convert $(sed 's/ /\\ /g' <<< Cards/${line}) -size ${i}x${i} ../img/card/$(basename $(tr ' ' '_' <<< ${line} | tr '[A-Z]' '[a-z]') .eps)_${i}.png; done done < card_list.txt However, this doesn't work, apparently trying to split on each word, resulting in the following error output: convert: unable to open image `Cards/02\': No such file or directory @ error/blob.c/OpenBlob/2514. convert: no decode delegate for this image format `Cards/02\' @ error/constitute.c/ReadImage/532. convert: unable to open image `of\': No such file or directory @ error/blob.c/OpenBlob/2514. convert: no decode delegate for this image format `of\' @ error/constitute.c/ReadImage/532. convert: unable to open image `Clubs.eps': No such file or directory @ error/blob.c/OpenBlob/2514. If I change the convert to an echo the result looks right and if I copy a line and run it myself in the shell it works fine: convert Cards/02\ of\ Clubs.eps -size 80x80 ../img/card/02_of_clubs_80.png convert Cards/02\ of\ Clubs.eps -size 35x35 ../img/card/02_of_clubs_35.png convert Cards/02\ of\ Clubs.eps -size 200x200 ../img/card/02_of_clubs_200.png convert Cards/02\ of\ Diamonds.eps -size 80x80 ../img/card/02_of_diamonds_80.png convert Cards/02\ of\ Diamonds.eps -size 35x35 ../img/card/02_of_diamonds_35.png convert Cards/02\ of\ Diamonds.eps -size 200x200 ../img/card/02_of_diamonds_200.png convert Cards/02\ of\ Hearts.eps -size 80x80 ../img/card/02_of_hearts_80.png convert Cards/02\ of\ Hearts.eps -size 35x35 ../img/card/02_of_hearts_35.png convert Cards/02\ of\ Hearts.eps -size 200x200 ../img/card/02_of_hearts_200.png convert Cards/02\ of\ Spades.eps -size 80x80 ../img/card/02_of_spades_80.png UPDATE: Just adding quotes (see below) has the same result as the above, where I had been using sed to add backslashes convert '"'Cards/${line}'"' -size ${i}x${i} ../img/card/$(basename $(tr ' ' '_' <<< ${line} | tr '[A-Z]' '[a-z]') .eps)_${i}.png; I've tried both double and single quotes
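
    For what it's worth, a sketch of the usual fix: the splitting happens because the expansion is unquoted, so plain double quotes around the expansions themselves (not literal quote characters spliced in with '"'...'"') should be enough. Note that -resize is what actually rescales; -size is only a hint.

        #!/bin/bash
        # Sketch: quote the expansions so filenames with spaces stay single arguments.
        while IFS= read -r line; do
            base=$(basename "$line" .eps | tr ' ' '_' | tr '[:upper:]' '[:lower:]')
            for i in 80 35 200; do
                convert "Cards/${line}" -resize "${i}x${i}" "../img/card/${base}_${i}.png"
            done
        done < card_list.txt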

    Read the article

  • Replacing a word in a text file with a value using python

    - by Jamde Jam
    I have been trying to replace a word in a text file with a value (say 1), but my outfile is blank. I am new to Python (it's only been a month since I started learning it). My file is relatively large, but I just want to replace a word with the value 1 for now. Here is a segment of what the file looks like: NAME SECOND_1 ATOM 1 6 0 0 0 # ORB 1 ATOM 2 2 0 12/24 0 # ORB 2 ATOM 3 2 12/24 0 0 # ORB 2 ATOM 4 2 0 0 4/24 # ORB 3 ATOM 5 2 0 0 20/24 # ORB 3 ATOM 6 2 0 0 8/24 # ORB 3 ATOM 7 2 0 0 16/24 # ORB 3 ATOM 8 6 0 0 12/24 # ORB 1 ATOM 9 2 12/24 0 12/24 # ORB 2 ATOM 10 2 0 12/24 12/24 # ORB 2 #1 #2 #3 I want to first replace the word ATOM with the value 1. Next I want to replace #ORB with a space. Here is what I am trying thus far. input = open('SECOND_orbitsJ22.txt','r') output=open('SECOND_orbitsJ22_out.txt','w') for line in input: word=line.split(',') if(word[0]=='ATOM'): word[0]='1' output.write(','.join(word)) Can anyone offer any suggestions or help? Thanks so much.
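
    A sketch of the two likely problems and a fix (hedged; the file names are the ones from the question): the file is whitespace-delimited, so line.split(',') never yields 'ATOM' as the first token, and non-matching lines are never written; the output file is also never closed, so nothing gets flushed to disk.

        # Minimal sketch: split on whitespace, write every line, let `with` close both files.
        with open('SECOND_orbitsJ22.txt') as infile, \
             open('SECOND_orbitsJ22_out.txt', 'w') as outfile:
            for line in infile:
                fields = line.split()                  # whitespace, not ','
                if fields and fields[0] == 'ATOM':
                    fields[0] = '1'
                    line = ' '.join(fields) + '\n'
                line = line.replace('# ORB', ' ')      # second requirement: replace "# ORB" with a space
                outfile.write(line)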

    Read the article

  • Search a string in a file and write the matched lines to another file in Java

    - by Geeta
    For searching a string in a file and writing the lines with the matched string to another file, it takes 15 - 20 mins for a single zip file of 70MB (compressed). Are there any ways to minimise it? My source code: getting zip file entries zipFile = new ZipFile(source_file_name); entries = zipFile.entries(); while (entries.hasMoreElements()) { ZipEntry entry = (ZipEntry)entries.nextElement(); if (entry.isDirectory()) { continue; } searchString(Thread.currentThread(),entry.getName(), new BufferedInputStream (zipFile.getInputStream(entry)), Out_File, search_string, stats); } zipFile.close(); Searching String public void searchString(Thread CThread, String Source_File, BufferedInputStream in, File outfile, String search, String stats) throws IOException { int count = 0; int countw = 0; int countl = 0; String s; String[] str; BufferedReader br2 = new BufferedReader(new InputStreamReader(in)); System.out.println(CThread.currentThread()); while ((s = br2.readLine()) != null) { str = s.split(search); count = str.length - 1; countw += count; //word count if (s.contains(search)) { countl++; //line count WriteFile(CThread,s, outfile.toString(), search); } } br2.close(); in.close(); } -------------------------------------------------------------------------------- public void WriteFile(Thread CThread,String line, String out, String search) throws IOException { BufferedWriter bufferedWriter = null; System.out.println("writre thread"+CThread.currentThread()); bufferedWriter = new BufferedWriter(new FileWriter(out, true)); bufferedWriter.write(line); bufferedWriter.newLine(); bufferedWriter.flush(); } Please help me. It's really taking 40 mins for 10 files using threads and 15 - 20 mins for a single file of 70MB after being compressed. Any way to minimise the time?
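
    A likely bottleneck, sketched below (hedged: try-with-resources is used for brevity; imports assumed from java.io): WriteFile opens a new FileWriter and flushes for every matched line, so each match pays for an open/flush/close of the output file. Opening one writer per search and closing it at the end removes most of that cost.

        // Sketch: one writer for the whole entry, no per-line open/flush.
        // imports assumed: java.io.BufferedInputStream, BufferedReader, BufferedWriter,
        //                  File, FileWriter, IOException, InputStreamReader
        public void searchString(String sourceName, BufferedInputStream in, File outFile,
                                 String search) throws IOException {
            try (BufferedReader reader = new BufferedReader(new InputStreamReader(in));
                 BufferedWriter writer = new BufferedWriter(new FileWriter(outFile, true))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    if (line.contains(search)) {
                        writer.write(line);
                        writer.newLine();
                    }
                }
            }   // both streams are closed (and flushed) exactly once here
        }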

    Read the article

  • Java: which configuration framework to use?

    - by Laimoncijus
    Hi, I need to decide which configuration framework to use. At the moment I am deciding between properties files and XML files. My configuration needs to have some primitive grouping, e.g. in XML format it would be something like: <configuration> <group name="abc"> <param1>value1</param1> <param2>value2</param2> </group> <group name="def"> <param3>value3</param3> <param4>value4</param4> </group> </configuration> or a properties file (something similar to log4j.properties): group.abc.param1 = value1 group.abc.param2 = value2 group.def.param3 = value3 group.def.param4 = value4 I need a bi-directional (read and write) configuration library/framework. A nice feature would be being able to read different configuration groups out as different objects, so I could later pass them to different places, e.g. reading everything that belongs to group "abc" as one object and "def" as another. If that is not possible I can always split a single configuration object into smaller ones myself in the application initialization part, of course. Which framework would fit me best?
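
    For the shape described above, a sketch using Apache Commons Configuration (hedged: method names as I recall them from the 1.x line; worth checking against whichever version is picked). The subset() call gives exactly the "one group as its own object" behaviour.

        // Illustrative sketch, not a claim that this is the one right framework.
        import org.apache.commons.configuration.Configuration;
        import org.apache.commons.configuration.PropertiesConfiguration;

        public class ConfigExample {
            public static void main(String[] args) throws Exception {
                PropertiesConfiguration config = new PropertiesConfiguration("app.properties");

                // read
                String p1 = config.getString("group.abc.param1");

                // a whole group as its own object, to pass around
                Configuration abc = config.subset("group.abc");
                String p2 = abc.getString("param2");

                // write
                config.setProperty("group.def.param3", "newValue3");
                config.save();

                System.out.println(p1 + " / " + p2);
            }
        }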

    Read the article

  • Word wrap in multiline textbox after 35 characters

    - by Kanavi
    <asp:TextBox CssClass="txt" ID="TextBox1" runat="server" onkeyup="CountChars(this);" Rows="20" Columns="35" TextMode="MultiLine" Wrap="true"> </asp:TextBox> I need to implement word-wrapping in a multi-line textbox. I cannot allow users to write more than 35 chars a line. I am using the following code, which breaks at precisely the specified character on every line, cutting words in half. Can we fix this so that if there's not enough space left for a word on the current line, we move the whole word to the next line? function CountChars(ID) { var IntermediateText = ''; var FinalText = ''; var SubText = ''; var text = document.getElementById(ID.id).value; var lines = text.split("\n"); for (var i = 0; i < lines.length; i++) { IntermediateText = lines[i]; if (IntermediateText.length <= 50) { if (lines.length - 1 == i) FinalText += IntermediateText; else FinalText += IntermediateText + "\n"; } else { while (IntermediateText.length > 50) { SubText = IntermediateText.substring(0, 50); FinalText += SubText + "\n"; IntermediateText = IntermediateText.replace(SubText, ''); } if (IntermediateText != '') { if (lines.length - 1 == i) FinalText += IntermediateText; else FinalText += IntermediateText + "\n"; } } } document.getElementById(ID.id).value = FinalText; $('#' + ID.id).scrollTop($('#' + ID.id)[0].scrollHeight); } Edit 1: I have to show at most 35 characters per line without breaking words, and I need to keep a margin of two characters from the right. Again, the restriction should be 35 characters, but leave space for a total of 37 (just for visibility).
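
    A word-aware sketch (hedged: the limit is a parameter, so 35 vs. 37 is up to the caller; note the code above hard-codes 50 while the prose says 35): rebuild each line word by word, starting a new line whenever the next word would overflow, and only hard-break words that are longer than the limit on their own.

        // Minimal sketch: wrap on word boundaries at `limit` characters per line.
        function wrapWords(text, limit) {
            var out = [];
            var lines = text.split("\n");
            for (var i = 0; i < lines.length; i++) {
                var current = "";
                var words = lines[i].split(" ");
                for (var j = 0; j < words.length; j++) {
                    var word = words[j];
                    if (word.length > limit) {                 // a single word longer than the limit
                        if (current !== "") { out.push(current); current = ""; }
                        while (word.length > limit) {
                            out.push(word.substring(0, limit));
                            word = word.substring(limit);
                        }
                    }
                    if (current === "") {
                        current = word;
                    } else if ((current + " " + word).length <= limit) {
                        current += " " + word;
                    } else {
                        out.push(current);
                        current = word;
                    }
                }
                out.push(current);
            }
            return out.join("\n");
        }

        // usage: textBox.value = wrapWords(textBox.value, 35);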

    Read the article

  • SQL Databases and table design/organization

    - by John McMullen
    (NOOB disclaimer) I'm working on a system (a type of map) that is accessed mostly via 3 fields: ID (auto incremented), X coordinate, and Y coordinate. As it is right now, I have all data on the map stored in 1 table. Whenever the map display is loaded it simply queries the database for contents in x and y, and the DB gives the data (other fields in the same entry). If an item on the map is doing something, it has a flag saying it's doing something, and then has an ID of the action in another table holding that type of 'actions'. Essentially, all map data is stored in 1 table. All actions of a certain type are stored in their own table. I'm a noob, and I'm wondering what the most effective/efficient structure for such a design is (a map that has items, and each item has stats/actions). I'm using PHP atm, using standard SQL queries to get my data. Should I split up the tables so that there are only x number of entries in a table (coord range limits)? Should it just keep growing and growing? There are a lot of queries to the table... so I'm just trying to see what is best :/
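
    If the common query really is "give me everything inside this x/y window", the first cheap win is usually a composite index rather than splitting the table by coordinate range. A sketch (table and column names are assumptions based on the description, MySQL-flavoured):

        -- Hypothetical schema; id stays the primary key, map lookups go through (x, y).
        CREATE TABLE map_item (
            id        INT AUTO_INCREMENT PRIMARY KEY,
            x         INT NOT NULL,
            y         INT NOT NULL,
            action_id INT NULL              -- points into the per-action-type table when set
        );

        CREATE INDEX idx_map_item_xy ON map_item (x, y);

        -- The viewport query the index is there to serve:
        SELECT * FROM map_item
        WHERE x BETWEEN 10 AND 20
          AND y BETWEEN 30 AND 40;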

    Read the article

  • How to change value inside a JSON string.

    - by Jeremy Roy
    I have a JSON string array of objects like this. [{"id":"4","rank":"adm","title":"title 1"}, {"id":"2","rank":"mod","title":"title 2"}, {"id":"5","rank":"das","title":"title 3"}, {"id":"1","rank":"usr","title":"title 4"}, {"id":"3","rank":"ref","title":"title 5"}] I want to change the title value of it, once the id is matching. So if my variable myID is 5, I want to change the title "title 5" to new title, and so on. And then I get the new JSON array to $("#rangArray").val(jsonStr); Something like $.each(jsonStr, function(k,v) { if (v==myID) { this.title='new title'; $("#myTextArea").val(jsonStr); } }); Here is the full code. $('img.delete').click(function() { var deltid = $(this).attr("id").split('_'); var newID = deltid[1]; var jsonStr = JSON.stringify(myArray); $.each(jsonStr, function(k,v) { if (v==newID) { // how to change the title jsonStr[k].title = 'new title'; alert(jsonStr); $("#rangArray").val(jsonStr); } }); }); The above is not working. Any help please?
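
    A sketch of the likely fix: JSON.stringify turns the array into a plain string, so $.each then walks characters instead of objects. Iterate the array itself, update the matching object, and stringify only when writing the result back:

        // Minimal sketch: myArray is the array of objects shown above.
        $('img.delete').click(function () {
            var newID = $(this).attr('id').split('_')[1];

            $.each(myArray, function (k, v) {
                if (v.id == newID) {        // ids are strings in the JSON, so == rather than ===
                    v.title = 'new title';
                }
            });

            $('#rangArray').val(JSON.stringify(myArray));   // serialize once, after the change
        });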

    Read the article

  • one bidirectional tcp socket OR two unidirectional? (linux, high volume, low latency)

    - by osgx
    Hello, I need to send (interchange) a high volume of data periodically with the lowest possible latency between 2 machines. The network is rather fast (e.g. 1Gbit or even 2G+). OS is Linux. Is it faster to use 1 TCP socket (for send and recv) or 2 unidirectional TCP sockets? The test for this task is very like the NetPIPE network benchmark - measure latency and bandwidth for sizes from 2^1 up to 2^13 bytes, each size sent and received 3 times at least (in the real task the number of sends is greater; both processes will be sending and receiving, like ping-pong maybe). The benefit of 2 unidirectional connections comes from the Linux source: http://lxr.linux.no/linux+v2.6.18/net/ipv4/tcp_input.c#L3847 3847/* 3848 * TCP receive function for the ESTABLISHED state. 3849 * 3850 * It is split into a fast path and a slow path. The fast path is 3851 * disabled when: ... 3859 * - Data is sent in both directions. Fast path only supports pure senders 3860 * or pure receivers (this means either the sequence number or the ack 3861 * value must stay constant) ... 3863 * 3864 * When these conditions are not satisfied it drops into a standard 3865 * receive procedure patterned after RFC793 to handle all cases. 3866 * The first three cases are guaranteed by proper pred_flags setting, 3867 * the rest is checked inline. Fast processing is turned on in 3868 * tcp_data_queue when everything is OK. All the other conditions for disabling the fast path are false, and only a non-unidirectional socket stops the kernel from taking the fast path on receive.

    Read the article

  • Client-server synchronization pattern / algorithm?

    - by tm_lv
    I have a feeling that there must be client-server synchronization patterns out there, but I totally failed to google one up. The situation is quite simple - the server is the central node that multiple clients connect to and manipulate the same data. Data can be split into atoms; in case of conflict, whatever is on the server has priority (to avoid dragging the user into conflict resolution). Partial synchronization is preferred due to potentially large amounts of data. Are there any patterns / good practices for such a situation, or, if you don't know of any, what would be your approach? Below is how I now think to solve it: parallel to the data, a modification journal will be held, with all transactions timestamped. When a client connects, it receives all changes since its last check, in consolidated form (the server goes through the lists and removes additions that are followed by deletions, merges updates for each atom, etc.). Et voila, we are up to date. An alternative would be keeping a modification date for each record and, instead of performing deletes, just marking records as deleted. Any thoughts?
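
    A sketch of the journal idea in concrete form (the table and column names are invented for illustration): every mutation gets a row, and a client pull is just "everything after the last revision I saw", which the server can consolidate before sending.

        -- Hypothetical modification journal for the approach described above.
        CREATE TABLE change_journal (
            revision    BIGINT AUTO_INCREMENT PRIMARY KEY,   -- monotonic, safer than wall-clock timestamps
            atom_id     BIGINT NOT NULL,
            change_type CHAR(1) NOT NULL,                    -- 'I'nsert, 'U'pdate, 'D'elete
            changed_at  TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
        );

        -- Client pull: everything since the last revision it acknowledged.
        SELECT atom_id, change_type
        FROM change_journal
        WHERE revision > :last_seen_revision
        ORDER BY revision;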

    Read the article

  • Copy recordset data into multiple sheets to avoid problem of maximum rows limit in Excel VBA

    - by Sam
    I am developing a reporting application in Excel/VBA 2003. The VBA code sends a search query to the database and gets the data through a recordset, which is then copied to an Excel sheet. The retrieved data looks like this:

        ProductID | DateProcessed | State
        1         | 1/1/2010      | Picked Up
        1         | 1/1/2010      | Forward To Approver
        1         | 1/2/2010      | Approver Picked Up
        1         | 1/3/2010      | Approval Completed
        2         | 1/1/2010      | Picked Up
        3         | 1/2/2010      | Picked Up
        3         | 1/2/2010      | Forward To Approver

    The problem is that the data retrieved by the search query is so large that it goes above the Excel row limit (65536 rows in Excel 2003). So I want to split this data into two Excel sheets. While splitting the data I want to ensure that the data for the same product remains in one sheet. For example, if the last record in the above result set is the 65537th record, then I also want to move all records for product 3 into the new sheet. So sheet 1 will contain records for product ids 1 and 2 with total records = 65534. Sheet 2 will contain records for product id 3 - with total records = 2. How can I achieve this in VBA? If it is not possible, is there any alternative solution? Thanks in advance!
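
    A sketch of one way to do the split (hedged: sheet handling and field names are assumptions, and it assumes the recordset is ordered by ProductID and that no single product exceeds the row limit by itself): buffer one product at a time, and if it will not fit on the current sheet, flush it to a fresh sheet, so a product never straddles two sheets.

        ' Minimal sketch -- assumes an open ADODB.Recordset rs with ProductID, DateProcessed, State.
        Sub CopyRecordsetSplitByProduct(rs As Object)
            Const MAX_ROWS As Long = 65536
            Dim ws As Worksheet
            Dim buffer As Collection
            Dim nextRow As Long
            Dim currentProduct As Variant

            Set ws = Worksheets.Add
            nextRow = 1
            Set buffer = New Collection

            Do While Not rs.EOF
                If buffer.Count > 0 And rs.Fields("ProductID").Value <> currentProduct Then
                    FlushProduct buffer, ws, nextRow, MAX_ROWS    ' product changed: write it out
                    Set buffer = New Collection
                End If
                currentProduct = rs.Fields("ProductID").Value
                buffer.Add Array(rs.Fields("ProductID").Value, _
                                 rs.Fields("DateProcessed").Value, _
                                 rs.Fields("State").Value)
                rs.MoveNext
            Loop
            If buffer.Count > 0 Then FlushProduct buffer, ws, nextRow, MAX_ROWS
        End Sub

        Private Sub FlushProduct(buffer As Collection, ws As Worksheet, nextRow As Long, ByVal maxRows As Long)
            Dim rec As Variant, c As Long
            If nextRow + buffer.Count - 1 > maxRows Then   ' will not fit: start a new sheet
                Set ws = Worksheets.Add
                nextRow = 1
            End If
            For Each rec In buffer
                For c = 0 To 2
                    ws.Cells(nextRow, c + 1).Value = rec(c)
                Next c
                nextRow = nextRow + 1
            Next rec
        End Sub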

    Read the article

  • jQuery autocomplete problem

    - by heffaklump
    I'm using jQuery's Autocomplete plugin, but it doesn't autocomplete upon entering anything. Any ideas why it doesn't work? The basic example works, but not mine. var ppl = {"ppl":[{"name":"peterpeter", "work":"student"}, {"name":"piotr","work":"student"}]}; var options = { matchContains: true, // So we can search inside string too minChars: 2, // this sets autocomplete to begin from X characters dataType: 'json', parse: function(data) { var parsed = []; data = data.ppl; for (var i = 0; i < data.length; i++) { parsed[parsed.length] = { data: data[i], // the entire JSON entry value: data[i].name, // the default display value result: data[i].name // to populate the input element }; } return parsed; }, // To format the data returned by the autocompleter for display formatItem: function(item) { return item.name; } }; $('#inputplace').autocomplete(ppl, options); OK, updated: <input type="text" id="inputplace" /> So, when entering for example "peter" in the input field, no autocomplete suggestions appear. It should give "peterpeter" but nothing happens. And one more thing: using this example works perfectly. var data = "Core Selectors Attributes Traversing Manipulation CSS Events Effects Ajax Utilities".split(" "); $("#inputplace").autocomplete(data);
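
    Since the plain string-array version works, a sketch of the least surprising fix (hedged: based on the bassistance autocomplete plugin the snippet resembles, where the parse callback is normally only applied to remote/URL responses, which would explain why local data never shows up): hand the plugin a flat array built from the objects, the same shape as the working example.

        // Minimal sketch: pass a flat array of names rather than the {"ppl": [...]} wrapper.
        var ppl = {"ppl": [{"name": "peterpeter", "work": "student"},
                           {"name": "piotr",      "work": "student"}]};

        var names = $.map(ppl.ppl, function (p) { return p.name; });   // ["peterpeter", "piotr"]

        $('#inputplace').autocomplete(names, {
            matchContains: true,   // still allows matching inside the string
            minChars: 2
        });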

    Read the article

  • Visual Studio / Blend... how you organize that?

    - by TomTom
    First time doing more complex stuff in WPF. I am a little lost on the split between VS and Blend. It seems I am VERY limited with editors in Visual Studio for editing controls - when customizing, for example, it seems I can enter a style in XML... but in Blend I can tell it to make a copy of the CURRENT style and use that as a starter, which is definitely more convenient. I understand the "difference in focus", but it seems to me that I really need both tools to work, especially if the controls I do are: more complex; not user controls "on purpose" (to allow more customization by programmers using the application). This means when I do a control, my approach would be: work on the backend as far as it goes without a front end (i.e. implement all methods needed etc., but they can be dummies); switch over to Blend (closing Visual Studio, as the project must be closed); put in the initial templating; switch over to Visual Studio (closing Blend); put logic in and debug. This seems pretty counterintuitive. Am I missing something obvious here?

    Read the article

  • Need an Overview of Possibilities for multicolumn programming

    - by Sam
    Hi folks, From source1 and source2 I gather that IE9 will NOT support multi-column CSS3! Since it is still the most popular browser (another thing I cannot understand), I am left with no other choice than to use programming power to make multi-columns work. Now, I use three divs that float left and are manually filled with text. Please don't laugh, I know it's stupid! But I wish I did not have to worry about the columns and could just have one piece of (uninterrupted) text which all goes into only 1 div, and then have a program smart enough to split it up into X equally wide columns. Question: before I start reinventing the wheel, what programming methods do you know of that tackle this elegantly? Please suggest your best working multi-column layout sources so I can evaluate which option is the best (I will update the table below). Exploring all possibilities, 2011 and onward, to enable a multi-column text user experience:

        Language    | Author        | Source / usage                                    | Works on all major browsers?
        ============|===============|===================================================|=============================
        html        | manual labour | put text manually in separate left-floating divs  | "Y"  (Upside: control! Downside: a few changes necessitate reflowing 3 divs manually!)
        CSS3        | w3c           | css3.info/preview/multi-column-layout/            | "N"  ({-moz-column-count: 3; -webkit-column-count: 3;} That's all!)
        javascript  | a list apart  | will add url soon                                 | ?
        php         | ?             | ?                                                 | ?
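
    For the "programming power" route, a naive splitter sketched below (hedged: it splits by word count rather than rendered height, which is usually close enough for equal-width columns of plain text; element ids are made up):

        // Minimal sketch: split the plain text of one source div into N floated column divs.
        function splitIntoColumns(sourceId, targetId, columnCount) {
            var words = document.getElementById(sourceId).innerHTML.split(/\s+/);
            var perColumn = Math.ceil(words.length / columnCount);
            var target = document.getElementById(targetId);

            for (var c = 0; c < columnCount; c++) {
                var div = document.createElement('div');
                div.style.cssFloat = 'left';
                div.style.styleFloat = 'left';                    // older IE
                div.style.width = (100 / columnCount - 2) + '%';
                div.style.paddingRight = '2%';
                div.innerHTML = words.slice(c * perColumn, (c + 1) * perColumn).join(' ');
                target.appendChild(div);
            }
        }

        // usage: splitIntoColumns('single-text', 'columns', 3);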

    Read the article

  • Uva's 3n+1 problem

    - by dmindreader
    I'm solving Uva's 3n+1 problem and I don't get why the judge is rejecting my answer. The time limit hasn't been exceeded and the all test cases I've tried have run correctly so far. import java.io.*; public class NewClass{ /** * @param args the command line arguments */ public static void main(String[] args) throws IOException { int maxCounter= 0; int input; int lowerBound; int upperBound; int counter; int numberOfCycles; int maxCycles= 0; int lowerInt; BufferedReader consoleInput = new BufferedReader(new InputStreamReader(System.in)); String line = consoleInput.readLine(); String [] splitted = line.split(" "); lowerBound = Integer.parseInt(splitted[0]); upperBound = Integer.parseInt(splitted[1]); int [] recentlyused = new int[1000001]; if (lowerBound > upperBound ) { int h = upperBound; upperBound = lowerBound; lowerBound = h; } lowerInt = lowerBound; while (lowerBound <= upperBound) { counter = lowerBound; numberOfCycles = 0; if (recentlyused[counter] == 0) { while ( counter != 1 ) { if (recentlyused[counter] != 0) { numberOfCycles = recentlyused[counter] + numberOfCycles; counter = 1; } else { if (counter % 2 == 0) { counter = counter /2; } else { counter = 3*counter + 1; } numberOfCycles++; } } } else { numberOfCycles = recentlyused[counter] + numberOfCycles; counter = 1; } recentlyused[lowerBound] = numberOfCycles; if (numberOfCycles > maxCycles) { maxCycles = numberOfCycles; } lowerBound++; } System.out.println(lowerInt +" "+ upperBound+ " "+ (maxCycles+1)); } }
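
    For what it's worth, a sketch of the usual fixes (hedged, logic only): the judge feeds many input lines, not one; intermediate 3n+1 values can exceed 1,000,000, so the memo array has to be bounds-checked; and i and j must be echoed back in their original order.

        // Minimal sketch of those fixes.
        import java.io.BufferedReader;
        import java.io.IOException;
        import java.io.InputStreamReader;

        public class Main {
            static int[] memo = new int[1000001];

            static int cycleLength(long n) {
                if (n <= 1000000 && memo[(int) n] != 0) return memo[(int) n];
                int length;
                if (n == 1) length = 1;
                else if (n % 2 == 0) length = 1 + cycleLength(n / 2);
                else length = 1 + cycleLength(3 * n + 1);
                if (n <= 1000000) memo[(int) n] = length;
                return length;
            }

            public static void main(String[] args) throws IOException {
                BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
                String line;
                while ((line = in.readLine()) != null) {     // every input line, not just the first
                    line = line.trim();
                    if (line.isEmpty()) continue;
                    String[] parts = line.split("\\s+");
                    int i = Integer.parseInt(parts[0]), j = Integer.parseInt(parts[1]);
                    int lo = Math.min(i, j), hi = Math.max(i, j), max = 0;
                    for (int n = lo; n <= hi; n++) max = Math.max(max, cycleLength(n));
                    System.out.println(i + " " + j + " " + max);   // echo i and j in original order
                }
            }
        }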

    Read the article

  • Good way to identify similar images?

    - by Nick
    I've developed a simple and fast algorithm in PHP to compare images for similarity. It's fast (~40 per second for 800x600 images) to hash, and an unoptimised search algorithm can go through 3,000 images in 22 mins comparing each one against the others (3/sec). The basic overview is you get an image, rescale it to 8x8 and then convert those pixels to HSV. The Hue, Saturation and Value are then truncated to 4 bits and it becomes one big hex string. Comparing images basically walks along two strings, and then adds the differences it finds. If the total number is below 64 then it's the same image. Different images are usually around 600 - 800. Below 20 is extremely similar. Are there any improvements upon this model I can use? I haven't looked at how relevant the different components (hue, saturation and value) are to the comparison. Hue is probably quite important but the others? To speed up searches I could probably split the 4 bits from each part in half, and put the most significant bits first so if they fail the check then the lsb doesn't need to be checked at all. I don't know an efficient way to store bits like that yet still allow them to be searched and compared easily. I've been using a dataset of 3,000 photos (mostly unique) and there haven't been any false positives. It's completely immune to resizes and fairly resistant to brightness and contrast changes.
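
    For reference, a rough sketch of the hashing step as described (hedged: GD-based, with the textbook RGB-to-HSV formula; the original's exact truncation details may differ):

        <?php
        // Rough sketch: 8x8 downscale -> HSV -> 4 bits per channel -> one hex digit per channel.
        function tinyHash($path) {
            $src  = imagecreatefromjpeg($path);          // swap the loader for the formats you use
            $tiny = imagecreatetruecolor(8, 8);
            imagecopyresampled($tiny, $src, 0, 0, 0, 0, 8, 8, imagesx($src), imagesy($src));

            $hash = '';
            for ($y = 0; $y < 8; $y++) {
                for ($x = 0; $x < 8; $x++) {
                    $rgb = imagecolorat($tiny, $x, $y);
                    $r = (($rgb >> 16) & 0xFF) / 255; $g = (($rgb >> 8) & 0xFF) / 255; $b = ($rgb & 0xFF) / 255;

                    $max = max($r, $g, $b); $min = min($r, $g, $b); $d = $max - $min;
                    $v = $max;
                    $s = ($max == 0) ? 0 : $d / $max;
                    if ($d == 0)        { $h = 0; }
                    elseif ($max == $r) { $h = fmod(($g - $b) / $d, 6) / 6; }
                    elseif ($max == $g) { $h = (($b - $r) / $d + 2) / 6; }
                    else                { $h = (($r - $g) / $d + 4) / 6; }
                    if ($h < 0) { $h += 1; }

                    // truncate H, S, V to 4 bits each
                    $hash .= dechex((int) round($h * 15)) . dechex((int) round($s * 15)) . dechex((int) round($v * 15));
                }
            }
            return $hash;   // 192 hex digits for an 8x8 image
        }

        function hashDistance($a, $b) {
            $diff = 0;
            for ($i = 0; $i < strlen($a); $i++) {
                $diff += abs(hexdec($a[$i]) - hexdec($b[$i]));
            }
            return $diff;   // the thresholds above: < 64 same, 600-800 different
        }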

    Read the article

  • How to count Chinese words in a file using a regex in Perl?

    - by Ivan
    I tried the following Perl code to count the Chinese words in a file; it seems to work but does not get the right result. Any help is greatly appreciated. The error message is "Use of uninitialized value $valid in concatenation (.) or string at word_counting.pl line 21, <FILE> line 21. Total things = 125, valid words =", which suggests to me that the problem is the file format. The "total things" count is 125, which is the number of strings (125 lines). The strangest part is that my console displays all the individual Chinese words correctly without any problem. The utf8 pragma is enabled. #!/usr/bin/perl -w use strict; use utf8; use Encode qw(encode); use Encode::HanExtra; my $input_file = "sample_file.txt"; my ($total, $valid); my %count; open (FILE, "< $input_file") or die "Can't open $input_file: $!"; while (<FILE>) { foreach (split) { #break $_ into words, assign each to $_ in turn $total++; next if /\W|^\d+/; #strange words skip the remainder of the loop $valid++; $count{$_}++; # count each separate word stored in a hash ## next comes here ## } } print "Total things = $total, valid words = $valid\n"; foreach my $word (sort keys %count) { print "$word \t was seen \t $count{$word} \t times.\n"; } ##---Data---- sample_file.txt ??????,???????,????.??????.????:"?????????????,??????,????????.????????,?????????, ???????????.????????,???????????,??????,??????.???:`??,???????????.'?????, ??????????."??????,??????.????.???, ????????????,????,??????,?????????,??????????????. ????????,??????,???????????,????????,????????.????,????,???????, ??????????,??????,????????.??????.
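
    A sketch of the two changes that usually matter here (hedged): open the file with an explicit UTF-8 layer so the characters arrive as real Unicode, and count Han characters with \p{Han} instead of relying on \W - nothing ever passes the current filter, which is why $valid stays undefined.

        #!/usr/bin/perl
        use strict;
        use warnings;
        use utf8;
        use open ':std', ':encoding(UTF-8)';   # decode STDIN/STDOUT/STDERR as UTF-8

        my $input_file = "sample_file.txt";
        my ($total, $valid) = (0, 0);
        my %count;

        open my $fh, '<:encoding(UTF-8)', $input_file or die "Can't open $input_file: $!";
        while (my $line = <$fh>) {
            # treat every Han character as a "word"; adjust if multi-character words are wanted
            for my $char ($line =~ /\p{Han}/g) {
                $total++;
                $valid++;
                $count{$char}++;
            }
        }
        close $fh;

        print "Total things = $total, valid words = $valid\n";
        for my $word (sort keys %count) {
            print "$word \t was seen \t $count{$word} \t times.\n";
        }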

    Read the article
