Search Results

Search found 4625 results on 185 pages for 'split tunnel'.


  • Trim email list into domain list

    - by hanjaya
    The function below is part of a script that trims an email list from a file down to a list of domains and removes duplicates.

        /* define a function that can accept a list of email addresses */
        function getUniqueDomains($list) {
            // iterate over list, split addresses and add domain part to another array
            $domains = array();
            foreach ($list as $l) {
                $arr = explode("@", $l);
                $domains[] = trim($arr[1]);
            }
            // remove duplicates and return
            return array_unique($domains);
        }

    What does $domains[] = trim($arr[1]); mean? Specifically the $arr[1] part: what does [1] mean in this context, and how does the variable $arr become an array?
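
    A minimal walk-through of one iteration: explode() returns an array of the pieces of the string, which is why $arr ends up being an array, and [1] picks its second element, since PHP arrays are zero-indexed. The sample address below is made up.

        $l = "someone@example.com";
        $arr = explode("@", $l);     // explode() returns array("someone", "example.com")
        // $arr[0] == "someone"      -- everything before the "@"
        // $arr[1] == "example.com"  -- everything after it, i.e. the domain
        $domains[] = trim($arr[1]);  // append the trimmed domain to the $domains array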

    Read the article

  • Improve my Python program to fetch the desired rows using an if condition

    - by user2560507
    unique.txt contains 2 columns separated by tabs. total.txt contains 3 columns, each separated by tabs. I take each row from unique.txt and look for it in total.txt; if it is present, I want to extract the entire row from total.txt and save it in a new output file.

        ### total.txt (column a, column b, column c)
        interaction1    mitochondria_205000_225000    mitochondria_195000_215000
        interaction2    mitochondria_345000_365000    mitochondria_335000_355000
        interaction3    mitochondria_345000_365000    mitochondria_5000_25000
        interaction4    chloroplast_115000_128207     chloroplast_35000_55000
        interaction5    chloroplast_115000_128207     chloroplast_15000_35000
        interaction15   2_10515000_10535000           2_10505000_10525000

        ### unique.txt (column a, column b)
        mitochondria_205000_225000    mitochondria_195000_215000
        mitochondria_345000_365000    mitochondria_335000_355000
        mitochondria_345000_365000    mitochondria_5000_25000
        chloroplast_115000_128207     chloroplast_35000_55000
        chloroplast_115000_128207     chloroplast_15000_35000
        mitochondria_185000_205000    mitochondria_25000_45000
        2_16595000_16615000           2_16585000_16605000
        4_2785000_2805000             4_2775000_2795000
        4_11395000_11415000           4_11385000_11405000
        4_2875000_2895000             4_2865000_2885000
        4_13745000_13765000           4_13735000_13755000

    My program:

        file = open('total.txt')
        file2 = open('unique.txt')
        all_content = file.readlines()
        all_content2 = file2.readlines()
        store_id_lines = []
        ff = open('match.dat', 'w')
        for i in range(len(all_content)):
            line = all_content[i].split('\t')
            seq = line[1] + '\t' + line[2]
            for j in range(len(all_content2)):
                if all_content2[j] == seq:
                    ff.write(seq)
                    break

    Problem: instead of giving the desired output (the whole total.txt rows, first column included, that satisfy the if condition), this only writes the matched pair. What I need is something like: if the jth row of unique.txt equals columns b and c of the ith row of total.txt, then write the ith row of total.txt into the new file.
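
    A sketch of that logic (assuming both files are plain tab-separated text; adjust if the "column a / column b" header lines really appear in the files):

        # build a set of (column a, column b) pairs from unique.txt
        wanted = set()
        with open('unique.txt') as f:
            for line in f:
                parts = line.rstrip('\n').split('\t')
                if len(parts) >= 2:
                    wanted.add((parts[0], parts[1]))

        # write out every total.txt row whose last two columns match a pair
        with open('total.txt') as f, open('match.dat', 'w') as out:
            for line in f:
                parts = line.rstrip('\n').split('\t')
                if len(parts) >= 3 and (parts[1], parts[2]) in wanted:
                    out.write(line)   # the full row, including the interaction name

    Using a set also drops the nested loop, so the run time stays linear in the size of the two files.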

    Read the article

  • Error message in XSLT with C# extension function

    - by iHeartGreek
    Hi! I received the following error while trying to implement a C# extension function in XSLT:

        Extension function parameters or return values which have CLR type 'Char[]' are not supported.

    Code:

        <xsl:variable name="stringList">
          <xsl:value-of select="extension:GetList('AAA BBB CCC', ' ')"/>
        </xsl:variable>

        <msxsl:script language="C#" implements-prefix="extension">
          <![CDATA[
          public string[] GetList(string str, char[] delimiter)
          {
              ...
              return str.Split(delimiter, StringSplitOptions.None);
          }
          ]]>
        </msxsl:script>

    Can someone explain this error message and how to get past it? Thanks!
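
    The msxsl script block only marshals a limited set of CLR types for parameters and return values, and Char[] (like String[]) is not among them, which is exactly what the error says. A sketch of a workaround that keeps everything as plain strings; returning one comma-joined string is an assumption about what the caller wants back:

        <msxsl:script language="C#" implements-prefix="extension">
          <![CDATA[
          public string GetList(string str, string delimiter)
          {
              // split on the delimiter string and hand back a single comma-joined string
              string[] parts = str.Split(new string[] { delimiter }, StringSplitOptions.None);
              return string.Join(",", parts);
          }
          ]]>
        </msxsl:script>

        <xsl:variable name="stringList">
          <xsl:value-of select="extension:GetList('AAA BBB CCC', ' ')"/>
        </xsl:variable>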

    Read the article

  • Strange result from Highcharts

    - by user1612334
    I use http://highcharts.com and I am getting a really strange result. My data looks like this:

        Value   Date
        1507    2013-02-03
        734     2013-02-02
        0       2013-02-01
        225     2013-01-31   *missing from the graphic*
        672     2013-01-30   *missing from the graphic*
        692     2013-01-29   *missing from the graphic* <- this value ended up on 1 February
        910     2013-01-28
        314     2013-01-27

    I am missing three days (29, 30 and 31 January). When I get the data from the database, I convert it like this:

        var lines = [];
        try {
            jQuery.each(data, function(i, line) {
                var dateArr = line.date.split('-');
                lines.push([
                    Date.UTC(dateArr[0], dateArr[1], dateArr[2]),
                    parseInt(line.num_chips)
                ]);
            });
        } catch(e) { }

    Why could it be so? =\
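
    One likely cause (a guess from the dates shown): Date.UTC() takes a zero-based month, so "2013-01-29".split('-') feeds it month 1, which is February; every point lands a month late, and days that don't exist in the later month (such as 30 or 31 February) roll over further, which is how points go missing or pile up on the wrong day. A sketch of the fix:

        jQuery.each(data, function(i, line) {
            var dateArr = line.date.split('-');
            lines.push([
                // Date.UTC expects a zero-based month, so subtract 1;
                // the unary + turns the split strings into numbers
                Date.UTC(+dateArr[0], +dateArr[1] - 1, +dateArr[2]),
                parseInt(line.num_chips, 10)
            ]);
        });
        lines.reverse();   // if the rows arrive newest-first: Highcharts wants ascending x values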

    Read the article

  • Custom constructors for models in Google App Engine (python)

    - by Nikhil Chelliah
    I'm getting back to programming for Google App Engine and I've found, in old, unused code, instances where I wrote constructors for models. It seems like a good idea, but there's no mention of it online and I can't test whether it works. Here's a contrived example, with no error-checking, etc.:

        class Dog(db.Model):
            name = db.StringProperty(required=True)
            breeds = db.StringListProperty()
            age = db.IntegerProperty(default=0)

            def __init__(self, name, breed_list, **kwargs):
                db.Model.__init__(self, **kwargs)
                self.name = name
                self.breeds = breed_list.split()

        rufus = Dog('Rufus', 'spaniel terrier labrador')
        rufus.put()

    The **kwargs are passed on to the Model constructor in case the model is constructed with a specified parent or key_name, or in case other properties (like age) are specified. This constructor differs from the default in that it requires a name and a breed_list (although it can't ensure that they're strings), and it parses breed_list in a way the default constructor could not. Is this a legitimate form of instantiation, or should I just use functions or static/class methods? And if it works, why aren't custom constructors used more often?
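
    For what it's worth, the pattern usually recommended instead (hedging, since SDK behaviour has varied between versions) is a classmethod factory: the datastore calls the model constructor itself when it rebuilds entities from queries, so a changed __init__ signature can break fetches. A sketch:

        from google.appengine.ext import db

        class Dog(db.Model):
            name = db.StringProperty(required=True)
            breeds = db.StringListProperty()
            age = db.IntegerProperty(default=0)

            @classmethod
            def create(cls, name, breed_list, **kwargs):
                # parent, key_name, age, etc. pass straight through in **kwargs
                return cls(name=name, breeds=breed_list.split(), **kwargs)

        rufus = Dog.create('Rufus', 'spaniel terrier labrador')
        rufus.put()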

    Read the article

  • Java Applet - ArrayIndexOutOfBoundsException

    - by Dan
    OK, so I am getting an ArrayIndexOutOfBoundsException and I don't know why. Here's my code: http://www.so.pastebin.com/y5MjD1k3 The thing is, when I go to the red brick at board[2][2] I get there fine. Then I go up, but when I try to go back down that error pops up. I also get the error when I move 8 squares to the right. And if you picture my 2D map split into four quadrants, the top-left one works, but going anywhere else throws the error. What am I doing wrong? Thanks.
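
    The actual code is behind the pastebin link, so this is only a guess: "the top-left quadrant works, moving right or down blows up" is the classic symptom of row and column indices being swapped somewhere (board[x][y] in one place, board[y][x] in another), or of a bound being checked against the wrong dimension. A defensive sketch (inBounds is a hypothetical helper, and board is assumed to be laid out as board[row][col]):

        private boolean inBounds(int[][] board, int row, int col) {
            return row >= 0 && row < board.length
                && col >= 0 && col < board[row].length;
        }

        // guard every move before touching the array:
        // if (inBounds(board, newRow, newCol)) { ... board[newRow][newCol] ... }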

    Read the article

  • add extra data to response object to render in template

    - by mp0int
    I need to write a code snippet that lets me disable access to some parts of a site. The admin and the main page will stay viewable; the user section (which uses ajax) will still be displayed but must not be usable (a transparent div set over the page). A few other pages will be disabled completely. My idea is to write a middleware:

        def process_request(self, request):
            if ayar.tonline_kapali:
                url_parcalari = request.path.split('/')
                if url_parcalari[0] not in settings.BAGIMSIZ_URLLER:
                    if not request.is_ajax():
                        return render_to_response('bakim_modu.html')
                    else:
                        return None

    That lets me display a "site closed" message for URLs not in BAGIMSIZ_URLLER (which contains the URLs that stay accessible). But I have not figured out how to solve the problem for the ajax pages: I need to set a header or something on the response and check it in the template.
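
    One option for the ajax side (an assumption about what the overlay script needs): answer ajax requests with a maintenance status and header straight from the middleware, and let the page's JavaScript react to that instead of routing anything through the template. A sketch reusing the names from the middleware above:

        from django.http import HttpResponse

        def process_request(self, request):
            if ayar.tonline_kapali:
                url_parcalari = request.path.split('/')
                if url_parcalari[0] not in settings.BAGIMSIZ_URLLER:
                    if request.is_ajax():
                        # the javascript can check for 503 / the X-Maintenance header
                        # and lay the transparent div over the user section
                        response = HttpResponse('maintenance', status=503)
                        response['X-Maintenance'] = '1'
                        return response
                    return render_to_response('bakim_modu.html')
            return None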

    Read the article

  • Java BufferedReader behavior in CSV vs TXT file

    - by Gabriel
    I am trying to read a CSV file called csv_file.csv. The problem is that when I read lines with BufferedReader.readLine() it skips the first line, the one with the months. But when I rename the file to csv_file.txt it reads it fine and does not skip the first line. Is there an undocumented "feature" of BufferedReader that I'm not aware of? Example of the file:

        Months, SEP2010, OCT2010, NOV2010
        col1, col2, col3, col4, col5
        aaa,,sdf,"12,456",bla bla bla, xsaffadfafda
        and so on, and so on, "10,00", xxx, xxx

    The code:

        FileInputStream stream = new FileInputStream(UploadSupport.TEMPORARY_FILES_PATH + fileName);
        BufferedReader br = new BufferedReader(new InputStreamReader(stream, "UTF-8"));
        String line = br.readLine();
        String months[] = line.split(",");
        while ((line = br.readLine()) != null) {
            /* parse other lines */
        }
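
    BufferedReader pays no attention to file extensions, so the two files almost certainly differ in content. A common culprit (an assumption, since only you can inspect the bytes) is a UTF-8 byte-order mark at the start of the .csv file: the UTF-8 InputStreamReader hands it to you as the character \uFEFF, so the first line no longer starts with "Months" even though it looks identical in an editor. A sketch that strips it:

        String line = br.readLine();
        if (line != null && !line.isEmpty() && line.charAt(0) == '\uFEFF') {
            // drop a leading UTF-8 byte-order mark if the file has one
            line = line.substring(1);
        }
        String[] months = line.split(",");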

    Read the article

  • mysql varchar innodb page size limit 8100 bytes

    - by David19801
    Hi, regarding InnoDB, someone recently told me that "the varchar content beyond 768 bytes is stored in supplemental 16K pages". This is very interesting. If each varchar is latin1, which I believe stores 1 byte per character, would a single varchar(500) (under 768 bytes) require an extra I/O the way a varchar(1000) (over 768 bytes) would? (This question is to find out whether all varchars, or just big ones, are split onto a separate page.) Is the 768 limit per varchar, or for all varchars in the row added together? For example, does varchar(300), varchar(300), varchar(300) get optimized, where each individual varchar column is below 768 bytes but together they are above 768? I am confused about whether the 768 limit relates to each individual varchar or to all varchars in the row totalled. Any clarification? EDIT: Removed the part about CHAR due to finding out about its limit of 255 max.
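
    As far as I understand it (treat this as a hedge rather than gospel): with the older REDUNDANT/COMPACT row formats the 768-byte prefix rule is applied per column, and whether a value actually goes off-page depends on its stored length and on whether the whole row still fits in roughly half of a 16K page (which is where the ~8100-byte row-size error comes from); with the Barracuda DYNAMIC/COMPRESSED formats, long values are stored entirely off-page with only a 20-byte pointer left in the row. A sketch of switching a table over, assuming MySQL 5.1+ with the InnoDB plugin:

        -- Barracuda is needed for ROW_FORMAT=DYNAMIC (also requires innodb_file_per_table=1)
        SET GLOBAL innodb_file_format = 'Barracuda';

        CREATE TABLE t (
          id INT PRIMARY KEY,
          a  VARCHAR(300),
          b  VARCHAR(300),
          c  VARCHAR(300)
        ) ENGINE=InnoDB ROW_FORMAT=DYNAMIC;

        SHOW TABLE STATUS LIKE 't';   -- the Row_format column confirms the setting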

    Read the article

  • Why are my \n not working in PHP?

    - by Doug
    I'm playing with SAX and noticed the output is not line-breaking properly. I have no idea why.

        function flush_data() {
            global $level, $char_data;
            $char_data = trim($char_data);
            if (strlen($char_data) > 0) {
                print "\n";
                $data = split("\n", wordwrap($char_data, 76 - ($level * 2)));
                foreach ($data as $line) {
                    print str_repeat(' ', ($level + 1)) . "[" . $line . "]" . "\n";
                }
            }
            $char_data = '';
        }
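
    Two usual suspects, assuming the output is being viewed in a browser: "\n" only breaks lines in the HTML source, not on the rendered page (wrap the output in <pre> or convert with nl2br()), and split() is the deprecated POSIX-regex function, so explode() is the safer choice for splitting on a literal newline. A sketch:

        $data = explode("\n", wordwrap($char_data, 76 - ($level * 2)));
        foreach ($data as $line) {
            // nl2br-style break for browser output; keep plain "\n" if this goes to the CLI
            print str_repeat(' ', $level + 1) . "[" . $line . "]" . "<br />\n";
        }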

    Read the article

  • A simple string array Iteration in C# .NET doesn't work

    - by met.lord
    This is simple code that should return true or false after comparing each element of a string array with a session variable. The thing is, even when the string array named 'plans' gets the right values, the foreach seems to match only the first element, so if the session variable matches any element other than the first one in the array it never returns true... You could say the problem is right there in the foreach loop, but I can't see it... I've done this a hundred times and I can't understand what I am doing wrong. Thank you.

        protected bool ValidatePlans()
        {
            bool authorized = false;
            if (RequiredPlans.Length > 0)
            {
                string[] plans = RequiredPlans.Split(',');
                foreach (string plan in plans)
                {
                    if (MySessionInfo.Plan == plan)
                        authorized = true;
                }
            }
            return authorized;
        }
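
    The loop itself visits every element, so a frequent cause of this symptom (an assumption, since RequiredPlans isn't shown) is stray whitespace: "gold, silver" splits into "gold" and " silver", and only the first entry can ever match exactly. A sketch with trimming and a case-insensitive compare:

        protected bool ValidatePlans()
        {
            if (string.IsNullOrEmpty(RequiredPlans))
                return false;

            foreach (string plan in RequiredPlans.Split(','))
            {
                // trim stray spaces so " silver" still matches "silver"
                if (string.Equals(MySessionInfo.Plan.Trim(), plan.Trim(),
                                  StringComparison.OrdinalIgnoreCase))
                    return true;
            }
            return false;
        }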

    Read the article

  • Help me write a nicer SQL query in Rails

    - by Sainath Mallidi
    Hi, I am trying to write an SQL query to update some of the attributes that are regularly pulled from a source. The input is a text file with the following fields: author, title, date, popularity. I have two tables to update: one holds the author information and the other the popularity, and the Author ActiveRecord object has one popularity. Currently I'm doing it like this:

        arr.each { |x|
          x = x.split(" ")
          results = Author.find_by_sql("SELECT authors.id FROM authors, priorities
            WHERE authors.id=popularity.authors_id AND authors.author = x[0]")
          results[0].popularity.update_attribute("popularity", x[3])
        }

    I need two tables because the popularity keeps changing, and I only need the top 1000 popular ones, but I still need to keep the previously popular ones as well. Is there a nicer way to do this, instead of one query for every new object? Thanks.
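
    Two things stand out (assumptions, since the real schema isn't shown): x[0] inside the double-quoted SQL string is sent to MySQL literally rather than interpolated, and find_by_sql accepts an array form with bind parameters, which fixes that and avoids injection; looking the popularity up through the association also removes the hand-written join. A sketch:

        arr.each do |x|
          author_name, _title, _date, popularity = x.split(" ")

          # bind parameter instead of string interpolation
          author = Author.find_by_sql(
            ["SELECT authors.id FROM authors WHERE authors.author = ?", author_name]).first

          author.popularity.update_attribute(:popularity, popularity) if author
        end

    It is still one round trip per line; the next step would be loading all the authors for a batch of lines in a single query and updating from that.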

    Read the article

  • Regular Expression to match unlimited number of options

    - by Pekka
    I want to be able to parse file paths like this one:

        /var/www/index.(htm|html|php|shtml)

    into an ordered array:

        array("htm", "html", "php", "shtml")

    and then produce a list of alternatives:

        /var/www/index.htm
        /var/www/index.html
        /var/www/index.php
        /var/www/index.shtml

    Right now I have a preg_match statement that can split two alternatives:

        preg_match_all("/\(([^)]*)\|([^)]*)\)/", $path_resource, $matches);

    Could somebody give me a pointer on how to extend this to accept an unlimited number of alternatives (at least two)? Just the regular expression; the rest I can deal with. The rules are:

    - the list needs to start with a ( and close with a )
    - there must be at least one | in the list (i.e. at least two alternatives)
    - any other occurrence of ( or ) is to remain untouched
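
    A sketch of one way to do it: capture the whole parenthesised group while requiring at least one pipe, then explode the capture on "|" (the character class [^)|] stops each alternative at the next pipe or closing parenthesis):

        preg_match_all('/\(([^)|]+(?:\|[^)|]+)+)\)/', $path_resource, $matches);

        foreach ($matches[1] as $i => $group) {
            $alternatives = explode('|', $group);   // e.g. array("htm", "html", "php", "shtml")
            foreach ($alternatives as $ext) {
                echo str_replace($matches[0][$i], $ext, $path_resource) . "\n";
            }
        }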

    Read the article

  • How to use NGramTokenizerFactory or NGramFilterFactory?

    - by user572485
    Hi, recently I have been studying how to store and index with Solr. I want to do a facet.prefix search. With the whitespace tokenizer, "Where are you" is split into three words and indexed. If I search with facet.prefix="where are", no results are returned. I googled and found that NGramFilterFactory could help me. But when I apply this filter factory, the result is "w, h, e, ..., wh, ..", which splits the sentence by character, not by token word. I used the parameters maxGramSize and minGramSize, set to 1 and 3. Does NGramFilterFactory work right? Should I add some other parameters? Is there some other filter factory that can help me? Thanks!
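
    NGramFilterFactory is character-level by design; word-level n-grams come from ShingleFilterFactory, which emits "where", "where are", "are you", and so on, so a facet.prefix of "where are" has something to match. A sketch of a field type (attribute names from memory, so double-check them against your Solr version):

        <fieldType name="text_shingle" class="solr.TextField">
          <analyzer>
            <tokenizer class="solr.WhitespaceTokenizerFactory"/>
            <filter class="solr.LowerCaseFilterFactory"/>
            <!-- emits the single words plus 2- and 3-word shingles -->
            <filter class="solr.ShingleFilterFactory"
                    minShingleSize="2" maxShingleSize="3"
                    outputUnigrams="true"/>
          </analyzer>
        </fieldType>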

    Read the article

  • Multiple sections on landing page only

    - by RDRAO
    Hi everyone, I am very new to Drupal but am loving it so far. I am stuck on a theming point. I have a region called 'footer-teaser' just above the footer. Its width is 800px and it is split into 3 equal-size columns. Each column contains an image of size 120x120, some teaser text, and a 'read more' link. The design requirement is that all of the above should be editable by the admin from the admin interface. If it were static I would have hardcoded it, but since the requirement is dynamic I am not sure how to achieve this. I have customised page.tpl for the other sections of the page, just not this one. I am sure someone else has faced this before; can anyone point me in the right direction? Even better if an example is provided. Cheers, RD

    Read the article

  • Finding if a sentence contains a specific phrase in Ruby

    - by TenJack
    Right now I am checking whether a sentence contains a specific word by splitting the sentence into an array and then using include? to see if it contains the word. Something like:

        "This is my awesome sentence.".split(" ").include?('awesome')

    But I'm wondering what the fastest way is to do this with a phrase. For example, I want to see if the sentence "This is my awesome sentence." contains the phrase "my awesome sentence". I am scraping sentences and comparing a very large number of phrases, so speed is somewhat important.
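
    A sketch: String#include? already works on substrings, so no splitting is needed for either the single-word or the phrase case; a regexp with word boundaries helps when "awesome" must not match inside a longer word like "awesomeness".

        sentence = "This is my awesome sentence."

        sentence.include?("my awesome sentence")   # => true
        sentence =~ /\bmy awesome sentence\b/      # => 8 (the match position), nil when absent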

    Read the article

  • MySQL thousands of updates, slowing down.

    - by noryb009
    I need to run a PHP loop a total of 100,000 times (about 10,000 per script run), and each iteration does about 5 MySQL UPDATEs. When I run the loop 50 times, it takes 3 seconds. When I run the loop 1000 times, it takes about 1300 seconds. As you can see, MySQL slows down a lot as the UPDATEs add up. This is an example update:

        mysql_query("UPDATE table SET `row1`=`row1` +1 WHERE `UniqueValue`='5'");

    It is generated randomly from PHP, but I can store it in a variable and run it every n loops. Is there any way to make MySQL and PHP run at a consistent speed (is PHP storing hidden variables?), or to split up the script so they do? Note: I am running this for development purposes, not for production, so there will only be 1 computer accessing the data.
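
    The usual causes of this kind of slowdown curve are a missing index on the WHERE column (each UPDATE then scans the table, so the cost grows as the table grows) and autocommit flushing after every single statement. A sketch, using the names from the example above:

        -- turn the WHERE lookup into an index seek instead of a scan
        ALTER TABLE `table` ADD INDEX idx_uniquevalue (`UniqueValue`);

        -- batch the work so MySQL is not forced to flush on every statement
        START TRANSACTION;
        UPDATE `table` SET `row1` = `row1` + 1 WHERE `UniqueValue` = '5';
        -- ... a few hundred more updates ...
        COMMIT;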

    Read the article

  • How can I resolve naming conflict in given precompiled libraries?

    - by asm
    I'm linking two different libraries that export functions with exactly the same names (opengl32.lib and libgles_cm.lib - OpenGL ES emulation on the Win32 platform), and I want to be able to specify which version I'm calling. I'm porting a game to OpenGL ES, and what I want to achieve is split-screen rendering, where the left side is the OpenGL version and the right side is the ES version. They will receive slightly different calls to produce the same result, and I'll be able to compare them visually and spot rendering artifacts. This worked perfectly with OpenGL/DirectX in the same window, but now the problem is that both libraries export functions with the same names, like glDrawArrays, and only one version gets linked in. Unfortunately, I don't have the sources of either library. Is there a way to... I don't know, wrap one library in an additional namespace before linking (with calls like ES::glDrawArrays), somehow rename some of the functions, or do anything else? I'm using the Microsoft compiler now, but if there is a solution with another one (GCC/ICC), I'll switch to it.
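
    One approach that needs no relinking tricks: keep linking opengl32.lib as usual, and load the GLES emulator DLL at run time so its entry points live behind your own prefixed function pointers. A sketch (the DLL name and the single wrapped function are assumptions; in practice you would generate a wrapper for every entry point you call):

        #include <windows.h>
        #include <GLES/gl.h>    // GLES types such as GLenum, GLint, GLsizei

        namespace ES {
            typedef void (GL_APIENTRY *PFN_GLDRAWARRAYS)(GLenum mode, GLint first, GLsizei count);
            PFN_GLDRAWARRAYS glDrawArrays = NULL;

            bool init() {
                HMODULE dll = LoadLibraryA("libGLES_CM.dll");   // assumed emulator DLL name
                if (!dll) return false;
                glDrawArrays = (PFN_GLDRAWARRAYS)GetProcAddress(dll, "glDrawArrays");
                return glDrawArrays != NULL;
            }
        }

        // desktop GL:  ::glDrawArrays(GL_TRIANGLES, 0, count);
        // OpenGL ES:   ES::glDrawArrays(GL_TRIANGLES, 0, count);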

    Read the article

  • Linq to SQL - design question.

    - by UshaP
    Hi, currently I have one big DataContext with 35 tables (I dragged all my DB tables onto the designer). I must admit it is very comfortable, because I have an ORM for my full DB and querying with LINQ is easy and simple. My questions are: 1. Would you consider it bad design to have one DataContext with 35 tables, or should I split it into logical units? 2. Are there any performance penalties for using such a big DataContext? Thanks, Pini.

    Read the article

  • MySQL : delete from table that is used in the where clause

    - by Eric
    I am writing a small script to synchronize 2 MySQL tables (t1 is to be 'mirrored' to t2). In one step I would like to delete rows from t2 that have been deleted in t1 (same id). I tried this query:

        delete from t2 where t2.id in (
            select t2.id from t2
            left join t1 on (t1.id = t2.id)
            where t1.id is null
        )

    But MySQL forbids me from using t2 both in the DELETE and in the subselect (which sounds logical, by the way). Of course, I can split this into 2 queries: first select the IDs, then delete the rows with those IDs. My question: is there a cleaner way to delete rows from t2 that no longer exist in t1, with one query only?
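
    A sketch using MySQL's multi-table DELETE syntax, which sidesteps the "can't modify the table you select from in a subquery" restriction because the join happens in the same statement:

        -- delete the rows of t2 whose id no longer exists in t1
        DELETE t2
        FROM t2
        LEFT JOIN t1 ON t1.id = t2.id
        WHERE t1.id IS NULL;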

    Read the article

  • MySQL Import Database Error because of Extended Inserts

    - by Castgame
    Hello all, I'm importing a 400MB (uncompressed) MySQL database. I'm using BigDump, and I am getting this error:

        Stopped at the line 387. At this place the current query includes more than 300 dump
        lines. That can happen if your dump file was created by some tool which doesn't place
        a semicolon followed by a linebreak at the end of each query, or if your dump contains
        extended inserts. Please read the BigDump FAQs for more infos.

    I believe the file does contain extended inserts, but I have no way to regenerate the dump, as the database has been deleted from the old server. How can I import this database, or convert it so it can be imported? Thanks for any help. Best, Nick. EDIT: It appears the only viable answer is to separate the extended inserts, but I still need help figuring out how to split the file as the answer below suggests. Please help. Thank you.
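
    The "more than 300 dump lines" figure matches BigDump's default limit on how many lines a single query may span, so (assuming your copy exposes the same setting) raising it near the top of bigdump.php may let the long extended INSERTs through without editing the dump at all; alternatively, the mysql command-line client doesn't mind extended inserts as long as max_allowed_packet is large enough. A sketch:

        // in bigdump.php -- an assumption that your version uses this setting;
        // it caps how many dump lines a single query may span (default 300)
        $max_query_lines = 3000;

        // or bypass BigDump and load the dump with the mysql client instead:
        //   mysql --max_allowed_packet=64M -u user -p dbname < dump.sql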

    Read the article

  • How to get the computer name (hostname) in a web application?

    - by Filipe
    Hi, how can I get the client's computer name in a web application? The user is on a local network. Regards. I already tried this option:

        string IP = System.Web.HttpContext.Current.Request.UserHostAddress;
        string compName = DetermineCompName(IP);
        System.Net.IPHostEntry teste = System.Net.Dns.GetHostEntry(IP);
        ssresult = IP + " - " + teste.HostName;

        // TODO: Write implementation for action
        private static string DetermineCompName(string IP)
        {
            IPAddress myIP = IPAddress.Parse(IP);
            IPHostEntry GetIPHost = Dns.GetHostEntry(myIP);
            string[] compName = GetIPHost.HostName.ToString().Split('.');
            return compName[0];
        }

    All of that gives me only the IP :/

    Read the article

  • Condor job using DAG with some jobs needing to run the same host

    - by gurney alex
    I have a computation task which is split into several individual program executions, with dependencies. I'm using Condor 7 as the task scheduler (with the Vanilla universe, due to constraints on the programs that are beyond my reach, so no checkpointing is involved), so a DAG looks like a natural solution. However, some of the programs need to run on the same host, and I could not find a reference on how to do this in the Condor manuals. Example DAG file:

        JOB A A.condor
        JOB B B.condor
        JOB C C.condor
        JOB D D.condor
        PARENT A CHILD B C
        PARENT B C CHILD D

    I need to express that B and D have to run on the same compute node, without breaking the parallel execution of B and C. Thanks for your help.

    Read the article

  • Read a pair of characters separated by \t in C++

    - by Kiran
    Friends, I want to read pairs of characters separated by \t, and continue reading input until the user enters z for either character. Here are the options I thought of:

        while (cin >> ch1 >> ch2) {
            // process ch1 & ch2
        }

        std::string str;
        while (getline(cin, str)) {
            // split string
        }

    I also want to validate the input to make sure that it is correct. Please suggest the best way. If this is a duplicate, please point me to the right one. Thanks.
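
    A minimal sketch of the first option: operator>> for char skips whitespace (tabs included), so the separator takes care of itself, and a sentinel check ends the loop. If you need to verify that the separator really was a tab, the getline-and-split route gives you that control.

        #include <iostream>

        int main() {
            char ch1, ch2;
            while (std::cin >> ch1 >> ch2) {        // >> skips the tab between the two
                if (ch1 == 'z' || ch2 == 'z')       // stop when either character is 'z'
                    break;
                std::cout << ch1 << " / " << ch2 << '\n';   // process ch1 & ch2 here
            }
            return 0;
        }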

    Read the article

  • Forcing A Postback Asp.Net

    - by Nick LaMarca
    Please take a look at the following click event:

        Protected Sub btnDownloadEmpl_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles btnDownloadEmpl.Click
            Dim emplTable As DataTable = SiteAccess.DownloadEmployee_H()
            Dim d As String = Format(Date.Now, "d")
            Dim ad() As String = d.Split("/")
            Dim fd As String = ad(0) & ad(1)
            Dim fn As String = "E_" & fd & ".csv"
            Response.ContentType = "text/csv"
            Response.AddHeader("Content-Disposition", "attachment; filename=" & fn)
            CreateCSVFile(emplTable, Response.Output)
            Response.Flush()
            Response.End()
            lblEmpl.Visible = True
        End Sub

    This code simply exports data from a DataTable to a CSV file. The problem is that lblEmpl.Visible = True never gets hit (Response.End ends the request), and even if I put that line at the top of the click event it executes fine but the page is never updated. How can I fix this?
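
    A single response can carry either the CSV or the re-rendered page, not both, which is why the label never changes no matter where the line sits. A common workaround (a sketch; DownloadEmployees.ashx is a hypothetical handler that would hold the CSV-writing code above) is to let the postback update the label and have the browser fetch the file in a second request:

        ' the postback re-renders the page, so the label becomes visible,
        ' and the startup script then pulls the CSV from a separate handler
        Protected Sub btnDownloadEmpl_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles btnDownloadEmpl.Click
            lblEmpl.Visible = True
            ClientScript.RegisterStartupScript(Me.GetType(), "dl", _
                "window.location = 'DownloadEmployees.ashx';", True)
        End Sub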

    Read the article
