Search Results

Search found 33650 results on 1346 pages for 'line break'.


  • How to design a command line program and keep it open for a future GUI?

    - by systempuntoout
    What are some best practices to keep in mind when developing a script that could be integrated with a GUI, probably by somebody else, in the future? Example scenario: 1) I develop a fancy Python CLI program that scrapes every unicorn image from the web; 2) I decide to publish it on GitHub; 3) a unicorn-fan programmer decides to take the sources and build a GUI on top of them; 4) he gives up because my code is a mess. How do I avoid step 4 and let the unicorn-fan programmer build his GUI without hassle?
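
    A minimal sketch of the usual way to keep a CLI GUI-ready: put the scraping logic in an importable function that raises exceptions instead of printing, and keep argument parsing and output in a thin entry point. Everything here (names like fetch_unicorn_urls, the example URL) is a placeholder, not from the question:

        # single file for brevity; in a real project the core function and the CLI
        # would live in separate modules so a GUI can import the core directly
        import argparse

        def fetch_unicorn_urls(query, limit=10):
            """Return a list of placeholder image URLs; raise on error instead of printing."""
            if limit < 1:
                raise ValueError("limit must be >= 1")
            return ["http://example.com/%s/%d.jpg" % (query, i) for i in range(limit)]

        def main():
            parser = argparse.ArgumentParser(description="Scrape unicorn images")
            parser.add_argument("query")
            parser.add_argument("--limit", type=int, default=10)
            args = parser.parse_args()
            for url in fetch_unicorn_urls(args.query, args.limit):
                print(url)  # printing stays in the CLI layer; a GUI calls fetch_unicorn_urls() itself

        if __name__ == "__main__":
            main()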

    Read the article

  • Haystack / Whoosh Index Generation Error

    - by Keith Fitzgerald
    I'm trying to set up Haystack with the Whoosh backend. When I try to generate the index (or run any index command, for that matter) I receive: TypeError: Item in ``from list'' not a string. If I completely remove my search_indexes.py I get the same error (so I'm guessing it can't find that file at all). What might cause this error? It's set to autodiscover, and I'm sure my app is installed because I'm currently using it. Full traceback: Traceback (most recent call last): File "./manage.py", line 17, in <module> execute_manager(settings) File "/Users/ghostrocket/Development/Redux/.dependencies/django/core/management/__init__.py", line 362, in execute_manager utility.execute() File "/Users/ghostrocket/Development/Redux/.dependencies/django/core/management/__init__.py", line 303, in execute self.fetch_command(subcommand).run_from_argv(self.argv) File "/Users/ghostrocket/Development/Redux/.dependencies/django/core/management/__init__.py", line 257, in fetch_command klass = load_command_class(app_name, subcommand) File "/Users/ghostrocket/Development/Redux/.dependencies/django/core/management/__init__.py", line 67, in load_command_class module = import_module('%s.management.commands.%s' % (app_name, name)) File "/Users/ghostrocket/Development/Redux/.dependencies/django/utils/importlib.py", line 35, in import_module __import__(name) File "/Users/ghostrocket/Development/Redux/.dependencies/haystack/__init__.py", line 124, in <module> handle_registrations() File "/Users/ghostrocket/Development/Redux/.dependencies/haystack/__init__.py", line 121, in handle_registrations search_sites_conf = __import__(settings.HAYSTACK_SITECONF) File "/Users/ghostrocket/Development/Redux/website/../website/search_sites.py", line 2, in <module> haystack.autodiscover() File "/Users/ghostrocket/Development/Redux/.dependencies/haystack/__init__.py", line 83, in autodiscover app_path = __import__(app, {}, {}, [app.split('.')[-1]]).__path__ TypeError: Item in ``from list'' not a string And here is my search_indexes.py: from haystack import indexes from haystack import site from myproject.models import * site.register(myobject)
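
    For what it's worth, on Python 2 this particular TypeError usually means __import__ was handed a non-str item (for example a unicode app name from INSTALLED_APPS) in its fromlist, which is exactly what the autodiscover call at the bottom of the traceback builds from the app labels. Separately, the conventional Haystack 1.x search_indexes.py registers a model together with a SearchIndex class rather than the bare model; a hedged sketch with a placeholder model name:

        # search_indexes.py -- conventional Haystack 1.x shape; MyModel is a placeholder
        from haystack import indexes, site
        from myproject.models import MyModel

        class MyModelIndex(indexes.SearchIndex):
            text = indexes.CharField(document=True, use_template=True)

        site.register(MyModel, MyModelIndex)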

    Read the article

  • Parsing xml files locally from assets folder using XmlPullParser

    - by Randolphg
    Im trying to parse a local xml file that I place in my assets folder. I've been trying to do this for almost a week now. Here is my test xml file Test1 Test2 Test3 Test4 Test5 I keep getting the same error: W/System.err(22458): org.xmlpull.v1.XmlPullParserException: unexpected type (position:TEXT Code: public void xmlParser() throws XmlPullParserException, IOException, ParserConfigurationException, SAXException { Log.d("tag", "xmlParsing...."); Arithmetic arthm = new Arithmetic(); XmlPullParserFactory xmlPF = XmlPullParserFactory.newInstance(); xmlPF.setValidating(false); XmlPullParser xml = xmlPF.newPullParser(); InputStream raw = getApplication().getAssets().open("menu.xml"); xml.setInput(raw, null); xml.nextTag(); Log.d("tag", "start parsing...."); String elementText = null; String elemName = null; int nofTags = 0; while (xml.getEventType() != XmlPullParser.END_DOCUMENT) { Log.d("tag", "while(xml.next)..."); switch (xml.getEventType()) { case XmlPullParser.START_DOCUMENT: Log.d("tag", "while (xml.getEventType() != XmlPullParser.END_DOCUMENT)"); break; case XmlPullParser.START_TAG: Log.d("tag", " case XmlPullParser.START_TAG"); elementText = xml.getName(); Log.d("tag", "elementText = " + elementText); if (xml.getEventType() != XmlPullParser.END_TAG) { xml.nextTag(); } break; case XmlPullParser.TEXT: Log.d("tag", "case TEXT"); if (elementText.equals("menu") && xml.isWhitespace()) { Log.d("tag", "<" + elementText + ">"); arthm.menu_name = xml.getText(); Log.d("tag", "value " + xml.getText() + " added"); } else if (elementText.equals("item")) { arthm.description = xml.getText(); Log.d("tag", "value " + xml.getText() + " added"); } else if (elementText.equals("SUBCATEGORY NAME")) { arthm.subcategoryDesc.add(xml.getText()); Log.d("tag", "value " + xml.getText() + " added"); } else if (elementText.equals("SUBCATEGORY DESC")) { arthm.subcategoryName.add(xml.getText()); Log.d("tag", "value " + xml.getText() + " added"); } break; case XmlPullParser.END_TAG: Log.d("tag", "case END_TAG"); nofTags += 1; String tags = Integer.toString(nofTags); Log.d("tags", elementText + " number of tags" + tags); if (xml.nextTag() != XmlPullParser.START_TAG) { xml.next(); } break; case XmlPullParser.END_DOCUMENT: Log.d("tag", "case END_DOCUMENT"); break; default: break; } } Log.d("tag", "Success!"); } Thanks in advance.

    Read the article

  • ruby-on-rails: update_attributes overrides model validations?

    - by cbrulak
    I have a typical Post model: class Post< ActiveRecord::Base validates_presence_of :user_id #Line 1 validates_presence_of :title,:body #Line 2 In the controller, I have: def create if request.post? if login_required @post = Post.new(params[:post]) #Line 3 @post .update_attribute("user_id",session[:userid]) #Line 4 However, if the validations on Line 2 fail, the Post will still be created unless Line 4 is commented out. 1) Why? 2) Suggestions on a fix? Thanks

    Read the article

  • How can I modify/merge Jinja2 dictionaries?

    - by Brian M. Hunt
    I have a Jinja2 dictionary and I want a single expression that modifies it - either by changing its content, or merging with another dictionary. >>> import jinja2 >>> e = jinja2.Environment() Modify a dict: Fails. >>> e.from_string("{{ x[4]=5 }}").render({'x':{1:2,2:3}}) Traceback (most recent call last): File "<stdin>", line 1, in <module> File "jinja2/environment.py", line 743, in from_string return cls.from_code(self, self.compile(source), globals, None) File "jinja2/environment.py", line 469, in compile self.handle_exception(exc_info, source_hint=source) File "<unknown>", line 1, in template jinja2.exceptions.TemplateSyntaxError: expected token 'end of print statement', got '=' Two-stage update: Prints superfluous "None". >>> e.from_string("{{ x.update({4:5}) }} {{ x }}").render({'x':{1:2,2:3}}) u'None {1: 2, 2: 3, 4: 5}' >>> e.from_string("{{ dict(x.items()+ {3:4}.items()) }}").render({'x':{1:2,2:3}}) Traceback (most recent call last): File "<stdin>", line 1, in <module> File "jinja2/environment.py", line 868, in render return self.environment.handle_exception(exc_info, True) File "<template>", line 1, in top-level template code TypeError: <lambda>() takes exactly 0 arguments (1 given) Use dict(x,**y): Fails. >>> e.from_string("{{ dict((3,4), **x) }}").render({'x':{1:2,2:3}}) Traceback (most recent call last): File "<stdin>", line 1, in <module> File "jinja2/environment.py", line 868, in render return self.environment.handle_exception(exc_info, True) File "<template>", line 1, in top-level template code TypeError: call() keywords must be strings So how does one modify the dictionary x in Jinja2 by changing an attribute or merging with another dictionary? This question is similar to: How can I merge two Python dictionaries as a single expression? -- insofar as Jinja2 and Python are analogous.
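
    One pattern that sidesteps the template-syntax limitation is to register a small custom filter on the Environment and do the merge in Python; "merge" below is my own filter name, not a Jinja2 built-in:

        import jinja2

        def merge(a, b):
            """Return a new dict with b's items layered over a's (works for non-string keys too)."""
            d = dict(a)
            d.update(b)
            return d

        e = jinja2.Environment()
        e.filters['merge'] = merge                   # env.filters is a plain dict of callables
        tmpl = e.from_string("{{ x|merge({4:5}) }}")
        print(tmpl.render({'x': {1: 2, 2: 3}}))      # prints {1: 2, 2: 3, 4: 5}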

    Read the article

  • Binary file email attachment problem

    - by Alan Harris-Reid
    Hi there, Using Python 3.1.2 I am having a problem sending binary attachment files (jpeg, pdf, etc.) - MIMEText attachments work fine. The code in question is as follows... for file in self.attachments: part = MIMEBase('application', "octet-stream") part.set_payload(open(file,"rb").read()) encoders.encode_base64(part) part.add_header('Content-Disposition', 'attachment; filename="%s"' % file) msg.attach(part) # msg is an instance of MIMEMultipart() server = smtplib.SMTP(host, port) server.login(username, password) server.sendmail(from_addr, all_recipients, msg.as_string()) However, way down in the calling-stack (see traceback below), it looks as though msg.as_string() has received an attachment which creates a payload of 'bytes' type instead of string. Has anyone any idea what might be causing the problem? Any help would be appreciated. Alan builtins.TypeError: string payload expected: File "c:\Dev\CommonPY\Scripts\email_send.py", line 147, in send server.sendmail(self.from_addr, all_recipients, msg.as_string()) File "c:\Program Files\Python31\Lib\email\message.py", line 136, in as_string g.flatten(self, unixfrom=unixfrom) File "c:\Program Files\Python31\Lib\email\generator.py", line 76, in flatten self._write(msg) File "c:\Program Files\Python31\Lib\email\generator.py", line 101, in _write self._dispatch(msg) File "c:\Program Files\Python31\Lib\email\generator.py", line 127, in _dispatch meth(msg) File "c:\Program Files\Python31\Lib\email\generator.py", line 181, in _handle_multipart g.flatten(part, unixfrom=False) File "c:\Program Files\Python31\Lib\email\generator.py", line 76, in flatten self._write(msg) File "c:\Program Files\Python31\Lib\email\generator.py", line 101, in _write self._dispatch(msg) File "c:\Program Files\Python31\Lib\email\generator.py", line 127, in _dispatch meth(msg) File "c:\Program Files\Python31\Lib\email\generator.py", line 155, in _handle_text raise TypeError('string payload expected: %s' % type(payload))
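
    A workaround often suggested for the Python 3.1-era email package (which trips over bytes payloads when flattening the message to text) is to base64-encode the data yourself and attach it as an ASCII string, skipping encoders.encode_base64 entirely. A hedged sketch of a replacement for the loop body, not a drop-in for the whole script:

        import base64
        import os
        from email.mime.base import MIMEBase

        def attach_binary(msg, path):
            """Attach the file at `path` to the MIMEMultipart `msg` as a pre-encoded base64 part."""
            with open(path, 'rb') as f:
                data = f.read()
            part = MIMEBase('application', 'octet-stream')
            part.set_payload(base64.b64encode(data).decode('ascii'))  # str payload, already base64
            part.add_header('Content-Transfer-Encoding', 'base64')
            part.add_header('Content-Disposition', 'attachment', filename=os.path.basename(path))
            msg.attach(part)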

    Read the article

  • How to bulk insert from CSV when some fields contain newline characters?

    - by z-boss
    I have a CSV dump from another DB that looks like this (id, name, notes): 1001,John Smith,15 Main Street 1002,Jane Smith,"2010 Rockliffe Dr. Pleasantville, IL USA" 1003,Bill Karr,2820 West Ave. The last field may contain carriage returns and commas, in which case it is surrounded by double quotes. I use this code to import the CSV into my table: BULK INSERT CSVTest FROM 'c:\csvfile.csv' WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' ) SQL Server 2005's bulk insert cannot figure out that carriage returns inside quotes are not row terminators. How can I overcome this?
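
    BULK INSERT itself has no notion of quoted fields (at least through SQL Server 2005), so one common workaround is to pre-process the file with a CSV-aware tool into an unambiguous delimiter and row terminator, then bulk insert the cleaned file. A hedged sketch of that pre-processing step (the pipe delimiter and the choice to flatten embedded newlines to spaces are arbitrary assumptions):

        import csv

        # Python's csv module understands quoted fields that contain commas and line breaks.
        with open('csvfile.csv', newline='') as src, open('csvfile.clean.txt', 'w', newline='') as dst:
            for row in csv.reader(src):
                # flatten embedded newlines so '\n' can stay the ROWTERMINATOR on the SQL side;
                # assumes no field ever contains the '|' delimiter
                cleaned = [field.replace('\r\n', ' ').replace('\n', ' ') for field in row]
                dst.write('|'.join(cleaned) + '\n')

    The BULK INSERT statement would then point at the cleaned file with FIELDTERMINATOR = '|'.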

    Read the article

  • Nginx + Haproxy + Thin + Rails - 503 Service Unavailable -

    - by Luca G. Soave
    I don't know how troubleshoot this. I get "503 Service Unavailable" http error for all "nginx upstreams" proxy passing calls to haproxy fast_thin and slow_thin ( server 127.0.0.1:3100 and server 127.0.0.1:3200 ), which loadbalance on 6 Thin servers ( 127.0.0.1:3000 .. 3005 ). Static files like /blog are currently fine. The falldown is: nginx on port 80 - haproxy on 3100 and 3200 - thin on 3000 .. 3005 and then Rails. Here it is /etc/nginx/nginx.conf : user nginx; worker_processes 2; pid /var/run/nginx.pid; events { worker_connections 1024; } http { include /etc/nginx/mime.types; default_type application/octet-stream; log_format main '$remote_addr - $remote_user [$time_local] "$request" ' '$status $body_bytes_sent "$http_referer" ' '"$http_user_agent" "$http_x_forwarded_for"'; sendfile on; tcp_nopush on; keepalive_timeout 65; tcp_nodelay on; include /etc/nginx/conf.d/*.conf; } then /etc/nginx/conf.d/default.conf upstream fast_thin { server 127.0.0.1:3100; } upstream slow_thin { server 127.0.0.1:3200; } server { listen 80; server_name www.gitwatcher.com; rewrite ^/(.*) http://gitwatcher.com/$1 permanent; } server { listen 80; server_name gitwatcher.com; access_log /var/www/gitwatcher/log/access.log; error_log /var/www/gitwatcher/log/error.log; root /var/www/gitwatcher/public; # index index.html; location /about { proxy_pass http://fast_thin; break; } location /trends { proxy_pass http://slow_thin; break; } location /categories { proxy_pass http://slow_thin; break; } location /signout { proxy_pass http://slow_thin; break; } location /auth/github { proxy_pass http://slow_thin; break; } location / { proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header Host $http_host; proxy_redirect off; if (-f $request_filename/index.html) { rewrite (.*) $1/index.html break; } if (-f $request_filename.html) { rewrite (.*) $1.html break; } if (!-f $request_filename) { proxy_pass http://slow_thin; break; } } } then haproxy config file /etc/haproxy/haproxy.cfg : global log 127.0.0.1 local0 log 127.0.0.1 local1 notice #log loghost local0 info maxconn 4096 #chroot /usr/share/haproxy user haproxy group haproxy daemon #debug #quiet nbproc 1 # number of processing cores defaults log global retries 3 maxconn 2000 contimeout 5000 mode http clitimeout 60000 # maximum inactivity time on the client side srvtimeout 30000 # maximum inactivity time on the server side timeout connect 4000 # maximum time to wait for a connection attempt to a server to succeed option httplog option dontlognull option redispatch option httpclose # disable keepalive (HAProxy does not yet support the HTTP keep-alive mode) option abortonclose # enable early dropping of aborted requests from pending queue option httpchk # enable HTTP protocol to check on servers health option forwardfor # enable insert of X-Forwarded-For headers balance roundrobin # each server is used in turns, according to assigned weight stats enable # enable web-stats at /haproxy?stats stats auth haproxy:pr0xystats # force HTTP Auth to view stats stats refresh 5s # refresh rate of stats page listen rails_proxy 127.0.0.1:3100 # - equal weights on all servers # - maxconn will queue requests at HAProxy if limit is reached # - minconn dynamically scales the connection concurrency (bound my maxconn) depending on size of HAProxy queue # - check health every 20000 microseconds server web1 127.0.0.1:3000 weight 1 minconn 3 maxconn 6 check inter 20000 server web1 127.0.0.1:3001 weight 1 minconn 3 maxconn 6 check inter 
20000 server web1 127.0.0.1:3002 weight 1 minconn 3 maxconn 6 check inter 20000 listen slow_proxy 127.0.0.1:3200 # cluster for slow requests, lower the queues, check less frequently server slow1 127.0.0.1:3003 weight 1 minconn 1 maxconn 3 check inter 40000 server slow2 127.0.0.1:3004 weight 1 minconn 1 maxconn 3 check inter 40000 server slow3 127.0.0.1:3005 weight 1 minconn 1 maxconn 3 check inter 40000 and the Thin config file /etc/thin/gitwatcher.yml : --- chdir: /var/www/gitwatcher environment: production address: 0.0.0.0 port: 3000 timeout: 30 log: log/thin.log pid: tmp/pids/thin.pid max_conns: 1024 max_persistent_conns: 100 require: [] wait: 30 servers: 6 daemonize: true if I look into open listen ports, I got the following : root@fullness:/var/www/gitwatcher# lsof | grep TCP | egrep "nginx|haproxy|thin" nginx 834 root 8u IPv4 921 0t0 TCP *:http (LISTEN) nginx 835 nginx 8u IPv4 921 0t0 TCP *:http (LISTEN) nginx 837 nginx 8u IPv4 921 0t0 TCP *:http (LISTEN) haproxy 1908 haproxy 4u IPv4 11699 0t0 TCP localhost:3100 (LISTEN) haproxy 1908 haproxy 6u IPv4 11701 0t0 TCP localhost:3200 (LISTEN) root@fullness:/var/www/gitwatcher# iptables -L get me the following : Chain INPUT (policy DROP) target prot opt source destination ACCEPT all -- anywhere anywhere state RELATED,ESTABLISHED ACCEPT all -- anywhere anywhere state RELATED,ESTABLISHED ACCEPT tcp -- anywhere anywhere tcp dpt:22222 ACCEPT tcp -- anywhere anywhere tcp dpt:http ACCEPT tcp -- anywhere anywhere tcp dpt:https ACCEPT all -- anywhere anywhere DROP all -- anywhere anywhere Chain FORWARD (policy ACCEPT) target prot opt source destination Chain OUTPUT (policy ACCEPT) target prot opt source destination ACCEPT all -- anywhere anywhere Any help ?

    Read the article

  • Is there a way in PHP to keep a file open and process each line and subsequent lines?

    - by Chris Denman
    I want to write a PHP script that keeps the apache_log file open, "listens" to the log, and deals with each log entry as it happens. I'm thinking that I need to open the file, get the number of lines, then do this again in a loop - and when the size is different, read the new lines and process them. Am I barking up the wrong tree, or is there a silly easy solution that I have missed? Chris
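
    The underlying pattern is a "tail -f" loop: remember where you are in the file, sleep, and read whatever has been appended since, rather than re-counting lines each pass. Sketched in Python here to show the shape of the loop (the PHP translation with fopen/fseek/fgets/sleep is mechanical); note that it does not handle log rotation:

        import os
        import time

        def follow(path, poll_interval=1.0):
            """Yield lines appended to `path`, like `tail -f`."""
            with open(path, 'r') as f:
                f.seek(0, os.SEEK_END)        # start at the end; drop this to process existing lines too
                while True:
                    line = f.readline()
                    if not line:              # nothing new yet
                        time.sleep(poll_interval)
                        continue
                    yield line.rstrip('\n')

        # usage sketch:
        # for entry in follow('/var/log/apache2/access.log'):
        #     handle(entry)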

    Read the article

  • How do I set up Ninject? (I'm getting can't resolve "Bind" in the line "Bind<IWeapon>().To<Sword>()")

    - by Greg
    Hi, I'm getting confused by the documentation about how I should be setting up Ninject. I'm seeing different ways of doing it, some v2 versus v1 confusion probably included... Question: what is the best way to set things up for Ninject in my WinForms application (i.e. what are the few lines of code required)? I'm assuming this would go into the MainForm Load method. In other words, what code do I have to have prior to getting to: Bind<IWeapon>().To<Sword>(); I have the following code, so effectively I just want clarification on the setup and bind code that would be required in my MainForm.Load() to end up with a concrete Samurai instance: internal interface IWeapon { void Hit(string target); } class Sword : IWeapon { public void Hit(string target) { Console.WriteLine("Chopped {0} clean in half", target); } } class Samurai { private IWeapon _weapon; [Inject] public Samurai(IWeapon weapon) { _weapon = weapon; } public void Attack(string target) { _weapon.Hit(target); } } Thanks. PS: I've tried the following code, however I can't resolve "Bind". Where does this come from? What DLL or "using" statement would I be missing? private void MainForm_Load(object sender, EventArgs e) { Bind<IWeapon>().To<Sword>(); // <== *** CAN NOT RESOLVE Bind *** IKernel kernel = new StandardKernel(); var samurai = kernel.Get<Samurai>();

    Read the article

  • Better viewing of postfix mail queue files than postcat?

    - by Geekman
    So I got a call early this morning about a client needing to see what email they have waiting to be delivered on our secondary mail server. Their link to the main server had been (and still is) down for two days and they needed to see their email. So I wrote up a quick Perl script that uses mailq in combination with postcat to dump each email for their address into separate files, tar'd it up and sent it off. Horrible code, I know, but it was urgent. My solution works OK in that it at least gives a raw view, but I thought tonight it would be nice if I had a solution where I could provide their email attachments and maybe remove some "garbage" header text as well. Most of the important emails seem to have a PDF or similar attached. I've been looking around, but the only method of viewing queue files I can see is the postcat command, and I really don't want to write my own parser - so I was wondering if any of you have already done so, or know of a better command to use? Here's the code for my current solution: #!/usr/bin/perl $qCmd="mailq | grep -B 2 \"someemailaddress@isp\" | cut -d \" \" -f 1"; @data = split(/\n/, `$qCmd`); $i = 0; foreach $line (@data) { $i++; $remainder = $i % 2; if ($remainder == 0) { next; } if ($line =~ /\(/ || $line =~ /\n/ || $line eq "") { next; } print "Processing: " . $line . "\n"; `postcat -q $line > $line.email.txt`; $subject=`cat $line.email.txt | grep "Subject:"`; #print "SUB" . $subject; #`cat $line.email.txt > \"$subject.$line.email.txt\"`; } Any advice appreciated.
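
    For the "nicer than raw postcat" part, one low-effort option is to run each dumped message through Python's email module, which already knows how to split headers, undo base64/quoted-printable, and pull out attachments. A hedged sketch that works on the per-queue-ID files the script above already writes; depending on the Postfix version, postcat's section banners may need to be trimmed first for clean parsing:

        import email

        def extract_attachments(message_file):
            """Print a dumped message's subject and save any named attachments next to it."""
            with open(message_file, 'r', errors='replace') as f:
                msg = email.message_from_file(f)
            print(message_file, '-', msg.get('Subject', '(no subject)'))
            for i, part in enumerate(msg.walk()):
                if part.is_multipart():
                    continue
                filename = part.get_filename()
                if filename:                                  # a named attachment (PDF etc.)
                    with open('%s.%d.%s' % (message_file, i, filename), 'wb') as out:
                        out.write(part.get_payload(decode=True) or b'')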

    Read the article

  • Can printf change its parameters?

    - by martani_net
    EDIT: complete code with main is here http://codepad.org/79aLzj2H and once again this is where the weird behaviour is happening: for (i = 0; i<tab_size; i++) { //CORRECT OUTPUT printf("%s\n", tableau[i].capitale); printf("%s", tableau[i].pays); printf("%s\n", tableau[i].commentaire); //WRONG OUTPUT //printf("%s --- %s --- %s |\n", tableau[i].capitale, tableau[i].pays, tableau[i].commentaire); } I have an array of the following structure: struct T_info { char capitale[255]; char pays[255]; char commentaire[255]; }; struct T_info *tableau; This is how the array is populated: int advance(FILE *f) { char c; c = getc(f); if(c == '\n') return 0; while(c != EOF && (c == ' ' || c == '\t')) { c = getc(f); } return fseek(f, -1, SEEK_CUR); } int get_word(FILE *f, char * buffer) { char c; int count = 0; int space = 0; while((c = getc(f)) != EOF) { if (c == '\n') { buffer[count] = '\0'; return -2; } if ((c == ' ' || c == '\t') && space < 1) { buffer[count] = c; count ++; space++; } else { if (c != ' ' && c != '\t') { buffer[count] = c; count ++; space = 0; } else /* more than one space*/ { advance(f); break; } } } buffer[count] = '\0'; if(c == EOF) return -1; return count; } void fill_table(FILE *f,struct T_info *tab) { int line = 0, column = 0; fseek(f, 0, SEEK_SET); char buffer[MAX_LINE]; char c; int res; int i = 0; while((res = get_word(f, buffer)) != -999) { switch(column) { case 0: strcpy(tab[line].capitale, buffer); column++; break; case 1: strcpy(tab[line].pays, buffer); column++; break; default: strcpy(tab[line].commentaire, buffer); column++; break; } /*if I printf each one alone here, everything works ok*/ //last word in line if (res == -2) { if (column == 2) { strcpy(tab[line].commentaire, " "); } //wrong output here printf("%s -- %s -- %s\n", tab[line].capitale, tab[line].pays, tab[line].commentaire); column = 0; line++; continue; } column = column % 3; if (column == 0) { line++; } /*EOF reached*/ if(res == -1) return; } return ; } Edit: trying this printf("%s -- ", tab[line].capitale); printf("%s --", tab[line].pays); printf("%s --\n", tab[line].commentaire); gives me this result: -- --abi -- Emirats arabes unis I expect to get: Abu Dhabi -- Emirats arabes unis -- Am I missing something?

    Read the article

  • Open a new tab in gnome-terminal using command line.

    - by Vikrant Chaudhary
    Hi, When I write gnome-terminal --tab at the terminal, I expect it to open a new tab in the same terminal window. But it opens a new window instead. I found out that its intention is to open a new tab in a new window, i.e., if I write gnome-terminal --tab --tab it will open a new window with two tabs. So, the question is, how can I open a new tab in the current window using a command in gnome-terminal? I'm using Ubuntu 9.04 x64.

    Read the article

  • An autoformat tool to automatically insert braces around single-line clauses?

    - by Brian Deacon
    So assuming that in our work environment we've decided that the One True Faith forbids this: if (something) doSomething(); And instead we want: if (something) { doSomething(); } Is there a tool that will do that automagically? Awesomest would be to have Eclipse just do it, but since we're really just recovering from one ex-employee who thought he was too pretty for coding conventions, a less-friendly tool that would do it right just once would suffice.

    Read the article

  • What is the command line syntax to embed MPlayer in an app window in OS X?

    - by Chip
    I know the window ID, and I'm trying things like: mplayer -ontop -slave -quiet -wid 471165040 /t.mov mplayer -ontop -slave -quiet --window=471165040 /t.mov but nothing is working (I know the window ID is correct). Otherwise, the mplayer CLI is working great in its own window; I just can't get it to play embedded. Can anyone point me to a syntax example of embedding mplayer in an app on OS X? Thanks!

    Read the article

  • How do I convert tuple of tuples to list in one line (pythonic)?

    - by ThinkCode
    query = 'select mydata from mytable' cursor.execute(query) myoutput = cursor.fetchall() print myoutput (('aa',), ('bb',), ('cc',)) Why is cursor.fetchall returning a tuple of tuples instead of a plain tuple, since my query asks for only one column of data? What is the best way of converting it to ['aa', 'bb', 'cc']? I can do something like this: mylist = [] myoutput = list(myoutput) for each in myoutput: mylist.append(each[0]) I am sure this isn't the best way of doing it. Please enlighten me!
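
    DB-API cursors typically return one tuple per row even for single-column queries, which is why fetchall() gives a tuple of 1-tuples. The idiomatic one-liner for the conversion is a list comprehension over the rows:

        myoutput = (('aa',), ('bb',), ('cc',))   # shape returned by fetchall()
        mylist = [row[0] for row in myoutput]    # take the first (only) column of each row
        print(mylist)                            # ['aa', 'bb', 'cc']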

    Read the article

  • How to insert a new line in bash shell for a variable?

    - by puspa
    I have two variables, var1 and var2. The contents of each variable come from a bash shell grep command. echo $var1 prints 123 465 326 8080 and echo $var2 prints sila kiran hinal juku. Now I want to print the above in the following format in the Linux bash shell: 123 sila 465 kiran 326 hinal 8080 juku. So how can I print it this way in bash?

    Read the article

  • How to recalculate all-pairs shortest paths on-line if nodes are getting removed?

    - by Pavel Shved
    The latest news about the underground bombing made me curious about the following problem. Assume we have a weighted undirected graph whose nodes are sometimes removed. The problem is to re-calculate the shortest paths between all pairs of nodes quickly after such removals. With a simple modification of the Floyd-Warshall algorithm we can calculate the shortest paths between all pairs. These paths may be stored in a table, where shortest[i][j] contains the index of the next node on the shortest path between i and j (or a NULL value if there's no path). The algorithm requires O(n³) time to build the table, and each query shortest(i,j) takes O(1). Unfortunately, we would have to re-run this algorithm after each removal. The other alternative is to run a graph search on each query. That way each removal takes zero time to update an auxiliary structure (because there is none), but each query takes O(E) time. What algorithm can be used to "balance" the query and update times for the all-pairs shortest-paths problem when nodes of the graph are being removed?
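
    For reference, a minimal sketch of the baseline table the question describes: Floyd-Warshall with a next-hop matrix, so a path query is just a walk along next-hop pointers. This only restates the static O(n³) preprocessing, not a solution to the dynamic-removal part:

        INF = float('inf')

        def floyd_warshall(n, edges):
            """edges: {(u, v): weight} for an undirected graph on nodes 0..n-1.
            Returns (dist, nxt), where nxt[i][j] is the next node after i on a shortest i->j path."""
            dist = [[0 if i == j else INF for j in range(n)] for i in range(n)]
            nxt = [[None] * n for _ in range(n)]
            for (u, v), w in edges.items():
                dist[u][v] = dist[v][u] = w
                nxt[u][v] = v
                nxt[v][u] = u
            for k in range(n):
                for i in range(n):
                    for j in range(n):
                        if dist[i][k] + dist[k][j] < dist[i][j]:
                            dist[i][j] = dist[i][k] + dist[k][j]
                            nxt[i][j] = nxt[i][k]
            return dist, nxt

        def shortest(nxt, i, j):
            """Reconstruct the node sequence of a shortest i->j path, or None if unreachable."""
            if i != j and nxt[i][j] is None:
                return None
            path = [i]
            while i != j:
                i = nxt[i][j]
                path.append(i)
            return path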

    Read the article

  • Split large text string into variable length strings without breaking words and keeping linebreaks a

    - by Frank
    I am trying to break a large string of text into several smaller strings and give each smaller string a different maximum length. For example: "The quick brown fox jumped over the red fence. The blue dog dug under the fence." I would like code that can split this into smaller lines, with the first line having a max of 5 characters, the second line a max of 11, and the rest a max of 20, resulting in this: Line 1: The Line 2: quick brown Line 3: fox jumped over the Line 4: red fence. Line 5: The blue dog Line 6: dug under the fence. Is all this possible in C# or MSSQL?
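
    The splitting itself is a greedy word wrap where the character limit changes per line; sketched in Python below (the question asks for C# or T-SQL, and the loop ports to C# almost verbatim). Note that a plain greedy fill packs words as tightly as the limits allow, so lines 4-6 come out slightly differently from the sample unless an extra rule (such as breaking after sentence-ending punctuation) is added:

        def wrap_variable(text, limits, default_limit=20):
            """Greedy word wrap: line i gets limits[i] characters, later lines get default_limit.
            A word longer than its line's limit ends up alone on an over-long line."""
            lines, current = [], ''
            for word in text.split():
                limit = limits[len(lines)] if len(lines) < len(limits) else default_limit
                candidate = word if not current else current + ' ' + word
                if len(candidate) <= limit:
                    current = candidate
                else:
                    if current:
                        lines.append(current)
                    current = word
            if current:
                lines.append(current)
            return lines

        text = ("The quick brown fox jumped over the red fence. "
                "The blue dog dug under the fence.")
        for i, line in enumerate(wrap_variable(text, [5, 11]), 1):
            print('Line %d: %s' % (i, line))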

    Read the article
