Search Results

Search found 10256 results on 411 pages for 'elastic map reduce'.

Page 20/411 | < Previous Page | 16 17 18 19 20 21 22 23 24 25 26 27  | Next Page >

  • How taxing would a game map grid be to a web browser?

    - by Vilx-
    Suppose we're making a strategy game (think Civilization) in a web browser. The game has a visible map portion - say 30x30 squares. Each square is 30x30px and has several overlaid images - the terrain, resources, units, roads, etc. The classical way of drawing this would be a huge <table> where each cell contains absolutely positioned images. It would probably be rendered in JavaScript to reduce traffic. But that is still several thousand images and a huge table. Can the browser handle it, or will performance drop below acceptable limits? Alternatively I could keep a pre-rendered map image with as many overlays as possible baked in, but that would be more work, I think.

    Read the article

  • How to create a 2D map of a room from a few images/movie frames?

    - by Nils
    I'd like to create a simple 2D map of a room by taking pictures (of the ceiling) in all directions (360°, e.g. movie frames), recognizing the walls by edge detection, removing other unwanted objects, stitching the images together at the right positions (cf. walls, panorama), and finally producing an approximate 2D map (as seen from above). Recovering the scale would be another parameter, which might be useful. I have some ideas of my own at the moment, e.g. using the Sobel operator, but it would be interesting if somebody out there knows of a project or piece of software (GPL/freeware preferred) that already does this, as I'm still looking for examples that might help me. Thanks.

    Read the article

  • Including a std::map within a struct? Is it OK?

    - by user553514
        class X_class {
        public:
            struct extra { int extra1; int extra2; int extra3; };
            enum a { n, m };
            struct x_struct {
                char b;
                char c;
                int d;
                int e;
                std::map<int, extra> myExtraMap;
            };
        };

    In my code I define: x_struct myStruct; Why do I get compile errors when compiling the above class? The error says either: 1) "expected ; before <" on the line where I defined the map (above), if I eliminate std::, or 2) "error: invalid use of ::" and "error: expected ; before < token".

    Read the article

  • Python/YACC: Resolving a shift/reduce conflict

    - by Rosarch
    I'm using PLY. Here is one of my states from parser.out:

        state 3

            (5) course_data -> course .
            (6) course_data -> course . course_list_tail
            (3) or_phrase -> course . OR_CONJ COURSE_NUMBER
            (7) course_list_tail -> . , COURSE_NUMBER
            (8) course_list_tail -> . , COURSE_NUMBER course_list_tail

          ! shift/reduce conflict for OR_CONJ resolved as shift

            $end               reduce using rule 5 (course_data -> course .)
            OR_CONJ            shift and go to state 7
            ,                  shift and go to state 8

          ! OR_CONJ            [ reduce using rule 5 (course_data -> course .) ]

            course_list_tail   shift and go to state 9

    I want to resolve this as:

        if OR_CONJ is followed by COURSE_NUMBER:
            shift and go to state 7
        else:
            reduce using rule 5 (course_data -> course .)

    How can I fix my parser file to reflect this? Do I need to handle a syntax error by backtracking and trying a different rule?

    Read the article

  • Hadoop reduce task gets hung

    - by user806098
    I set up a Hadoop cluster with 4 nodes. When running a map-reduce task, the map tasks finish quickly, while the reduce task hangs at 27%. I checked the log; it shows that the reduce task fails to fetch map output from the map nodes. The job tracker log on the master shows messages like this:

        2011-06-27 19:55:14,748 INFO org.apache.hadoop.mapred.JobTracker: Adding task (REDUCE) 'attempt_201106271953_0001_r_000000_0' to tip task_201106271953_0001_r_000000, for tracker 'tracker_web30.bbn.com.cn:localhost/127.0.0.1:56476'

    And the name node log on the master shows messages like this:

        2011-06-27 14:00:52,898 INFO org.apache.hadoop.ipc.Server: IPC Server handler 4 on 54310, call register(DatanodeRegistration(202.106.199.39:50010, storageID=DS-1989397900-202.106.199.39-50010-1308723051262, infoPort=50075, ipcPort=50020)) from 192.168.225.19:16129: error: java.io.IOException: verifyNodeRegistration: unknown datanode 202.106.199.39:50010

    However, neither "web30.bbn.com.cn" nor 202.106.199.39/202.106.199.3 is a slave node. I think such IPs/domains appear because Hadoop fails to resolve a node (first against the intranet DNS server), then goes to a higher-level DNS server and eventually to the top, still fails, and then these "junk" IPs/domains are returned. But I checked my config, and it goes like this:

        /etc/hosts:
            127.0.0.1       localhost.localdomain localhost
            ::1             localhost6.localdomain6 localhost6
            192.168.225.16  master
            192.168.225.66  slave1
            192.168.225.20  slave5
            192.168.225.17  slave17

        conf/core-site.xml:
            hadoop.tmp.dir   /root/hadoop_tmp/hadoop_${user.name}
            fs.default.name  hdfs://master:54310
            io.sort.mb       1024

        hdfs-site.xml:
            dfs.replication  3

        masters:
            master

        slaves:
            master
            slave1
            slave5
            slave17

    Also, all firewalls (iptables) are turned off, and ssh between every two nodes works. So I don't know exactly where the error comes from. Please help. Thanks a lot.

    Read the article

  • Amazon Elastic MapReduce: Exception from FileSystem

    - by S.N
    Hi, I run my application using the Ruby client:

        ruby elastic-mapreduce -j j-20PEKMT9BRSUC --jar s3n://sakae55/lib/edu.cit.som.jar --main-class edu.cit.som.hadoop.SOMDriver --arg s3n://sakae55/repository/input/ecoli/ --arg s3n://sakae55/repository/output/ecoli/pl/ --arg s3n://sakae55/repository/data/ecoli/som.txt

    Then I am seeing the following error:

        java.lang.IllegalArgumentException: This file system object (file:///) does not support access to the request path 'hdfs://ip-10-195-207-230.ec2.internal:9000/mnt/var/lib/hadoop/tmp/mapred/system/job_201004221221_0017/job.jar' You possibly called FileSystem.get(conf) when you should have called FileSystem.get(uri, conf) to obtain a file system supporting your path.
            at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:320)
            at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:52)
            at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:416)
            at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:259)
            at org.apache.hadoop.fs.FileSystem.isDirectory(FileSystem.java:676)
            at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:200)
            at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1184)
            at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1160)
            at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1132)
            at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:662)
            at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:729)
            at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1026)
            at edu.cit.som.hadoop.SOMDriver.runIteration(SOMDriver.java:106)
            at edu.cit.som.hadoop.SOMDriver.train(SOMDriver.java:69)
            at edu.cit.som.hadoop.SOMDriver.run(SOMDriver.java:52)
            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
            at edu.cit.som.hadoop.SOMDriver.main(SOMDriver.java:36)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
            at java.lang.reflect.Method.invoke(Method.java:597)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
            at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
            at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)

    I am not sure why the error refers to "file:///" even though none of the arguments I pass use that scheme. (A sketch of the FileSystem.get(uri, conf) call pattern the message suggests follows this item.)

    Read the article
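
    The exception text above points at its own fix: ask for a FileSystem bound to the path's scheme instead of the configured default. Below is a minimal, hedged Java sketch of that call pattern only; the hdfs:// host and the class/variable names are placeholders, not taken from the job above.

        import java.net.URI;
        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class FileSystemLookupSketch {
            public static void main(String[] args) throws Exception {
                Configuration conf = new Configuration();

                // FileSystem.get(conf) returns the *default* file system (often file:///),
                // which cannot serve hdfs:// or s3n:// paths and triggers the error above.
                Path input = new Path("s3n://sakae55/repository/input/ecoli/");

                // Either derive the file system from the path itself...
                FileSystem fromPath = input.getFileSystem(conf);

                // ...or pass the URI explicitly (placeholder namenode address):
                FileSystem fromUri = FileSystem.get(URI.create("hdfs://namenode:9000/"), conf);

                System.out.println(fromPath.getUri() + " vs " + fromUri.getUri());
            }
        }

    Either form avoids binding an s3n:// or hdfs:// path to the local file system.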

  • Will An External Audio Interface Reduce CPU Load?

    - by Yar
    I am considering buying a very-low-latency audio interface like this one. One question is whether it will reduce CPU load (I'm at about 60%+, and my MacBook has a 2.4 GHz CPU and 4 GB of RAM) during intensive audio processing. If the answer is "yes", how would it compare to a different, cheaper FireWire audio interface? My thought is that offloading the processing gives the same gain regardless of the interface.

    Read the article

  • Does BitLocker reduce write reliability?

    - by Unsigned
    For the purposes of this question, BitLocker refers to the BitLocker To Go variety on a disk with write caching disabled. NTFS supports metadata journaling, which, although not completely failsafe, does mitigate certain types of potential filesystem errors. Assuming an NTFS volume is protected with BitLocker, does this reduce its failure tolerance? Would an NTFS volume protected with BitLocker be more prone to corruption after a power failure during a write than an unencrypted NTFS volume?

    Read the article

  • Mongo Map Reduce first time

    - by James
    Hello guys, first-time Map/Reduce user here, using MongoDB. I have a lot of page visit data which I'd like to make some sense of by using Map/Reduce. Below is basically what I want to do, but as a total beginner at Map/Reduce, I think this is above my knowledge!

    1. Find all visits to the current page where external = true within the last 30 days (unix timestamps; I deal with the date ranges in PHP and then build the array, not Mongo dates).
    2. Group all visits by referral location.
    3. For each referral location, calculate how many visitors then went on to visit a page which has a certain word in its [tags].

    I'm using the normal Mongo PHP extension, if that has an impact.

    Read the article

  • MapItemsControls not updating Silverlight Bing Map

    - by Matt
    I'm using a MapItemsControl to manage my Pushpin items within my Bing Silverlight map. Right on page load, I add a new pin programmatically, and the pin shows up on the map. However, I've now taken it further and I'm adding pins to my data source via a click on the map. The new pins are added to my data source, but do not show up on the map. Do I need to rebind my data source to my map control or somehow refresh the data source? Here's some code:

        <UserControl.Resources>
            <DataTemplate x:Key="PinData">
                <m:Pushpin Location="{Binding Location}" PositionOrigin="BottomCenter"
                           Width="Auto" Height="Auto" Cursor="Hand">
                    <m:Pushpin.Template>
                        <ControlTemplate>
                            <Grid>
                                <myTestApp:MasterPin DataContext="{Binding}"/>
                            </Grid>
                        </ControlTemplate>
                    </m:Pushpin.Template>
                </m:Pushpin>
            </DataTemplate>
        </UserControl.Resources>

        <Grid x:Name="LayoutRoot" Background="White">
            <m:Map x:Name="myMap" CredentialsProvider="" Mode="Road" ScaleVisibility="Collapsed">
                <m:MapItemsControl x:Name="mapItems" ItemTemplate="{StaticResource PinData}"/>
            </m:Map>
        </Grid>

        public partial class Map : UserControl
        {
            private List<BasePin> dataSource = new List<BasePin>();

            public Map()
            {
                InitializeComponent();
                _Initialize();
            }

            private void _Initialize()
            {
                // this part works and adds a pin to the map
                dataSource.Add(new BaseSite(-33.881532, 18.440208));
                myMap.MouseClick += Map_MouseClick;
                mapItems.ItemsSource = dataSource;
            }

            public void Map_MouseClick(object sender, MapMouseEventArgs e)
            {
                BasePin pin = new BasePin();
                pin.Location = myMap.ViewportPointToLocation(e.ViewportPoint);
                dataSource.Add(pin);
            }
        }

    Read the article

  • URGENT: Consuming the apachesoap:Map complex datatype in a web service using .NET

    - by aaaaaa
    The web service methods are written in Java, and one method's return type is a Map. How can I get values out of the Map type? When I call getDetails it returns its result properly, but when I call the getCit method it returns nothing in the response, because of the Map return type. In .NET there is no data type like Map.

        Dim objweb As New WebRef.FetchService
        Dim oht As WebRef.Pat
        Dim ohtm As New WebRef.Map
        oht = objweb.getDetails("17")
        ohtm = objweb.getCit(New String() {"21"}, aa, 1)

    Thanks.

    Read the article

  • Hadoop WordCount example stuck at map 100% reduce 0%

    - by Abhinav Sharma
        [hadoop-1.0.2] ? hadoop jar hadoop-examples-1.0.2.jar wordcount /user/abhinav/input /user/abhinav/output
        Warning: $HADOOP_HOME is deprecated.
        ****hdfs://localhost:54310/user/abhinav/input
        12/04/15 15:52:31 INFO input.FileInputFormat: Total input paths to process : 1
        12/04/15 15:52:31 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
        12/04/15 15:52:31 WARN snappy.LoadSnappy: Snappy native library not loaded
        12/04/15 15:52:31 INFO mapred.JobClient: Running job: job_201204151241_0010
        12/04/15 15:52:32 INFO mapred.JobClient: map 0% reduce 0%
        12/04/15 15:52:46 INFO mapred.JobClient: map 100% reduce 0%

    I've set up Hadoop on a single node using this guide (http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/#run-the-mapreduce-job) and I'm trying to run a provided example, but I'm getting stuck at map 100% reduce 0%. What could be causing this?

    Read the article

  • Adding simple marker clusterer to google map

    - by take2
    Hi, I'm having problems with adding marker clusterer functionality to my map. What I want is to use a custom icon for my markers, and every marker has its own info window which I want to be able to edit. I did accomplish that, but now I have problems adding the marker clusterer library functionality. I read something about adding markers to an array, but I'm not sure what exactly that means. Besides, all of the examples with an array that I have found don't have info windows, and searching through the code I didn't find an appropriate way to add them. Here is my code (mostly from Geocodezip.com):

        <script type="text/javascript" src="http://maps.google.com/maps/api/js?sensor=false"></script>
        <script type="text/javascript" src="http://google-maps-utility-library-v3.googlecode.com/svn/trunk/markerclusterer/src/markerclusterer.js"></script>
        <style type="text/css">
            html, body { height: 100%; }
        </style>
        <script type="text/javascript">
        //<![CDATA[
        var map = null;

        function initialize() {
            var myOptions = {
                zoom: 8,
                center: new google.maps.LatLng(43.907787, -79.359741),
                mapTypeControl: true,
                mapTypeControlOptions: { style: google.maps.MapTypeControlStyle.DROPDOWN_MENU },
                navigationControl: true,
                mapTypeId: google.maps.MapTypeId.ROADMAP
            };
            map = new google.maps.Map(document.getElementById("map_canvas"), myOptions);

            var mcOptions = { gridSize: 50, maxZoom: 15 };
            var mc = new MarkerClusterer(map, [], mcOptions);

            google.maps.event.addListener(map, 'click', function() {
                infowindow.close();
            });

            // Add markers to the map
            // Set up three markers with info windows
            var point = new google.maps.LatLng(43.65654, -79.90138);
            var marker1 = createMarker(point, 'Abc');
            var point = new google.maps.LatLng(43.91892, -78.89231);
            var marker2 = createMarker(point, 'Abc');
            var point = new google.maps.LatLng(43.82589, -79.10040);
            var marker3 = createMarker(point, 'Abc');

            var markerArray = new Array(marker1, marker2, marker3);
            mc.addMarkers(markerArray, true);
        }

        var infowindow = new google.maps.InfoWindow({
            size: new google.maps.Size(150, 50)
        });

        function createMarker(latlng, html) {
            var image = '/321.png';
            var contentString = html;
            var marker = new google.maps.Marker({
                position: latlng,
                map: map,
                icon: image,
                zIndex: Math.round(latlng.lat() * -100000) << 5
            });
            google.maps.event.addListener(marker, 'click', function() {
                infowindow.setContent(contentString);
                infowindow.open(map, marker);
            });
        }
        //]]>
        </script>

    Read the article

  • Sorting and Re-arranging List of HashMaps

    - by HonorGod
    I have a List which is a straightforward representation of a database table. I am trying to sort and apply some magic after the data is loaded into a List of HashMaps. In my case this is the only hard and fast way of doing it, because I have a rules engine that actually updates the values in the HashMaps after several computations. Here is a sample data representation of the HashMaps (a List of HashMap):

        {fromDate=Wed Mar 17 10:54:12 EDT 2010, eventId=21, toDate=Tue Mar 23 10:54:12 EDT 2010, actionId=1234}
        {fromDate=Wed Mar 17 10:54:12 EDT 2010, eventId=11, toDate=Wed Mar 17 10:54:12 EDT 2010, actionId=456}
        {fromDate=Sat Mar 20 10:54:12 EDT 2010, eventId=20, toDate=Thu Apr 01 10:54:12 EDT 2010, actionId=1234}
        {fromDate=Wed Mar 24 10:54:12 EDT 2010, eventId=22, toDate=Sat Mar 27 10:54:12 EDT 2010, actionId=1234}
        {fromDate=Wed Mar 17 10:54:12 EDT 2010, eventId=11, toDate=Fri Mar 26 10:54:12 EDT 2010, actionId=1234}
        {fromDate=Sat Mar 20 10:54:12 EDT 2010, eventId=11, toDate=Wed Mar 31 10:54:12 EDT 2010, actionId=1234}
        {fromDate=Mon Mar 15 10:54:12 EDT 2010, eventId=12, toDate=Wed Mar 17 10:54:12 EDT 2010, actionId=567}

    I am trying to achieve a couple of things:

    1) Sort the list by actionId and eventId, after which the data would look like this:

        {fromDate=Wed Mar 17 10:54:12 EDT 2010, eventId=11, toDate=Wed Mar 17 10:54:12 EDT 2010, actionId=456}
        {fromDate=Mon Mar 15 10:54:12 EDT 2010, eventId=12, toDate=Wed Mar 17 10:54:12 EDT 2010, actionId=567}
        {fromDate=Wed Mar 24 10:54:12 EDT 2010, eventId=22, toDate=Sat Mar 27 10:54:12 EDT 2010, actionId=1234}
        {fromDate=Wed Mar 17 10:54:12 EDT 2010, eventId=21, toDate=Tue Mar 23 10:54:12 EDT 2010, actionId=1234}
        {fromDate=Sat Mar 20 10:54:12 EDT 2010, eventId=20, toDate=Thu Apr 01 10:54:12 EDT 2010, actionId=1234}
        {fromDate=Wed Mar 17 10:54:12 EDT 2010, eventId=11, toDate=Fri Mar 26 10:54:12 EDT 2010, actionId=1234}
        {fromDate=Sat Mar 20 10:54:12 EDT 2010, eventId=11, toDate=Wed Mar 31 10:54:12 EDT 2010, actionId=1234}

    2) If we group the above list by actionId, it resolves into 3 groups: actionId=1234, actionId=567 and actionId=456. Now here is my question: within each group, records having the same eventId need to be updated so that they span the widest fromDate-to-toDate range. Meaning, if you consider the last two rows, they have the same actionId = 1234 and the same eventId = 11. We pick the earliest fromDate from those 2 records, which is Wed Mar 17 10:54:12, and the latest toDate, which is Wed Mar 31 10:54:12, and update both records' fromDate and toDate to Wed Mar 17 10:54:12 and Wed Mar 31 10:54:12 respectively. Any ideas? (A hedged sketch of one possible widening pass follows this item.)

    PS: I already have some pseudo code to start with:

        import java.util.ArrayList;
        import java.util.Calendar;
        import java.util.Collections;
        import java.util.Comparator;
        import java.util.Date;
        import java.util.HashMap;
        import java.util.List;

        import org.apache.commons.lang.builder.CompareToBuilder;

        public class Tester {

            boolean ascending = true;
            boolean sortInstrumentIdAsc = true;
            boolean sortEventTypeIdAsc = true;

            public static void main(String args[]) {
                Tester tester = new Tester();
                tester.printValues();
            }

            public void printValues() {
                List<HashMap<String, Object>> list = new ArrayList<HashMap<String, Object>>();

                HashMap<String, Object> map = new HashMap<String, Object>();
                map.put("actionId", new Integer(1234));
                map.put("eventId", new Integer(21));
                map.put("fromDate", getDate(1));
                map.put("toDate", getDate(7));
                list.add(map);

                map = new HashMap<String, Object>();
                map.put("actionId", new Integer(456));
                map.put("eventId", new Integer(11));
                map.put("fromDate", getDate(1));
                map.put("toDate", getDate(1));
                list.add(map);

                map = new HashMap<String, Object>();
                map.put("actionId", new Integer(1234));
                map.put("eventId", new Integer(20));
                map.put("fromDate", getDate(4));
                map.put("toDate", getDate(16));
                list.add(map);

                map = new HashMap<String, Object>();
                map.put("actionId", new Integer(1234));
                map.put("eventId", new Integer(22));
                map.put("fromDate", getDate(8));
                map.put("toDate", getDate(11));
                list.add(map);

                map = new HashMap<String, Object>();
                map.put("actionId", new Integer(1234));
                map.put("eventId", new Integer(11));
                map.put("fromDate", getDate(1));
                map.put("toDate", getDate(10));
                list.add(map);

                map = new HashMap<String, Object>();
                map.put("actionId", new Integer(1234));
                map.put("eventId", new Integer(11));
                map.put("fromDate", getDate(4));
                map.put("toDate", getDate(15));
                list.add(map);

                map = new HashMap<String, Object>();
                map.put("actionId", new Integer(567));
                map.put("eventId", new Integer(12));
                map.put("fromDate", getDate(-1));
                map.put("toDate", getDate(1));
                list.add(map);

                System.out.println("\n Before Sorting \n ");
                for (int j = 0; j < list.size(); j++)
                    System.out.println(list.get(j));

                Collections.sort(list, new HashMapComparator2());

                System.out.println("\n After Sorting \n ");
                for (int j = 0; j < list.size(); j++)
                    System.out.println(list.get(j));
            }

            public static Date getDate(int days) {
                Calendar cal = Calendar.getInstance();
                cal.setTime(new Date());
                cal.add(Calendar.DATE, days);
                return cal.getTime();
            }

            public class HashMapComparator2 implements Comparator {
                public int compare(Object object1, Object object2) {
                    if (ascending == true) {
                        return new CompareToBuilder()
                                .append(((HashMap) object1).get("actionId"), ((HashMap) object2).get("actionId"))
                                .append(((HashMap) object2).get("eventId"), ((HashMap) object1).get("eventId"))
                                .toComparison();
                    } else {
                        return new CompareToBuilder()
                                .append(((HashMap) object2).get("actionId"), ((HashMap) object1).get("actionId"))
                                .append(((HashMap) object2).get("eventId"), ((HashMap) object1).get("eventId"))
                                .toComparison();
                    }
                }
            }
        }

    Read the article
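
    A hedged sketch of the widening pass described in the question above, assuming the same List<HashMap<String, Object>> shape and the key names (actionId, eventId, fromDate, toDate) from the sample data; the class and method names are made up for illustration.

        import java.util.Date;
        import java.util.HashMap;
        import java.util.List;

        public class RangeWidener {
            // For every (actionId, eventId) group, stamp each record with the group's
            // earliest fromDate and latest toDate.
            public static void widen(List<HashMap<String, Object>> rows) {
                HashMap<String, Date[]> widest = new HashMap<String, Date[]>();

                // First pass: remember the widest range seen per group.
                for (HashMap<String, Object> row : rows) {
                    String key = row.get("actionId") + "|" + row.get("eventId");
                    Date from = (Date) row.get("fromDate");
                    Date to = (Date) row.get("toDate");
                    Date[] range = widest.get(key);
                    if (range == null) {
                        widest.put(key, new Date[] { from, to });
                    } else {
                        if (from.before(range[0])) range[0] = from;  // earlier fromDate
                        if (to.after(range[1]))    range[1] = to;    // later toDate
                    }
                }

                // Second pass: write the group's widest range back into every record.
                for (HashMap<String, Object> row : rows) {
                    Date[] range = widest.get(row.get("actionId") + "|" + row.get("eventId"));
                    row.put("fromDate", range[0]);
                    row.put("toDate", range[1]);
                }
            }
        }

    Called after the Collections.sort step, this leaves the ordering intact and only widens the dates, which matches the behaviour asked for.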

  • Custom dynamic error pages in Ruby on Rails not working

    - by PlanetMaster
    Hi, I'm trying to implement custom dynamic error pages following this post: http://www.perfectline.co.uk/blog/custom-dynamic-error-pages-in-ruby-on-rails

    I did exactly what the blog post says. I included config.action_controller.consider_all_requests_local = false in my environment.rb, but it is not working. My browser shows:

        Routing Error
        No route matches "/555" with {:method=>:get}

    So it looks like the rescues are not being fired. I get the following in my log file:

        ActionController::RoutingError (No route matches "/555" with {:method=>:get}):
        Rendering rescues/layout (not_found)

    Is there some routing interfering with the code? I'm not sure what to look for. I'm running Rails 2.3.5. Here is the routes.rb file:

        ActionController::Routing::Routes.draw do |map|
          # routing for the property URL
          map.connect 'buy/:property_type_plural/:province/:city/:address/:house_number',
                      :controller => 'properties', :action => 'show', :id => 'whatever'
          map.myimmonatie 'myimmonatie', :controller => 'myimmonatie/properties', :action => 'index'
          map.login "login", :controller => "user_sessions", :action => "create", :conditions => {:method => :post}
          map.login "login", :controller => "user_sessions", :action => "new"
          map.logout "logout", :controller => "user_sessions", :action => "destroy"
          map.buy "buy", :controller => 'buy'
          map.sell "sell", :controller => 'sell'
          map.home "home", :controller => 'home'
          map.disclaimer "disclaimer", :controller => 'disclaimer'
          map.sign_up "sign_up", :controller => 'users', :action => :new
          map.contact "contact", :controller => 'contact'

          map.resources :user_sessions
          map.resources :contact
          map.resources :password_resets
          map.resources :messages
          map.resources :users, :only => [:index, :new, :create, :activate, :edit, :profile, :password]
          map.resources :images
          map.resources :activation, :only => [:new, :resend]
          map.resources :email
          map.resources :properties, :except => [:index, :destroy]

          map.namespace :admin do |admin|
            admin.resources :users
            admin.resources :properties
            admin.resources :order_items, :as => :orders
            admin.resources :blog_posts, :as => :blog
          end

          map.connect 'myimmonatie/:action', :controller => 'users', :id => 'current',
                      :requirements => {:action => /(profile)|(password)|(email)/}

          map.namespace :myimmonatie do |myimmonatie|
            myimmonatie.resources :messages, :controller => 'messages'
            myimmonatie.resources :password, :as => "password", :controller => 'users', :action => 'password'
            myimmonatie.resources :properties, :controller => 'properties'
            myimmonatie.resources :orders, :only => [:index, :show, :create, :new]
          end

          map.root :controller => "home"
          map.connect ':controller/:action'
          map.connect ':controller/:action/:id'
          map.connect ':controller/:action/:id.:format'
        end

        ActionController::Routing::Translator.translate_from_file('config','i18n-routes.yml')

    Read the article

  • Java reduce CPU usage

    - by steve
    Greets. We've got a few nutters at work who enjoy using while(true) { // code } in their code. As you can imagine, this maxes out the CPU. Does anyone know of ways to reduce the CPU utilisation so that other people can use the server as well? The code itself just constantly polls the internet for updates on sites, so I'd imagine a little sleep would greatly reduce the CPU usage. Also, all manipulation is being done on String objects (Java); does anyone know how much StringBuilder would reduce the overhead by? Thanks for any pointers. (A sketch of both suggestions follows this item.)

    Read the article
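
    A minimal, hedged sketch of the two suggestions in the question above: backing off between polls with Thread.sleep and building strings with StringBuilder. The pollSites() method and the 30-second interval are hypothetical placeholders, not the actual code being described.

        public class PoliteSitePoller {
            public static void main(String[] args) throws InterruptedException {
                while (true) {
                    pollSites();              // hypothetical: check the sites for updates
                    Thread.sleep(30000L);     // yield the CPU between polls instead of spinning
                }
            }

            static void pollSites() {
                // Appending to one StringBuilder avoids creating a new String per concatenation.
                StringBuilder report = new StringBuilder();
                for (int i = 0; i < 1000; i++) {
                    report.append("site ").append(i).append(" checked\n");
                }
                System.out.println(report.length() + " characters of status collected");
            }
        }

    The sleep interval is where most of the CPU saving is likely to come from; the StringBuilder gain depends on how many concatenations the real code performs.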

  • How to close the BlackBerry map programmatically

    - by nhd
    Hi all, please help me close the BlackBerry map programmatically from my application. I first invoke the BlackBerry map and pass a map argument, and it displays smoothly, but the second time, when I change my map argument, the BlackBerry map doesn't respond to the new argument; it keeps the old one. I think I must close it before passing the new argument, but I don't know how to do that. Thank you.

    Read the article

  • Hadoop Map Reduce job never finishes

    - by rohanbk
    I am running a Hadoop map-reduce job using a Python mapper and reducer script and Hadoop Streaming. Both my map and reduce phases run until they reach 100%, but the job doesn't end. I know that when things go sour Hadoop will terminate the job, but in this case both stages reach 100% and just never finish. Has anyone else encountered anything similar? Also, how do I debug my program to figure out where things are going wrong? If I use a smaller input file and just run something like:

        $> cat input_file | mapper.py | sort | reduce.py >> output_file

    everything works perfectly fine. However, when I use Hadoop, things don't work out.

    Read the article

  • Google Maps: Traffic on top of Custom Map

    - by Kyle
    I want to add traffic information to my custom map. Currently I'm using a tile layer overlay on my Google Map to display custom map tiles. When I try to add a GTrafficOverlay to my map, my custom map tiles display above the traffic information. Is there any way to display the traffic above my GTileLayerOverlay (using the JavaScript API)?

    Read the article

  • How to filter a persistent map in Clojure?

    - by Checkers
    I have a persistent map which I want to filter. Something like this:

        (filter #(-> % val (= 1)) {:a 1 :b 1 :c 2})

    The above comes out as ([:a 1] [:b 1]) (a lazy sequence of map entries). However, I want to get {:a 1 :b 1}. How can I filter a map so it remains a map, without having to rebuild it from a sequence of map entries?

    Read the article
