Search Results

Search found 743 results on 30 pages for 'fetching'.

Page 7/30

  • Smaller Chunks of Data

    - by Googler
    I am using a web service to fetch a large amount of data. While sending the request, I receive the error: **Please request data in smaller chunks.** Is this a limitation of the web service I am fetching the data from, or a general rule for fetching data over the internet? What causes this error, and how can I request the data in smaller chunks to avoid it?

    Read the article

  • Dota map record tracking or saving in DB

    - by ajay009ajay
    Hello all, I want to fetch the game records of the Dota map: which players won or lost, and their kills and deaths. I am fetching values using a PGPVN server, but it does not return data for custom-map Warcraft III games. I have done a lot of googling. What is the right way to fetch these Dota map records? Thanks in advance.

    Read the article

  • Using PHP to populate a <select></select> ?

    - by Flins
    <select name="select"> </select> I want to populate the above tag with values from the database. I have written PHP code up to this point: while($row=mysql_fetch_array($result)) { } $row is fetching the correct values. How do I add them to the <select>? Please help; I am new to programming.
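
    For reference, a minimal sketch of the usual approach, assuming the result rows carry 'id' and 'name' columns (both column names are placeholders for whatever the real table uses):

      <select name="select">
      <?php
      // Emit one <option> per row of the result set.
      while ($row = mysql_fetch_array($result)) {
          echo '<option value="' . htmlspecialchars($row['id']) . '">'
             . htmlspecialchars($row['name'])
             . '</option>';
      }
      ?>
      </select>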

    Read the article

  • Sorting MySQL array values after swapping string characters in fetched DB data

    - by Shail Patel
    I want to sort one column fetched from a MySQL DB and stored in an array. After fetching, I do the following: 1. Fetch the fields array in row format: Field1, Field2, Field3, Field4, Field5. 2. From that array, take one column's data [Field3] and swap the string keywords, e.g. AB013, DB131, RS001 become 013AB, 131DB, 001RS. Now I want to sort the values in the new string format, i.e. 001RS, 013AB, 131DB.
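
    For reference, a rough PHP sketch of one way to do the swap and then the sort, assuming the Field3 values are uppercase letters followed by digits:

      $values  = array('AB013', 'DB131', 'RS001');   // sample Field3 data
      $swapped = array();
      foreach ($values as $v) {
          // Split into leading letters and trailing digits, then swap the two parts.
          if (preg_match('/^([A-Z]+)(\d+)$/', $v, $m)) {
              $swapped[] = $m[2] . $m[1];
          }
      }
      sort($swapped, SORT_STRING);   // gives 001RS, 013AB, 131DB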

    Read the article

  • What does CPU Time consist of? [closed]

    - by Sid
    What does CPU time exactly consist of? For instance, is the time taken to access a page from the RAM (at which point, the CPU is most likely idling) part of the CPU time? I'm not talking about fetching the page from the disk here, just fetching it from the RAM. Thanks

    Read the article

  • Resolving a Redo GAP in a Data Guard Environment

    - by JaneZhang
    In Oracle Data Guard, a redo gap arises when the standby misses a range of redo (for example, because the network between the primary and the standby is down) while the primary keeps generating redo; the missing archived logs have to be fetched before media recovery can continue. The processes involved are:
    ARC: the archiver process
    MRP: Media Recovery Process, which applies redo on the standby
    RFS: Remote File Server, which receives redo shipped from the primary
    FAL: Fetch Archive Log

    Purpose of the test: when a gap occurs, determine which process detects and resolves it.
    Environment: Oracle 11.2.0.2 on Linux 5.

    Steps:
    1. Check the current maximum log sequence on the primary and the standby:
    Primary:
    MAX(SEQUENCE#)
    --------------
                86
    Standby:
    MAX(SEQUENCE#)
    --------------
                86

    2. Break the network connection to create a gap:
    #ifconfig eth0 down
    Then switch logfiles several times on the primary:
    SQL>alter system switch logfile;
    SQL>alter system switch logfile;
    ...
    Primary:
    MAX(SEQUENCE#)
    --------------
                96
    The primary alert log reports that the standby is unreachable:
    TNS-00513: Destination host unreachable
       nt secondary err code: 101
       nt OS err code: 0
    Error 12543 received logging on to the standby
    FAL[server, ARCp]: Error 12543 creating remote archivelog file 'STANDBY'
    FAL[server, ARCp]: FAL archive failed, see trace file.
    ARCH: FAL archive failed. Archiver continuing
    ORACLE Instance orcl - Archival Error. Archiver continuing.

    3. Move some of the archived logs out of the archive directory so that they cannot be shipped automatically:
    mv *.arc ../

    4. Re-enable the network interface:
    #ifconfig eth0 up

    5. The ARC processes now ship the outstanding logs to the standby, while the MRP detects the gap and performs the gap fetching. The standby alert log shows:
    Thu Mar 29 19:58:49 2012
    Media Recovery Waiting for thread 1 sequence 87 (in transit) <==== media recovery is waiting, starting at sequence 87
    ...
    Thu Mar 29 20:08:45 2012
    ...
    Media Recovery Waiting for thread 1 sequence 94
    Thu Mar 29 20:11:01 2012
    RFS[61]: Assigned to RFS process 13643
    RFS[61]: Opened log for thread 1 sequence 97 dbid 1285401128 branch 757620395
    Archived Log entry 80 added for thread 1 sequence 97 rlc 757620395 ID 0x4c9d8928 dest 2:
    Thu Mar 29 20:11:02 2012
    RFS[62]: Assigned to RFS process 13645
    RFS[62]: Selected log 4 for thread 1 sequence 98 dbid 1285401128 branch 757620395
    Thu Mar 29 20:11:02 2012
    Primary database is in MAXIMUM PERFORMANCE mode
    Re-archiving standby log 4 thread 1 sequence 98
    Thu Mar 29 20:11:02 2012
    Archived Log entry 81 added for thread 1 sequence 98 ID 0x4c9d8928 dest 1:
    RFS[63]: Assigned to RFS process 13647
    RFS[63]: Selected log 4 for thread 1 sequence 99 dbid 1285401128 branch 757620395
    Thu Mar 29 20:11:05 2012
    Fetching gap sequence in thread 1, gap sequence 94-96 <=========== fetching the gap
    ...

    6. The MRP trace confirms that it is the MRP that fetches the gap:
    *** 2012-03-29 20:08:45.375 4265 krsh.c
    Media Recovery Waiting for thread 1 sequence 94
    *** 2012-03-29 20:11:05.543
    *** 2012-03-29 20:11:05.543 4265 krsh.c
    Fetching gap sequence in thread 1, gap sequence 94-96 <========== the MRP fetches the gap
    Redo shipping client performing standby login
    *** 2012-03-29 20:11:05.593 4595 krsu.c
    Logged on to standby successfully
    Client logon and security negotiation successful!

    7. The gap logs are then shipped over, the RFS processes on the standby receive them, and the MRP applies them in order:
    Thu Mar 29 20:12:06 2012
    RFS[64]: Assigned to RFS process 13649
    RFS[64]: Opened log for thread 1 sequence 94 dbid 1285401128 branch 757620395
    Archived Log entry 82 added for thread 1 sequence 94 rlc 757620395 ID 0x4c9d8928 dest 2:
    Thu Mar 29 20:12:06 2012
    RFS[65]: Assigned to RFS process 13651
    RFS[65]: Opened log for thread 1 sequence 95 dbid 1285401128 branch 757620395
    Thu Mar 29 20:12:06 2012
    RFS[66]: Assigned to RFS process 13653
    RFS[66]: Opened log for thread 1 sequence 96 dbid 1285401128 branch 757620395
    Archived Log entry 83 added for thread 1 sequence 95 rlc 757620395 ID 0x4c9d8928 dest 2:
    Archived Log entry 84 added for thread 1 sequence 96 rlc 757620395 ID 0x4c9d8928 dest 2:
    Thu Mar 29 20:12:16 2012
    Media Recovery Log /home/oracle/arch1/standby/1_94_757620395.arc
    Media Recovery Log /home/oracle/arch1/standby/1_95_757620395.arc
    Media Recovery Log /home/oracle/arch1/standby/1_96_757620395.arc
    Media Recovery Log /home/oracle/arch1/standby/1_97_757620395.arc
    Media Recovery Log /home/oracle/arch1/standby/1_98_757620395.arc

    Conclusion: when a gap exists, the ARC processes on the primary ship only the current logs; it is the MRP on the standby that detects the gap while applying logs and fetches the missing sequences through FAL. Note that in 11g the ARC, RFS and MRP processes cooperate to resolve the gap automatically.

    8. Conversely, if the MRP cannot fetch the gap through FAL (for example, because it can no longer log on to the primary), errors such as the following appear in the MRP trace; the "FAL[client, MRP0]" prefix also shows that the FAL client is the MRP:
    *** 2012-03-29 21:18:15.964 4265 krsh.c
    Error 1031 received logging on to the standby
    *** 2012-03-29 21:18:15.964 4265 krsh.c
    FAL[client, MRP0]: Error 1031 connecting to PRIMARY for fetching gap sequence

    Read the article

  • JDO difficulties in retrieving persistent vector

    - by Michael Omer
    I know there are already some posts regarding this subject, but although I tried using them as a reference, I am still stuck. I have a persistent class as follows: @PersistenceCapable(identityType = IdentityType.APPLICATION) public class GameObject implements IMySerializable{ @PrimaryKey @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY) protected Key m_databaseKey; @NotPersistent private final static int END_GAME_VAR = -1000; @Persistent(defaultFetchGroup = "true") protected GameObjectSet m_set; @Persistent protected int m_databaseType = IDatabaseAccess.TYPE_NONE; where GameObjectSet is: @PersistenceCapable(identityType = IdentityType.APPLICATION) @FetchGroup(name = "mySet", members = {@Persistent(name = "m_set")}) public class GameObjectSet { @PrimaryKey @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY) private Key id; @Persistent private Vector<GameObjectSetPair> m_set; and GameObjectSetPair is: @PersistenceCapable(identityType = IdentityType.APPLICATION) public class GameObjectSetPair { @PrimaryKey @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY) private Key id; @Persistent private String key; @Persistent(defaultFetchGroup = "true") private GameObjectVar value; When I try to fetch the entire structure by fetching the GameObject, the set doesn't have any elements (they are all null) I tried adding the fetching group to the PM, but to no avail. This is my fetching code Vector<GameObject> ret = new Vector<GameObject>(); PersistenceManager pm = PMF.get().getPersistenceManager(); pm.getFetchPlan().setMaxFetchDepth(-1); pm.getFetchPlan().addGroup("mySet"); Query myQuery = pm.newQuery(GameObject.class); myQuery.setFilter("m_databaseType == objectType"); myQuery.declareParameters("int objectType"); try { List<GameObject> res = (List<GameObject>)myQuery.execute(objectType); ret = new Vector<GameObject>(res); for (int i = 0; i < ret.size(); i++) { ret.elementAt(i).getSet(); ret.elementAt(i).getSet().touchSet(); } } catch (Exception e) { } finally { pm.close(); } Does anyone have any idea? Thanks Mike

    Read the article

  • IMAP protocol support in different email servers

    - by raticulin
    Having to interact with several different email servers via IMAP (using javamail), I have found that the level of support for IMAP features varies widely among them. The lack of support for some features has resulted in more development time, more complicated code to deal with the differences, and worse performance due to not being able to SEARCH, etc. So I would like to get some info on other servers and what level of support they provide. So far I have dealt with Lotus Domino and Novell GroupWise (and to a lesser extent Exchange 2003 and 2007). I am particularly interested in the most used ones on unix/linux (Courier, Cyrus, Dovecot, UW IMAP) and also Zimbra, but feel free to add any you know. Info about online services like gmail is also welcome. The features I consider (comment if you are interested in others and I'll add them): custom flags; searching custom flags; searching arbitrary headers; partial fetching; proxy authentication. And what I have found so far (correct me if I am wrong anywhere): Lotus Domino - custom flags: yes; searching custom flags: yes; searching arbitrary headers: yes; partial fetching: ?; proxy authentication: sort of, you can give a user permission to access other users' mailboxes and he will see them under his '\Other Users' folder. Novell GroupWise - custom flags: no; searching custom flags: no; searching arbitrary headers: no; partial fetching: ?; proxy authentication: yes, you can use what is called a Trusted Application.
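
    As a reference point, a rough javamail sketch of probing what a given server advertises before relying on a feature; the host and credentials are placeholders, and partial fetching is controlled through session properties:

      import java.util.Properties;
      import javax.mail.Session;
      import com.sun.mail.imap.IMAPStore;

      public class ImapCapabilityCheck {
          public static void main(String[] args) throws Exception {
              Properties props = new Properties();
              // Ask JavaMail to fetch message bodies in chunks rather than in one request.
              props.setProperty("mail.imap.partialfetch", "true");
              props.setProperty("mail.imap.fetchsize", "16384");

              Session session = Session.getInstance(props);
              IMAPStore store = (IMAPStore) session.getStore("imap");
              store.connect("imap.example.com", "user", "password"); // placeholders

              // Check what the server advertises in its CAPABILITY response.
              System.out.println("SORT:  " + store.hasCapability("SORT"));
              System.out.println("IDLE:  " + store.hasCapability("IDLE"));
              System.out.println("QUOTA: " + store.hasCapability("QUOTA"));

              store.close();
          }
      }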

    Read the article

  • Android listing design problem with cursors

    - by Priyank
    Hi. I have the following situation in my Android app. The app fetches messages from the inbox, sent items and drafts based on search keywords. I used to accomplish this by fetching cursors for each folder manually based on the user's selection, populating them into a custom data-holder object, filtering those results by the given keywords, and then rendering the view manually with the respective data. Someone suggested that I should use a custom CursorAdapter to bind the view to my cursor data, so I tried that. What I am doing now is: fetch individual cursors for the inbox, sent items and drafts; merge them into one using a MergeCursor; and then pass that to my CursorAdapter implementation. But now where or how do I filter my cursor data based on keywords, given that binding renders the rows directly to the list view? Also, some post-fetch operations, like fetching the sender's contact picture, are things I do not want to move into the adapter. If I do all this processing in the adapter, it will be heavy and ugly. How could I design this better so that it performs well and the responsibilities are properly shared? Any ideas will be helpful.
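
    One hedged option is to push the keyword filtering into the query itself via the adapter's FilterQueryProvider hook, so the adapter only ever binds already-filtered rows; the snippet assumes it runs inside an Activity, that adapter is the custom CursorAdapter, and the content URI, column and selection are placeholders:

      // Needs android.database.Cursor, android.net.Uri, android.widget.FilterQueryProvider.
      // Re-query off the UI thread whenever adapter.getFilter().filter(keyword) is called.
      adapter.setFilterQueryProvider(new FilterQueryProvider() {
          @Override
          public Cursor runQuery(CharSequence keyword) {
              // Placeholder query: let SQL do the keyword filtering instead of Java code.
              return getContentResolver().query(
                      Uri.parse("content://sms/"),              // assumed provider URI
                      null,
                      "body LIKE ?",
                      new String[] { "%" + keyword + "%" },
                      "date DESC");
          }
      });

      // Later, when the user enters or changes the search keywords:
      adapter.getFilter().filter(userKeyword);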

    Read the article

  • PHP classes totally forgotten something today - sorry

    - by russp
    Hi guys, really sorry about being "totally thick today" but I have forgotten how to do something simple - too much time not in php recently. Want to use the OS phpapi How do I print out the individual rows - see told you I was being thick today // The fields we will be fetching. if (isset($_GET['test']) && $_GET['test'] == 'plaxo') { // plaxo is a PortableContacts end-point so doesn't know about the OpenSocial specific fields $profile_fields = array(); } else { $profile_fields = array( 'aboutMe', 'displayName', 'bodyType', 'currentLocation', 'drinker', 'happiestWhen', 'lookingFor' ); } // The number of friends to fetch. $friend_count = 2; $batch = $osapi->newBatch(); // Fetch the current user. $self_request_params = array( 'userId' => $userId, // Person we are fetching. 'groupId' => '@self', // @self for one person. 'fields' => $profile_fields // Which profile fields to request. ); $batch->add($osapi->people->get($self_request_params), 'self'); // Fetch the friends of the user $friends_request_params = array( 'userId' => $userId, // Person whose friends we are fetching. 'groupId' => '@friends', // @friends for the Friends group. 'fields' => $profile_fields, // Which profile fields to request. 'count' => $friend_count // Max friends to fetch. ); $batch->add($osapi->people->get($friends_request_params), 'friends'); // Get supportedFields Request $batch->add($osapi->people->getSupportedFields(), 'supportedFields'); // Send the batch request. $result = $batch->execute(); Say I wanted to print out "aboutMe", whats the echo? cos echo $result['aboutMe'] doesn't work.

    Read the article

  • TakeWhile and SkipWhile method in LINQ

    - by vik20000in
    In my last post I talked about how to use the Take and Skip keywords to limit the number of records that we are fetching. But there is one problem with the Take and Skip statements: the number of records to be fetched has to be passed to them. Many times the number of records to be fetched depends on the query itself, for example when we want to keep fetching records until a certain condition is met on the record set. Let's say we want to fetch records from the array of numbers until we reach 7. For this kind of query LINQ exposes the TakeWhile method.     int[] numbers = { 5, 4, 1, 3, 9, 8, 6, 7, 2, 0 };     var firstNumbersLessThan6 = numbers.TakeWhile(n => n < 7);   In the same way we can also use the SkipWhile statement. SkipWhile skips records from the start of the sequence as long as the given condition holds, and returns the rest. In the example below we skip the leading numbers that are not divisible by 3. Remember we could have done this with a Where clause as well, but the SkipWhile method is useful in many other situations, hence the example and the keyword.     int[] numbers = { 5, 4, 1, 3, 9, 8, 6, 7, 2, 0 };     var allButFirst3Numbers = numbers.SkipWhile(n => n % 3 != 0); Vikram

    Read the article

  • ArchBeat Link-o-Rama for November 21, 2012

    - by Bob Rhubart
    Fault Handling and Prevention - Part 1 | Guido Schmutz and Ronald van Luttikhuizen In this technical article, part one of a four-part series, Oracle ACE Directors Guido Schmutz and Ronald van Luttikhuizen guide you through an introduction to fault handling in a service-oriented environment using Oracle SOA Suite and Oracle Service Bus. One Stop Shop for Oracle Webcasts Webcasts can be a great way to get information about Oracle products without having to go cross-eyed reading yet another document off your computer screen. Oracle's new Webcast Center offers selectable filtering to make it easy to get to the information you want. Yes, you have to register to gain access, but that process is quick, and with over 200 webcasts to choose from you know you'll find useful content. Oracle on Oracle: Is that all? (Identity Management) | Darin Pendergraft Darin Pendergraft shares a discussion with Jaime Cardoso about the latter's experience with Oracle's IDM products. What's particularly interesting is that the discussion grew out of Jaime's highly critical comment that Darin missed important points about those products in an earlier interview with Chirag Andani. If that ain't social engagement, I don't know what is. I.T. Chargeback : Core to Cloud Computing | Zero to Cloud "While chargeback has existed as a concept for many years (especially in mainframe environments), it is the move to this self-service model that has created a need for a new breed of chargeback applications for cloud," says Mark McGill. "Enabling self-service without some form of chargeback is like opening a shop where all of the goods are free." New Self-paced Online Oracle BPM 11g Developer Training | Dan Atwood Oracle ACE Dan Atwood of Avio Consulting shares a lot of information about a new Oracle BPM 11g Developer Workshop. JPA SQL and Fetching tuning ( EclipseLink ) | Edwin Biemond Oracle ACE Edwin Biemond's post illustrates how to "use the department and employee entity of the HR Oracle demo schema to explain the JPA options you have to control the SQL statements and the JPA relation Fetching." Thought for the Day "Team development is like a birthday cake. Everybody gets a piece." — Assaad Chalhoub Source: SoftwareQuotes.com

    Read the article

  • CRL checking problem windows 2003

    - by Tim Mahy
    Hi all, we have a CRL that is valid for 24 hours and has its next update in 12 hours. The CRL is valid from 12:12 AM to 12:12 AM and from 12:12 PM to 12:12 PM. In the logs of the webserver hosting the CRL we see that one of our servers does not always fetch the CRL at night; in most cases the server that missed the CRL gets IIS 403.16 errors at 12:13 PM. Is the following theory of ours correct: when a Windows server misses fetching the CRL at its nextUpdate time but the current CRL is still valid, the fetch is not retried? This would lead to a situation where, once the CRL expires, there is no overlap, giving a short window of 403.16 errors in IIS since the CRL is not trusted and so all certificates are marked as unsafe. Greetings, Tim

    Read the article

  • Using Google App Engine to Perform World Updates vs an Authoritative Server

    - by Error 454
    I am considering different game server architectures that use GAE. The types of games I am considering are turn-based, where the world status would need to be updated about once per minute. I am looking for an answer that persuades me to either perform the world update on the Google servers OR on an authoritative server that syncs with the datastore. The main goal here would be to minimize GAE daily quotas. For some rough numbers, I am assuming 10,000 entities requiring updates. Each entity update would require: reading 5 private entity variables (fetched from the datastore); fetching as many as 20 static variables (from the datastore or persisted in server memory); and writing 5 entity variables. Clients of the game would authenticate and set state directly against GAE as well as pull the latest world state from GAE. Running the update on GAE would consist of a cron job launched every minute. This would update all of the entities and save the results to the datastore. This would be more CPU intensive for GAE. Running the update on an authoritative server would consist of fetching entity data from the GAE datastore, calculating the new entity states and pushing the new state variables back to the datastore. This would be more bandwidth intensive for the datastore.
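
    For the GAE option, the once-a-minute trigger itself is just a cron.yaml entry along these lines (the URL is a hypothetical handler path):

      cron:
      - description: recompute world state for all entities
        url: /tasks/update_world
        schedule: every 1 minutes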

    Read the article

  • Faster, Simpler access to Azure Tables with Enzo Azure API

    - by Herve Roggero
    After developing the latest version of Enzo Cloud Backup I took the time to create an API that would simplify access to Azure Tables (the Enzo Azure API). At first, my goal was to make the code simpler compared to the Microsoft Azure SDK. But as it turns out it is also a little faster; and when using the specialized methods (the fetch strategies) it is much faster out of the box than the Microsoft SDK, unless you start creating complex parallel and resilient routines yourself. Last but not least, I decided to add a few extension methods that I think you will find attractive, such as the ability to transform a list of entities into a DataTable. So let’s review each area in more details. Simpler Code My first objective was to make the API much easier to use than the Azure SDK. I wanted to reduce the amount of code necessary to fetch entities, remove the code needed to add automatic retries and handle transient conditions, and give additional control, such as a way to cancel operations, obtain basic statistics on the calls, and control the maximum number of REST calls the API generates in an attempt to avoid throttling conditions in the first place (something you cannot do with the Azure SDK at this time). Strongly Typed Before diving into the code, the following examples rely on a strongly typed class called MyData. The way MyData is defined for the Azure SDK is similar to the Enzo Azure API, with the exception that they inherit from different classes. With the Azure SDK, classes that represent entities must inherit from TableServiceEntity, while classes with the Enzo Azure API must inherit from BaseAzureTable or implement a specific interface. // With the SDK public class MyData1 : TableServiceEntity {     public string Message { get; set; }     public string Level { get; set; }     public string Severity { get; set; } } //  With the Enzo Azure API public class MyData2 : BaseAzureTable {     public string Message { get; set; }     public string Level { get; set; }     public string Severity { get; set; } } Simpler Code Now that the classes representing an Azure Table entity are defined, let’s review the methods that the Azure SDK would look like when fetching all the entities from an Azure Table (note the use of a few variables: the _tableName variable stores the name of the Azure Table, and the ConnectionString property returns the connection string for the Storage Account containing the table): // With the Azure SDK public List<MyData1> FetchAllEntities() {      CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConnectionString);      CloudTableClient tableClient = storageAccount.CreateCloudTableClient();      TableServiceContext serviceContext = tableClient.GetDataServiceContext();      CloudTableQuery<MyData1> partitionQuery =         (from e in serviceContext.CreateQuery<MyData1>(_tableName)         select new MyData1()         {            PartitionKey = e.PartitionKey,            RowKey = e.RowKey,            Timestamp = e.Timestamp,            Message = e.Message,            Level = e.Level,            Severity = e.Severity            }).AsTableServiceQuery<MyData1>();        return partitionQuery.ToList();  } This code gives you automatic retries because the AsTableServiceQuery does that for you. Also, note that this method is strongly-typed because it is using LINQ. Although this doesn’t look like too much code at first glance, you are actually mapping the strongly-typed object manually. So for larger entities, with dozens of properties, your code will grow. 
And from a maintenance standpoint, when a new property is added, you may need to change the mapping code. You will also note that the mapping being performed is optional; it is desired when you want to retrieve specific properties of the entities (not all) to reduce the network traffic. If you do not specify the properties you want, all the properties will be returned; in this example we are returning the Message, Level and Severity properties (in addition to the required PartitionKey, RowKey and Timestamp). The Enzo Azure API does the mapping automatically and also handles automatic reties when fetching entities. The equivalent code to fetch all the entities (with the same three properties) from the same Azure Table looks like this: // With the Enzo Azure API public List<MyData2> FetchAllEntities() {        AzureTable at = new AzureTable(_accountName, _accountKey, _ssl, _tableName);        List<MyData2> res = at.Fetch<MyData2>("", "Message,Level,Severity");        return res; } As you can see, the Enzo Azure API returns the entities already strongly typed, so there is no need to map the output. Also, the Enzo Azure API makes it easy to specify the list of properties to return, and to specify a filter as well (no filter was provided in this example; the filter is passed as the first parameter).  Fetch Strategies Both approaches discussed above fetch the data sequentially. In addition to the linear/sequential fetch methods, the Enzo Azure API provides specific fetch strategies. Fetch strategies are designed to prepare a set of REST calls, executed in parallel, in a way that performs faster that if you were to fetch the data sequentially. For example, if the PartitionKey is a GUID string, you could prepare multiple calls, providing appropriate filters ([‘a’, ‘b’[, [‘b’, ‘c’[, [‘c’, ‘d[, …), and send those calls in parallel. As you can imagine, the code necessary to create these requests would be fairly large. With the Enzo Azure API, two strategies are provided out of the box: the GUID and List strategies. If you are interested in how these strategies work, see the Enzo Azure API Online Help. Here is an example code that performs parallel requests using the GUID strategy (which executes more than 2 t o3 times faster than the sequential methods discussed previously): public List<MyData2> FetchAllEntitiesGUID() {     AzureTable at = new AzureTable(_accountName, _accountKey, _ssl, _tableName);     List<MyData2> res = at.FetchWithGuid<MyData2>("", "Message,Level,Severity");     return res; } Faster Results With Sequential Fetch Methods Developing a faster API wasn’t a primary objective; but it appears that the performance tests performed with the Enzo Azure API deliver the data a little faster out of the box (5%-10% on average, and sometimes to up 50% faster) with the sequential fetch methods. Although the amount of data is the same regardless of the approach (and the REST calls are almost exactly identical), the object mapping approach is different. So it is likely that the slight performance increase is due to a lighter API. Using LINQ offers many advantages and tremendous flexibility; nevertheless when fetching data it seems that the Enzo Azure API delivers faster.  For example, the same code previously discussed delivered the following results when fetching 3,000 entities (about 1KB each). The average elapsed time shows that the Azure SDK returned the 3000 entities in about 5.9 seconds on average, while the Enzo Azure API took 4.2 seconds on average (39% improvement). 
With Fetch Strategies When using the fetch strategies we are no longer comparing apples to apples; the Azure SDK is not designed to implement fetch strategies out of the box, so you would need to code the strategies yourself. Nevertheless I wanted to provide out of the box capabilities, and as a result you see a test that returned about 10,000 entities (1KB each entity), and an average execution time over 5 runs. The Azure SDK implemented a sequential fetch while the Enzo Azure API implemented the List fetch strategy. The fetch strategy was 2.3 times faster. Note that the following test hit a limit on my network bandwidth quickly (3.56Mbps), so the results of the fetch strategy is significantly below what it could be with a higher bandwidth. Additional Methods The API wouldn’t be complete without support for a few important methods other than the fetch methods discussed previously. The Enzo Azure API offers these additional capabilities: - Support for batch updates, deletes and inserts - Conversion of entities to DataRow, and List<> to a DataTable - Extension methods for Delete, Merge, Update, Insert - Support for asynchronous calls and cancellation - Support for fetch statistics (total bytes, total REST calls, retries…) For more information, visit http://www.bluesyntax.net or go directly to the Enzo Azure API page (http://www.bluesyntax.net/EnzoAzureAPI.aspx). About Herve Roggero Herve Roggero, Windows Azure MVP, is the founder of Blue Syntax Consulting, a company specialized in cloud computing products and services. Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" from Apress and runs the Azure Florida Association (on LinkedIn: http://www.linkedin.com/groups?gid=4177626). For more information on Blue Syntax Consulting, visit www.bluesyntax.net.

    Read the article

  • Eclipse will not install Android Support Package [migrated]

    - by Michael Mossman
    I am an experienced programmer but new to Android, hence using Eclipse for the first time. Unfortunately I cannot get my first project off the ground. I get as far as trying to create the project, when the following sequence happens. A screen appears titled 'Install Dependencies', telling me that the required version is 8 and the installed version is 'Not installed'. I click on the Install/Upgrade button and something must happen, because the Android SDK Manager pops up and I get a whole series of 'Fetching:' messages. The next screen asks me to choose a package to install, and the only option is Android Support Library, revision 10. This doesn't bother me, as I presume 10 is better than 8, so I hit the Install button. This is where it all goes wrong, as I get a red message saying: File not found: C:\Program Files (X86)\Android\android-sdk\temp\support_r10.zip (Access is denied). Sure enough, when I check this folder, it is empty. Now, something must have happened, as the SDK Manager did a whole lot of fetching and must have put these files somewhere. I am quite happy to find them and put them in the correct folder, but the question is: what are the names of the files that I am looking for?

    Read the article

  • hosting simple python scripts in a container to handle concurrency, configuration, caching, etc.

    - by Justin Grant
    My first real-world Python project is to write a simple framework (or re-use/adapt an existing one) which can wrap small python scripts (which are used to gather custom data for a monitoring tool) with a "container" to handle boilerplate tasks like: fetching a script's configuration from a file (and keeping that info up to date if the file changes and handle decryption of sensitive config data) running multiple instances of the same script in different threads instead of spinning up a new process for each one expose an API for caching expensive data and storing persistent state from one script invocation to the next Today, script authors must handle the issues above, which usually means that most script authors don't handle them correctly, causing bugs and performance problems. In addition to avoiding bugs, we want a solution which lowers the bar to create and maintain scripts, especially given that many script authors may not be trained programmers. Below are examples of the API I've been thinking of, and which I'm looking to get your feedback about. A scripter would need to build a single method which takes (as input) the configuration that the script needs to do its job, and either returns a python object or calls a method to stream back data in chunks. Optionally, a scripter could supply methods to handle startup and/or shutdown tasks. HTTP-fetching script example (in pseudocode, omitting the actual data-fetching details to focus on the container's API): def run (config, context, cache) : results = http_library_call (config.url, config.http_method, config.username, config.password, ...) return { html : results.html, status_code : results.status, headers : results.response_headers } def init(config, context, cache) : config.max_threads = 20 # up to 20 URLs at one time (per process) config.max_processes = 3 # launch up to 3 concurrent processes config.keepalive = 1200 # keep process alive for 10 mins without another call config.process_recycle.requests = 1000 # restart the process every 1000 requests (to avoid leaks) config.kill_timeout = 600 # kill the process if any call lasts longer than 10 minutes Database-data fetching script example might look like this (in pseudocode): def run (config, context, cache) : expensive = context.cache["something_expensive"] for record in db_library_call (expensive, context.checkpoint, config.connection_string) : context.log (record, "logDate") # log all properties, optionally specify name of timestamp property last_date = record["logDate"] context.checkpoint = last_date # persistent checkpoint, used next time through def init(config, context, cache) : cache["something_expensive"] = get_expensive_thing() def shutdown(config, context, cache) : expensive = cache["something_expensive"] expensive.release_me() Is this API appropriately "pythonic", or are there things I should do to make this more natural to the Python scripter? (I'm more familiar with building C++/C#/Java APIs so I suspect I'm missing useful Python idioms.) Specific questions: is it natural to pass a "config" object into a method and ask the callee to set various configuration options? Or is there another preferred way to do this? when a callee needs to stream data back to its caller, is a method like context.log() (see above) appropriate, or should I be using yield instead? (yeild seems natural, but I worry it'd be over the head of most scripters) My approach requires scripts to define functions with predefined names (e.g. "run", "init", "shutdown"). Is this a good way to do it? 
If not, what other mechanism would be more natural? I'm passing the same config, context, cache parameters into every method. Would it be better to use a single "context" parameter instead? Would it be better to use global variables instead? Finally, are there existing libraries you'd recommend to make this kind of simple "script-running container" easier to write?
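
    On the context.log() versus yield question, a minimal runnable sketch of what the generator-based contract could look like; all names here mirror the pseudocode above and are assumptions, not an existing API:

      def run(config, context, cache):
          # Stream records back to the container instead of pushing them through context.log().
          for record in [{"logDate": "2010-05-01"}, {"logDate": "2010-05-02"}]:  # stand-in for a DB call
              yield record

      # The container then treats the script's output as a plain iterable:
      class DemoContext:
          def log(self, record, ts_field):
              print(ts_field, record[ts_field])

      context = DemoContext()
      for record in run(config=None, context=context, cache={}):
          context.log(record, "logDate")   # the container decides what to do with each record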

    Read the article

  • JSF 2.0 Problem

    - by Sarang
    I am doing a project where I am using JSF 2.0 & Primefaces UI Components. There is a tab view component with tabs, "Day","Week" & "Month". In all tab, I have to display Bar Charts in each. For the same, I am fetching three list using the following three method. In the following code, UpdateCountHelper is fetching the data from database. So, UpdateCountHelper is taking some time for fetching data. This is code for fetching lists : public List<UpdateCount> getDayUpdateCounts() { if (projectFlag == true) { if (displayFlag == 1) { dayUpdateCounts = UpdateCountHelper.getProjectUpdates(1); } else { dayUpdateCounts = UpdateCountHelper.getProjectUpdates(name, 1); } } else { dayUpdateCounts = UpdateCountHelper.getResourceUpdates(userName, 1); } return dayUpdateCounts; } public List<UpdateCount> getMonthUpdateCounts() { if (projectFlag == true) { if (displayFlag == 1) { monthUpdateCounts = UpdateCountHelper.getProjectUpdates(30); } else { monthUpdateCounts = UpdateCountHelper.getProjectUpdates(name, 30); } } else { monthUpdateCounts = UpdateCountHelper.getResourceUpdates(userName, 30); } return monthUpdateCounts; } public List<UpdateCount> getWeekUpdateCounts() { if (projectFlag == true) { if (displayFlag == 1) { weekUpdateCounts = UpdateCountHelper.getProjectUpdates(7); } else { weekUpdateCounts = UpdateCountHelper.getProjectUpdates(name, 7); } } else { weekUpdateCounts = UpdateCountHelper.getResourceUpdates(userName, 7); } return weekUpdateCounts; } This is code for UI of Bar Chart : <p:panel id="Chart"> <p:tabView dynamic="false" cache="false"> <p:tab title="Day"> <p:panel id="chartDayPanel"> <center> <h:outputText id="projectWiseDayText" rendered="#{systemUtilizationServiceBean.projectFlag}" value="Project Wise Day Update"/> <p:columnChart id="projectWiseDayUpdateChart" rendered="#{systemUtilizationServiceBean.projectFlag}" value="#{systemUtilizationServiceBean.dayUpdateCounts}" var="dayWiseUpdate" xfield="#{dayWiseUpdate.name}" height="200px" width="640px"> <p:chartSeries label="Project Wise Current Day Update" value="#{dayWiseUpdate.noUpdates}"/> </p:columnChart> <h:outputText id="resourceWiseDayText" rendered="#{systemUtilizationServiceBean.resourceFlag}" value="Resource Wise Day Update"/> <p:columnChart id="resourceWiseDayUpdateChart" rendered="#{systemUtilizationServiceBean.resourceFlag}" value="#{systemUtilizationServiceBean.dayUpdateCounts}" var="dayWiseResourceUpdate" xfield="#{dayWiseResourceUpdate.name}" height="200px" width="640px"> <p:chartSeries label="Resource Wise Current Day Update" value="#{dayWiseResourceUpdate.noUpdates}"/> </p:columnChart> </center> </p:panel> </p:tab> <p:tab title="Week"> <p:panel id="chartWeekPanel"> <center> <h:outputText id="projectWiseWeekText" rendered="#{systemUtilizationServiceBean.projectFlag}" value="Project Wise Week Update"/> <p:columnChart id="projectWiseWeekUpdateChart" rendered="#{systemUtilizationServiceBean.projectFlag}" value="#{systemUtilizationServiceBean.weekUpdateCounts}" var="weekWiseUpdate" xfield="#{weekWiseUpdate.name}" height="200px" width="640px"> <p:chartSeries label="Project Wise Current Week Update" value="#{weekWiseUpdate.noUpdates}"/> </p:columnChart> <h:outputText id="resourceWiseWeekText" rendered="#{systemUtilizationServiceBean.resourceFlag}" value="Resource Wise Week Update"/> <p:columnChart id="resourceWiseWeekUpdateChart" rendered="#{systemUtilizationServiceBean.resourceFlag}" value="#{systemUtilizationServiceBean.weekUpdateCounts}" var="weekWiseResourceUpdate" xfield="#{weekWiseResourceUpdate.name}" height="200px" 
width="640px"> <p:chartSeries label="Resource Wise Current Week Update" value="#{weekWiseResourceUpdate.noUpdates}"/> </p:columnChart> </center> </p:panel> </p:tab> <p:tab title="Month"> <p:panel id="chartMonthPanel"> <center> <h:outputText id="projectWiseMonthText" rendered="#{systemUtilizationServiceBean.projectFlag}" value="Project Wise Month Update"/> <p:columnChart id="projectWiseMonthUpdateChart" rendered="#{systemUtilizationServiceBean.projectFlag}" value="#{systemUtilizationServiceBean.monthUpdateCounts}" var="monthWiseUpdate" xfield="#{monthWiseUpdate.name}" height="200px" width="640px"> <p:chartSeries label="Project Wise Current Month Update" value="#{monthWiseUpdate.noUpdates}"/> </p:columnChart> <h:outputText id="resourceWiseMonthText" rendered="#{systemUtilizationServiceBean.resourceFlag}" value="Resource Wise Month Update"/> <p:columnChart id="resourceWiseMonthUpdateChart" rendered="#{systemUtilizationServiceBean.resourceFlag}" value="#{systemUtilizationServiceBean.monthUpdateCounts}" var="monthWiseResourceUpdate" xfield="#{monthWiseResourceUpdate.name}" height="200px" width="640px"> <p:chartSeries label="Resource Wise Current Month Update" value="#{monthWiseResourceUpdate.noUpdates}"/> </p:columnChart> </center> </p:panel> </p:tab> </p:tabView> </p:panel> Now, I have to display same data in other tabview with same tabs as mentioned above & only thing is now I have to display in Pie Chart. Now in pie chart, I am using the same lists. So, it will again fetch the data from database & waste time. To solve that problem I have created other three lists & have given only reference of those previous lists. So, now no database fetching occur. The Code for applying the reference is : public List<UpdateCount> getPieDayUpdateCounts() { pieDayUpdateCounts = dayUpdateCounts; return pieDayUpdateCounts; } public List<UpdateCount> getPieMonthUpdateCounts() { pieMonthUpdateCounts = monthUpdateCounts; return pieMonthUpdateCounts; } public List<UpdateCount> getPieWeekUpdateCounts() { pieWeekUpdateCounts = weekUpdateCounts; return pieWeekUpdateCounts; } But, over here the problem occurring is that only chart of which the tab is enable is displayed but the other remaining 2 tabs are not showing any chart. 
The code for UI is : <p:tabView dynamic="false" cache="false"> <p:tab title="Day"> <center> <p:panel id="pieChartDayPanel"> <h:outputText id="projectWiseDayPieChartText" rendered="#{systemUtilizationServiceBean.projectFlag}" value="Project Wise Day Update"/> <p:pieChart id="projectWiseDayUpdatePieChart" rendered="#{systemUtilizationServiceBean.projectFlag}" value="#{systemUtilizationServiceBean.dayUpdateCounts}" var="dayWisePieUpdate" categoryField="#{dayWisePieUpdate.name}" dataField="#{dayWisePieUpdate.noUpdates}" height="200" width="200"/> <h:outputText id="resourceWiseDayPieChartText" rendered="#{systemUtilizationServiceBean.resourceFlag}" value="Resource Wise Day Update"/> <p:pieChart id="resourceWiseDayUpdatePieChart" rendered="#{systemUtilizationServiceBean.resourceFlag}" value="#{systemUtilizationServiceBean.dayUpdateCounts}" var="dayWiseResourcePieUpdate" categoryField="#{dayWiseResourcePieUpdate.name}" dataField="#{dayWiseResourcePieUpdate.noUpdates}" height="200" width="200"/> </p:panel> </center> </p:tab> <p:tab title="Week"> <center> <p:panel id="pieChartWeekPanel"> <h:outputText id="projectWiseWeekPieChartText" rendered="#{systemUtilizationServiceBean.projectFlag}" value="Project Wise Week Update"/> <p:pieChart id="projectWiseWeekUpdatePieChart" rendered="#{systemUtilizationServiceBean.projectFlag}" value="#{systemUtilizationServiceBean.weekUpdateCounts}" var="weekWisePieUpdate" categoryField="#{weekWisePieUpdate.name}" dataField="#{weekWisePieUpdate.noUpdates}" height="200" width="200"/> <h:outputText id="resourceWiseWeekPieChartText" rendered="#{systemUtilizationServiceBean.resourceFlag}" value="Resource Wise Week Update"/> <p:pieChart id="resourceWiseWeekUpdatePieChart" rendered="#{systemUtilizationServiceBean.resourceFlag}" value="#{systemUtilizationServiceBean.weekUpdateCounts}" var="weekWiseResourcePieUpdate" categoryField="#{weekWiseResourcePieUpdate.name}" dataField="#{weekWiseResourcePieUpdate.noUpdates}" height="200" width="200"/> </p:panel> </center> </p:tab> <p:tab title="Month"> <center> <p:panel id="pieChartMonthPanel"> <h:outputText id="projectWiseMonthPieChartText" rendered="#{systemUtilizationServiceBean.projectFlag}" value="Project Wise Month Update"/> <p:pieChart id="projectWiseMonthUpdatePieChart" rendered="#{systemUtilizationServiceBean.projectFlag}" value="#{systemUtilizationServiceBean.monthUpdateCounts}" var="monthWisePieUpdate" categoryField="#{monthWisePieUpdate.name}" dataField="#{monthWisePieUpdate.noUpdates}" height="200" width="200"/> <h:outputText id="resourceWiseMonthPieChartText" rendered="#{systemUtilizationServiceBean.resourceFlag}" value="Resource Wise Month Update"/> <p:pieChart id="resourceWiseMonthUpdatePieChart" rendered="#{systemUtilizationServiceBean.resourceFlag}" value="#{systemUtilizationServiceBean.monthUpdateCounts}" var="monthWiseResourcePieUpdate" categoryField="#{monthWiseResourcePieUpdate.name}" dataField="#{monthWiseResourcePieUpdate.noUpdates}" height="200" width="200"/> </p:panel> </center> </p:tab> </p:tabView> What should be the reason behind this ?

    Read the article

  • Heroku- Could not find paperclip-3.1.3 in any of the sources

    - by otchkcom
    This morning when I tried to update my website, heroku didn't let me push the app. Here's the message I got. Fetching gem metadata from http://rubygems.org/....... Fetching gem metadata from http://rubygems.org/.. Fetching git://github.com/drhenner/nifty-generators.git Could not find paperclip-3.1.3 in any of the sources ! ! Failed to install gems via Bundler. ! ! Heroku push rejected, failed to compile Ruby/rails app ! [remote rejected] master -> master (pre-receive hook declined) I don't have paperclip- 3.1.3 in my gem file. I'm not sure why it's looking for paperclip 3.1.3 Here's my gem file source 'http://rubygems.org' gem 'rails', '~> 3.2.6' gem 'asset_sync' group :assets do gem 'uglifier', '>= 1.0.3' end gem 'sass-rails', " ~> 3.2.3" gem "activemerchant", '~> 1.17.0' #, :lib => 'active_merchant' gem 'authlogic', "3.0.3" gem 'bluecloth', '~> 2.1.0' gem 'cancan', '~> 1.6.7' gem 'compass', '~> 0.12.rc.0' gem 'compass-rails' gem 'dalli', '~> 1.1.5' gem "friendly_id", "~> 3.3" gem 'haml', ">= 3.0.13"#, ">= 3.0.4"#, "2.2.21"#, gem "jquery-rails" gem 'aws-sdk' group :production do gem 'pg' gem 'thin' end gem 'nested_set', '~> 1.6.3' gem 'nokogiri', '~> 1.5.0' gem 'paperclip', '~> 3.0' gem 'prawn', '~> 0.12.0' gem 'rails3-generators', '~> 0.17.0' gem 'rmagick', :require => 'RMagick' gem 'rake', '~> 0.9.2' gem 'state_machine', '~> 1.1.2' gem 'sunspot_solr' gem 'sunspot_rails', '~> 1.3.0rc' gem 'will_paginate', '~> 3.0.0' gem 'dynamic_form' group :development do gem 'sqlite3' gem "autotest-rails-pure" gem "rails-erd" gem "ruby-debug19" end group :test, :development do gem "rspec-rails", "~> 2.8.0" gem 'capybara', :git => 'git://github.com/jnicklas/capybara.git' gem 'launchy' gem 'database_cleaner' end group :test do gem 'factory_girl', "~> 3.3.0" gem 'factory_girl_rails', "~> 3.3.0" gem 'mocha', '~> 0.10.0', :require => false gem 'rspec-rails-mocha' gem "rspec", "~> 2.8.0" gem "rspec-core", "~> 2.8.0" gem "rspec-expectations", "~> 2.8.0" gem "rspec-mocks", "~> 2.8.0" gem 'email_spec' gem "faker" gem "autotest", '~> 4.4.6' gem "autotest-rails-pure" gem "autotest-growl" gem "ZenTest", '4.6.2' end
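
    For reference, the Gemfile above only asks for '~> 3.0', so the pin on 3.1.3 most likely comes from the checked-in Gemfile.lock; a hedged sketch of regenerating the lock file and redeploying, assuming that is the cause:

      bundle update paperclip          # resolve paperclip again and rewrite Gemfile.lock
      git add Gemfile Gemfile.lock
      git commit -m "Regenerate Gemfile.lock for paperclip"
      git push heroku master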

    Read the article

  • Permission Problem While Installing Module With CPAN

    - by neversaint
    I tried to install the following module using CPAN, but the message I get is "I have neither the -x permission ...". How can I resolve that? cpan[3]> install List::MoreUtils is it OK to try to connect to the Internet? [yes] Fetching with LWP: http://www.perl.org/CPAN/authors/id/V/VP/VPARSEVAL/List-MoreUtils-0.22.tar.gz CPAN: Digest::SHA loaded ok (v5.48) Fetching with LWP: http://www.perl.org/CPAN/authors/id/V/VP/VPARSEVAL/CHECKSUMS Checksum for /home/ewijaya/.cpan/sources/authors/id/V/VP/VPARSEVAL/List-MoreUtils-0.22.tar.gz ok Scanning cache /home/neversaint/.cpan/build for sizes .....I have neither the -x permission nor the permission to change the permission; cannot estimate disk usage of '/home/neversaint/.cpan/build/Module-Build-0.3607-Kvb1Vq' .I have neither the -x permission nor the permission to change the permission; cannot estimate disk usage of '/home/neversaint/.cpan/build/ExtUtils-ParseXS-2.2205-zuX4x2' ^CCaught SIGINT, trying to continue

    Read the article

  • Amazon like Ecommerce site and Recommendation system

    - by Hellnar
    Hello, I am planning to implement a basic recommendation system that uses Facebook Connect or similar social networking APIs to connect to a user's profile, analyze it based on tags, and use the results to generate item recommendations on my e-commerce site (which works similarly to Amazon). I believe I need to divide the work into these parts: fetching social networking data via the APIs (assuming the user allows this); analyzing that data and generating tokens; and using those information tokens to make item recommendations on my e-commerce site. E.g.: I am a fan of the band "The Strokes" on my Facebook account; the system analyzes this and recommends me a "The Strokes Live" CD. For each part (fetching data, making recommendations based on tags, ...), what algorithm or method would you recommend, or what is typically used? Thanks

    Read the article

  • Python script run via cron does not execute occassionally

    - by gcorne
    I have a simple python script for fetching tweets and caching them to disk that is configured to run every two minutes via cron. */2 * * * * (date ; /usr/bin/python /path/get_tweets.py) >> /path/log/get_tweets.log 2>&1 The script runs successfully most of the time. However, every so often the script doesn't execute. In addition to other logging, I added a simple print statement above the meat of the script and nothing except the output from the initial date command makes it to the log. #!/usr/bin/python # Script for fetching tweets and then storing them as an HTML snippet for inclusion using SSI print "Starting get_tweets.py" Any ideas? The system is a VPS running Centos 5.3 with python 2.4.

    Read the article

  • Phonegap and JqueryMobile freeze UI events and functions

    - by techytee
    I am developing a Phonegap app with jQuery Mobile, using their latest stable versions (Phonegap 3 and JQM 1.3.2), for the Android platform. My app downloads feeds from the Google Feeds API and saves them in a SQLite database. But whenever it starts fetching and saving data from the web (the number of feeds downloaded at a time can be large), other functionality such as button events halts and freezes. To be precise, a button that opens and closes a panel will neither open nor close the panel until the data fetching stops. How am I supposed to solve this issue? The performance of my app has dropped drastically because of it.

    Read the article

  • Sending data between activities within my app?

    - by user246114
    Hi, I have a TabActivity, and the tabs point to sub-activities. Is there a way I can send a 'message' to those child activities? I just want to pass a string across; not sure if this is possible. I have some data being fetched by the parent TabActivity, and the child tabs can't do anything useful until the parent is done fetching. When fetching is complete, I'd like to pass that data to the child activities so they can do something useful with it. Normally I'd set the data to be passed in the Intent when first creating the activity, but in this case I can't do that. Thanks
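
    One common pattern for this, sketched below, is a broadcast sent by the parent once fetching finishes, which each child activity listens for; the action string and extra key are placeholder assumptions:

      // In the TabActivity, once the fetch completes:
      Intent done = new Intent("com.example.app.DATA_READY");   // assumed action name
      done.putExtra("payload", fetchedString);
      sendBroadcast(done);

      // In each child activity:
      private final BroadcastReceiver receiver = new BroadcastReceiver() {
          @Override
          public void onReceive(Context context, Intent intent) {
              String payload = intent.getStringExtra("payload");
              // use the fetched data here
          }
      };

      @Override
      protected void onResume() {
          super.onResume();
          registerReceiver(receiver, new IntentFilter("com.example.app.DATA_READY"));
      }

      @Override
      protected void onPause() {
          super.onPause();
          unregisterReceiver(receiver);
      }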

    Read the article
