Search Results

Search found 83878 results on 3356 pages for 'google data api'.


  • Majordomo/Mailman - Yahoo/Google Groups

    - by tom smith
    Hi. Not an app question, but I thought I might ask here anyway! The responses might help someone. I've got an app where we're going to be dealing with 5000-10000 people in a group/pool. Periodically, different subsets of people will break off into their own group. I'm looking for thoughts on how to manage/approach this situation. (For now, all of this is being managed on a few servers in the garage, with a dynamic IP.) I've looked into Yahoo/Google Groups, and they seem reasonable. The primary issue I see with this approach is that I don't have a good way of quickly/easily allowing a subset of the group to form their own group for a given project, and this kind of function is critical. The upside, though, is that I wouldn't have to set anything up, and the hosted groups could send emails to the users all day long without running into cap/bandwidth limits. The other approach is a managed list, like Mailman/Majordomo. This approach appears to be more flexible, and it looks like it could be modified to handle the quick creation of lists, allowing users to be assigned to different lists on the fly. The downside is that I'd have to run my own Mailman/Majordomo instance, as well as the associated mail server. Or I could look at using one of the hosted services, but this is a project on the cheap, so we're really trying to keep costs down. Thoughts/comments/pointers would be greatly appreciated. Thanks in advance -tom
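    If you do go the self-hosted Mailman route, the on-the-fly list creation you describe can be scripted. A minimal sketch, assuming Mailman 2.x with its newlist and add_members command-line tools; the install path and the helper function below are illustrative, not from the question:

        import subprocess

        MAILMAN_BIN = "/usr/lib/mailman/bin"  # assumption: common default location; adjust to your install

        def create_project_list(list_name, owner_email, password, member_file):
            # Create the list non-interactively.
            subprocess.check_call(
                [MAILMAN_BIN + "/newlist", "-q", list_name, owner_email, password])
            # Bulk-subscribe the project subset from a file of addresses, one per line.
            subprocess.check_call(
                [MAILMAN_BIN + "/add_members", "-r", member_file, list_name])

        # Example: spin up a list for a project subset on the fly.
        create_project_list("project-alpha", "[email protected]", "s3cret", "/tmp/alpha-members.txt")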

    Read the article

  • Build ATOM Feed Reader for ADO.net DATA Services feed

    - by khalil
    Hi, I have built an ADO.NET Data Services service to expose data in a SQL Server database as XML. What I want to be able to do is create a feed reader for this Atom feed in .NET, or maybe a user control which subscribes to this URI-based Atom feed from the ADO.NET data service and publishes the latest information on our website.
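    The question targets .NET, but the feed-reading mechanics are the same in any stack. A minimal Python sketch (standard library only, placeholder service URL) of pulling the Atom entries, just to show what the reader component has to do:

        import urllib.request
        import xml.etree.ElementTree as ET

        ATOM_NS = "{http://www.w3.org/2005/Atom}"
        FEED_URL = "http://example.com/MyService.svc/Customers"  # placeholder URI

        def read_feed(url):
            """Fetch an Atom feed and yield (title, updated) pairs for each entry."""
            with urllib.request.urlopen(url) as resp:
                root = ET.fromstring(resp.read())
            for entry in root.findall(ATOM_NS + "entry"):
                title = entry.findtext(ATOM_NS + "title", default="")
                updated = entry.findtext(ATOM_NS + "updated", default="")
                yield title, updated

        for title, updated in read_feed(FEED_URL):
            print(updated, title)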

    Read the article

  • How to return plain XML from ADO.NET data service

    - by KHALIL
    Hi, I was wondering how to return plain XML from ADO.NET Data Services. I have exposed an ADO.NET data service to different departments in our company who are not so technical. The data returned is an Atom feed, which is kind of hard to read/interpret in that format: too much information is returned. People from various departments would execute different queries (HTTP requests), and I wanted the results displayed as simple XML, or at least something more user friendly like HTML. I have tried setting the request's Accept header to plain XML and it still returns Atom. Thanks -- Khalil
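    One workaround, since the service keeps answering in Atom, is a small server-side proxy that strips the feed down to bare elements before the less technical users see it. A rough Python sketch, assuming the usual ADO.NET Data Services metadata namespace for the entry properties (verify it against your actual feed) and a placeholder URL:

        import urllib.request
        import xml.etree.ElementTree as ET

        ATOM = "{http://www.w3.org/2005/Atom}"
        META = "{http://schemas.microsoft.com/ado/2007/08/dataservices/metadata}"

        def atom_to_plain_xml(feed_url):
            """Collapse an ADO.NET Data Services Atom feed into <rows><row>...</row></rows>."""
            with urllib.request.urlopen(feed_url) as resp:
                feed = ET.fromstring(resp.read())
            rows = ET.Element("rows")
            for entry in feed.findall(ATOM + "entry"):
                props = entry.find(ATOM + "content/" + META + "properties")
                if props is None:
                    continue
                row = ET.SubElement(rows, "row")
                for prop in props:
                    # Drop the namespace prefix so the output reads as plain tags.
                    name = prop.tag.rsplit("}", 1)[-1]
                    ET.SubElement(row, name).text = prop.text
            return ET.tostring(rows, encoding="unicode")

        print(atom_to_plain_xml("http://example.com/Service.svc/Orders"))  # placeholder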

    Read the article

  • Authenticating to Google Search Appliance using Basic HTTP auth and ASP.NET (VB)

    - by Chainlink
    I've run into a snag which has to do with authentication between the Google Search Appliance and ASP.NET. Normally, when asking for secure pages from the search appliance, the appliance asks for credentials, then uses those credentials to try to access the secure results. If the attempt is successful, the page shows up in the results list. Since ASP.NET is contacting the search appliance on the client's behalf, it needs to collect credentials and pass them along to the appliance. I have tried a couple of different documented ways of accomplishing this, but they don't seem to work. Below is the code I have tried:

        'Bypass SSL since discovery.gov.mb.ca does not have valid SSL cert (NOT PRODUCTION SAFE)
        ServerCertificateValidationCallback = New System.Net.Security.RemoteCertificateValidationCallback(AddressOf customXertificateValidation)

        googleUrl = "https://removed.com"
        Dim rdr As New XmlTextReader(googleUrl)
        Dim resolver As New XmlUrlResolver()
        Dim myCred As New System.Net.NetworkCredential("USERNAME", "PASSWORD", Nothing)
        Dim credCache As New CredentialCache()
        credCache.Add(New Uri(googleUrl), "Basic", myCred)
        resolver.Credentials = credCache
        rdr.XmlResolver = resolver
        doc = New System.Xml.XPath.XPathDocument(rdr)
        path = doc.CreateNavigator()

        Private Function customXertificateValidation(ByVal sender As Object, ByVal certificate As System.Security.Cryptography.X509Certificates.X509Certificate, ByVal chain As System.Security.Cryptography.X509Certificates.X509Chain, ByVal sslPolicyErrors As Net.Security.SslPolicyErrors) As Boolean
            Return True
        End Function
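    For comparison, here is the same credential pass-through sketched in Python rather than VB (placeholder URL and credentials, and the certificate check is bypassed just like above, so not production safe). It only illustrates what the appliance needs to receive: a Basic Authorization header on the query request.

        import base64
        import ssl
        import urllib.request
        import xml.etree.ElementTree as ET

        SEARCH_URL = "https://removed.com"  # placeholder GSA query URL

        def query_gsa(url, username, password):
            token = base64.b64encode((username + ":" + password).encode()).decode()
            req = urllib.request.Request(url, headers={"Authorization": "Basic " + token})
            insecure = ssl._create_unverified_context()  # NOT production safe
            with urllib.request.urlopen(req, context=insecure) as resp:
                return ET.fromstring(resp.read())

        results = query_gsa(SEARCH_URL, "USERNAME", "PASSWORD")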

    Read the article

  • Parse and transform XML with missing elements into table structure

    - by dnlbrky
    I'm trying to parse an XML file. A simplified version of it looks like this: x <- '<grandparent><parent><child1>ABC123</child1><child2>1381956044</child2></parent><parent><child2>1397527137</child2></parent><parent><child3>4675</child3></parent><parent><child1>DEF456</child1><child3>3735</child3></parent><parent><child1/><child3>3735</child3></parent></grandparent>' library(XML) xmlRoot(xmlTreeParse(x)) ## <grandparent> ## <parent> ## <child1>ABC123</child1> ## <child2>1381956044</child2> ## </parent> ## <parent> ## <child2>1397527137</child2> ## </parent> ## <parent> ## <child3>4675</child3> ## </parent> ## <parent> ## <child1>DEF456</child1> ## <child3>3735</child3> ## </parent> ## <parent> ## <child1/> ## <child3>3735</child3> ## </parent> ## </grandparent> I'd like to transform the XML into a data.frame / data.table that looks like this: parent <- data.frame(child1=c("ABC123",NA,NA,"DEF456",NA), child2=c(1381956044, 1397527137, rep(NA, 3)), child3=c(rep(NA, 2), 4675, 3735, 3735)) parent ## child1 child2 child3 ## 1 ABC123 1381956044 NA ## 2 <NA> 1397527137 NA ## 3 <NA> NA 4675 ## 4 DEF456 NA 3735 ## 5 <NA> NA 3735 If each parent node always contained all of the possible elements ("child1", "child2", "child3", etc.), I could use xmlToList and unlist to flatten it, and then dcast to put it into a table. But the XML often has missing child elements. Here is an attempt with incorrect output: library(data.table) ## Flatten: dt <- as.data.table(unlist(xmlToList(x)), keep.rownames=T) setnames(dt, c("column", "value")) ## Add row numbers, but they're incorrect due to missing XML elements: dt[, row:=.SD[,.I], by=column][] column value row 1: parent.child1 ABC123 1 2: parent.child2 1381956044 1 3: parent.child2 1397527137 2 4: parent.child3 4675 1 5: parent.child1 DEF456 2 6: parent.child3 3735 2 7: parent.child3 3735 3 ## Reshape from long to wide, but some value are in the wrong row: dcast.data.table(dt, row~column, value.var="value", fill=NA) ## row parent.child1 parent.child2 parent.child3 ## 1: 1 ABC123 1381956044 4675 ## 2: 2 DEF456 1397527137 3735 ## 3: 3 NA NA 3735 I won't know ahead of time the names of the child elements, or the count of unique element names for children of the grandparent, so the answer should be flexible.
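    The question is about R, but the underlying idea — build one record per parent element and let the missing children fall out as NA — is easier to see in a short sketch. This one is in Python with pandas purely for illustration; it is not part of the original R setup:

        import pandas as pd
        import xml.etree.ElementTree as ET

        x = ("<grandparent><parent><child1>ABC123</child1><child2>1381956044</child2></parent>"
             "<parent><child2>1397527137</child2></parent>"
             "<parent><child3>4675</child3></parent>"
             "<parent><child1>DEF456</child1><child3>3735</child3></parent>"
             "<parent><child1/><child3>3735</child3></parent></grandparent>")

        root = ET.fromstring(x)
        # One dict per <parent>; children that are absent never get a key,
        # so the DataFrame constructor fills those cells with NaN.
        records = [{child.tag: child.text for child in parent} for parent in root]
        df = pd.DataFrame(records)
        print(df)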

    Read the article

  • Data Collection (Offline - no internet) and then syncing it to generate reports from server

    - by Nishant
    So, I have a new project I am planning to take on, and I need to know what skills will be required to achieve it. The project is to do intensive data collection in the field, where there is no internet access. As part of the data collection, images will be uploaded, and they will have to be resized, etc. Once the data collection occurs, this data needs to be consolidated and reported on. I am thinking there are two ways of generating the report: (1) a PDF that can be designed, or (2) some kind of executable file (since the PDF will be huge due to the multiple images, etc.) that is navigation friendly, with drop-downs and so on. It might not be an executable file; it could be a web page or some other way that this can be delivered to the client in a friendly, professional way. The PDF will still have to be generated somehow so that it can be printed as a hard copy. What languages and skill sets will I need to accomplish this project?
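    For the web-page flavour of option (2), a single self-contained HTML file is often the simplest professional-looking deliverable. A minimal Python sketch (Python is an assumption here, since the project's language isn't fixed, and the record fields are hypothetical) that inlines the photos so the whole report travels as one file:

        import base64
        import html

        def build_report(records, out_path):
            """Write one self-contained HTML page; images are inlined as base64 data URIs."""
            parts = ["<html><body><h1>Field Data Collection Report</h1>"]
            for rec in records:  # rec is a dict; the keys here are hypothetical
                parts.append("<h2>%s</h2><p>%s</p>" % (html.escape(rec["site"]),
                                                       html.escape(rec["notes"])))
                with open(rec["photo"], "rb") as f:
                    encoded = base64.b64encode(f.read()).decode("ascii")
                parts.append('<img src="data:image/jpeg;base64,%s" width="400"/>' % encoded)
            parts.append("</body></html>")
            with open(out_path, "w") as f:
                f.write("".join(parts))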

    Read the article

  • Can Core Data be used on Linux?

    - by glenc
    This might be a stupid question, but I was wondering whether or not you can use the Core Data libraries on Linux at all? I'm planning how to build the server side of an iPhone app that I'm working on, and have found that you can use PyObjC to get access to Core Data in a Python environment, e.g. use Core Data in a TurboGears web application. At this point I'm thinking that you would have to run the web server on Mac OSX, because I can't find any evidence on the internet that you can access the Objective-C libraries on Linux. I've always written webapps on Linux but will obviously make the jump to an OSX server if it allows me to use the same datastore implementation on the iPhone and the server, the only job remaining being the Core Data <- Web Services XML translation that has to happen on the wire.

    Read the article

  • Selecting element by data attribute

    - by zizo
    Is there an easy and straightforward method to select elements based on their data attribute? For example, select all anchors that have a data attribute named customerID with a value of 22. I am kind of hesitant to use rel or other attributes to store such information, but I find it much harder to select an element based on what data is stored in it. Thanks!

    Read the article

  • Shipping Java code with data baked into the .jar

    - by Andrew
    I need to ship some Java code that has an associated set of data. It's a simulator for a device, and I want to be able to include all of the data used for the simulated records in the one .jar file. In this case, each simulated record contains four fields (calling party, called party, start of call, call duration). What's the best way to do that? I've gone down the path of generating the data as Java statements, but IntelliJ doesn't seem particularly happy dealing with a 100,000-line Java source file! Is there a smarter way to do this? In the C#/.NET world I'd create the data as a separate file, embed it in the assembly as a resource, and then use reflection to pull it out at runtime and access it. I'm unsure what the appropriate analogy is in the Java world. FWIW, this is Java 1.6, shipping for Solaris.

    Read the article

  • R: convert data.frame columns from factors to characters

    - by Mike Dewar
    Hi, I have a data frame. Let's call him bob: > head(bob) phenotype exclusion GSM399350 3- 4- 8- 25- 44+ 11b- 11c- 19- NK1.1- Gr1- TER119- GSM399351 3- 4- 8- 25- 44+ 11b- 11c- 19- NK1.1- Gr1- TER119- GSM399352 3- 4- 8- 25- 44+ 11b- 11c- 19- NK1.1- Gr1- TER119- GSM399353 3- 4- 8- 25+ 44+ 11b- 11c- 19- NK1.1- Gr1- TER119- GSM399354 3- 4- 8- 25+ 44+ 11b- 11c- 19- NK1.1- Gr1- TER119- GSM399355 3- 4- 8- 25+ 44+ 11b- 11c- 19- NK1.1- Gr1- TER119- I'd like to concatenate the rows of this data frame (this will be another question). But look: > class(bob$phenotype) [1] "factor" Bob's columns are factors. So, for example: > as.character(head(bob)) [1] "c(3, 3, 3, 6, 6, 6)" "c(3, 3, 3, 3, 3, 3)" [3] "c(29, 29, 29, 30, 30, 30)" I don't begin to understand this, but I guess these are indices into the levels of the factors of the columns (of the court of king caractacus) of bob? Not what I need. Strangely I can go through the columns of bob by hand, and do bob$phenotype <- as.character(bob$phenotype) which works fine. And, after some typing, I can get a data.frame whose columns are characters rather than factors. So my question is: how can I do this automatically? How do I convert a data.frame with factor columns into a data.frame with character columns without having to manually go through each column? Bonus question: why does the manual approach work?

    Read the article

  • Set initial view with SkpWriter in Google Sketchup C++ SDK

    - by Peter Olsson
    How do you set the initial view for the model in an SKP file created with the SkpWriter in the Google SketchUp C++ SDK? There was an example in an older version of the SDK; part of the source is posted here. I'm trying to use:

        m_pDoc->GetModel()->SetCamera(cameraDefn);

    The problem is that I'm not able to create a valid atlast::sketchup::CCameraDefinition. None of the examples in the post above works:

        atlast::sketchup::CCameraDefinition cameraDefn;
        cameraDefn.Set(atlast::geometry::CPoint3d(793.838, -1262.6, 2603.16),
                       atlast::geometry::CPoint3d(567.977, 338.199, 398.932),
                       atlast::geometry::CUnitVector3d(-0.112657, 0.798459, 0.591415));

    and:

        atlast::sketchup::CCameraDefinition cameraDefn;
        cameraDefn.Set(atlast::geometry::CPoint3d(793.838, -1262.6, 2603.16),
                       atlast::geometry::CPoint3d(567.977, 338.199, 398.932),
                       atlast::geometry::CUnitVector3d(-0.112657, 0.798459, 0.591415));

    In the end I want the initial view to be the view you get from pressing the Zoom Extents icon followed by the Iso icon (the other way around is also OK). Right now I would settle for creating a valid atlast::sketchup::CCameraDefinition. Is there any better way to achieve this in the SKP file?

    Read the article

  • Get control instance in asp.net dynamic data

    - by Ashwani K
    Hello all: I am creating a web application using ASP.NET Dynamic Data. I am using a GridView to show data from the database. In the grid view I have the following code for the columns:

        <Columns>
            <asp:DynamicField DataField="UserId" UIHint="Label" />
            <asp:DynamicField DataField="Address" UIHint="Address"/>
            <asp:DynamicField DataField="CreatedDate" UIHint="Label" />
        </Columns>

    But before displaying, I want to do some processing in C# code for each row. In a normal ASP.NET grid view we can handle the OnRowDataBound method, and using FindControl("controlid") we can get the control instance; but in the case of dynamic data, I am not getting any id attribute for the columns, so I am not able to get the control instance to show updated data in that control depending on some conditions. Thanks, Ashwani

    Read the article

  • Neo4j Reading data / performing shortest path calculations on stored data

    - by paddydub
    I'm using the Batch_Insert example to insert data into the database. How can I read this data back from the database? I can't find any examples of how to do this.

        public static void CreateData() {
            // create the batch inserter
            BatchInserter inserter = new BatchInserterImpl( "var/graphdb",
                BatchInserterImpl.loadProperties( "var/neo4j.props" ) );
            Map<String,Object> properties = new HashMap<String,Object>();
            properties.put( "name", "Mr. Andersson" );
            properties.put( "age", 29 );
            long node1 = inserter.createNode( properties );
            properties.put( "name", "Trinity" );
            properties.remove( "age" );
            long node2 = inserter.createNode( properties );
            inserter.createRelationship( node1, node2,
                DynamicRelationshipType.withName( "KNOWS" ), null );
            inserter.shutdown();
        }

    I would like to store graph data in the database:

        graph.makeEdge( "s", "c", "cost", (double) 7 );
        graph.makeEdge( "c", "e", "cost", (double) 7 );
        graph.makeEdge( "s", "a", "cost", (double) 2 );
        graph.makeEdge( "a", "b", "cost", (double) 7 );
        graph.makeEdge( "b", "e", "cost", (double) 2 );
        Dijkstra<Double> dijkstra = getDijkstra( graph, 0.0, "s", "e" );

    What is the best method to store this kind of data with tens of thousands of edges, and then run the Dijkstra algorithm to find shortest paths using the stored graph data?

    Read the article

  • Django and Google App Engine Helper not finding the ipaddr module.

    - by Phil
    I'm trying to get Django running on GAE using this tutorial. When I run python manage.py runserver I get the stacktrace below. I'm new to both django and python so I don't know what my next steps are (This is Ubuntu Jaunty btw). It seems django isn't finding the GAE module ipaddr which comes with SDK 1.3.1. How do I get django to find this module? /home/username/bin/google_appengine/google/appengine/api/datastore_file_stub.py:40: DeprecationWarning: the md5 module is deprecated; use hashlib instead import md5 /home/username/bin/google_appengine/google/appengine/api/memcache/__init__.py:31: DeprecationWarning: the sha module is deprecated; use the hashlib module instead import sha Traceback (most recent call last): File "manage.py", line 18, in <module> InstallAppengineHelperForDjango() File "/home/username/Development/GAE/myapp/appengine_django/__init__.py", line 543, in InstallAppengineHelperForDjango InstallDjangoModuleReplacements() File "/home/username/Development/GAE/myapp/appengine_django/__init__.py", line 260, in InstallDjangoModuleReplacements import django.db File "/home/username/Development/GAE/myapp/django/db/__init__.py", line 57, in <module> 'TIME_ZONE': settings.TIME_ZONE, File "/home/username/Development/GAE/myapp/appengine_django/db/base.py", line 117, in __init__ self._setup_stubs() File "/home/username/Development/GAE/myapp/appengine_django/db/base.py", line 128, in _setup_stubs from google.appengine.tools import dev_appserver_main File "/home/username/bin/google_appengine/google/appengine/tools/dev_appserver_main.py", line 82, in <module> from google.appengine.tools import appcfg File "/home/username/bin/google_appengine/google/appengine/tools/appcfg.py", line 53, in <module> from google.appengine.api import dosinfo File "/home/username/bin/google_appengine/google/appengine/api/dosinfo.py", line 25, in <module> import ipaddr ImportError: No module named ipaddr
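    One thing worth checking is whether the SDK's bundled libraries are on sys.path before the helper imports django.db. A heavily hedged sketch of forcing that from the top of manage.py is below; the SDK location and the exact lib subdirectory names are assumptions and may differ in your SDK version:

        import os
        import sys

        SDK_ROOT = os.path.expanduser("~/bin/google_appengine")  # adjust to your install
        EXTRA_PATHS = [
            SDK_ROOT,
            os.path.join(SDK_ROOT, "lib", "ipaddr"),      # assumption: bundled ipaddr lives here
            os.path.join(SDK_ROOT, "lib", "webob"),        # assumption
            os.path.join(SDK_ROOT, "lib", "yaml", "lib"),  # assumption
        ]
        # Must run before the App Engine helper imports django.db / dev_appserver_main.
        sys.path = EXTRA_PATHS + sys.path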

    Read the article

  • Oracle Data Pump import to a SQL file error: ORA-31655 no data or metadata objects

    - by Francisco Quiñones
    Hello, I'm using Data Pump to export/import data; one requirement is to import the data to a SQL file. The OS is Windows. I made the following export:

        expdp system/password directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY tables=user.tablename

    and it works; I can see the file TABLESDUMP.DMP in the directory path. Then, when I tried to import it to a SQL file:

        impdp system/password directory=dpump_dir dumpfile=tablesdump.dmp sqlfile=tables_export.sql

    the log shows:

        ..... ORA-31655 no data or metadata objects selected for job .....

    and the SQL file is created empty in the directory path. I'm not a DBA, I'm a Java developer. Can you help me? Thanks

    Read the article

  • Can I load data (Google App Engine) from http://localhost:8100/remote_api?

    - by zjm1126
    i can download data from gae (http://zjm1126.appspot.com/remote_api), this is code: appcfg.py download_data --application=zjm1126 --url=http://zjm1126.appspot.com/remote_api --filename=a.csv and it successful : D:\zjm_demo\app>appcfg.py download_data --application=zjm1126 --url=http://zjm1 126.appspot.com/remote_api --filename=a.csv Downloading data records. [INFO ] Logging to bulkloader-log-20100618.162421 [INFO ] Throttling transfers: [INFO ] Bandwidth: 250000 bytes/second [INFO ] HTTP connections: 8/second [INFO ] Entities inserted/fetched/modified: 20/second [INFO ] Batch Size: 10 [INFO ] Opening database: bulkloader-progress-20100618.162421.sql3 [INFO ] Opening database: bulkloader-results-20100618.162421.sql3 [INFO ] Connecting to zjm1126.appspot.com/remote_api Please enter login credentials for zjm1126.appspot.com Email: [email protected] Password for [email protected]: [INFO ] Downloading kinds: [u'LogText', u'Greeting', u'Forum', u'Thread'] .... [INFO ] Have 0 entities, 0 previously transferred [INFO ] 0 entities (8804 bytes) transferred in 11.3 seconds so i want to know can load data from 127.0.0.1 , this is my code : appcfg.py download_data --application=zjm1126 --url=http://localhost:8100/remote_api --filename=a.csv and the error is : D:\zjm_demo\app>appcfg.py download_data --application=zjm1126 --url=http://loca lhost:8100/remote_api --filename=a.csv Downloading data records. [INFO ] Logging to bulkloader-log-20100618.162325 [INFO ] Throttling transfers: [INFO ] Bandwidth: 250000 bytes/second [INFO ] HTTP connections: 8/second [INFO ] Entities inserted/fetched/modified: 20/second [INFO ] Batch Size: 10 [INFO ] Opening database: bulkloader-progress-20100618.162325.sql3 [INFO ] Opening database: bulkloader-results-20100618.162325.sql3 Please enter login credentials for localhost Email: [email protected] Password for [email protected]: [INFO ] Connecting to localhost:8100/remote_api [ERROR ] Exception during authentication Traceback (most recent call last): File "d:\Program Files\Google\google_appengine\google\appengine\tools\bulkload er.py", line 3169, in Run self.request_manager.Authenticate() File "d:\Program Files\Google\google_appengine\google\appengine\tools\bulkload er.py", line 1178, in Authenticate remote_api_stub.MaybeInvokeAuthentication() File "d:\Program Files\Google\google_appengine\google\appengine\ext\remote_api \remote_api_stub.py", line 542, in MaybeInvokeAuthentication datastore_stub._server.Send(datastore_stub._path, payload=None) File "d:\Program Files\Google\google_appengine\google\appengine\tools\appengin e_rpc.py", line 346, in Send f = self.opener.open(req) File "D:\Python25\lib\urllib2.py", line 387, in open response = meth(req, response) File "D:\Python25\lib\urllib2.py", line 498, in http_response 'http', request, response, code, msg, hdrs) File "D:\Python25\lib\urllib2.py", line 425, in error return self._call_chain(*args) File "D:\Python25\lib\urllib2.py", line 360, in _call_chain result = func(*args) File "D:\Python25\lib\urllib2.py", line 506, in http_error_default raise HTTPError(req.get_full_url(), code, msg, hdrs, fp) HTTPError: HTTP Error 404: Not Found [INFO ] Authentication Failed so what should i do , thanks

    Read the article

  • Generating video or images of geometrical objects from data

    - by Jonathan Barbero
    Hello, I'm working on a course project to predict the velocity and position of the solar system planets (and other objects). It would be really cool if I could visualize the predicted object data: 3D images if possible, and video would be amazing. Do you know of any library that lets me use this data to generate an image or video? (I don't care which language.) The data consists of: the simulation step (the timeline step for a video), the positions of the objects, and the radius and/or colours of the objects. Thanks in advance, any suggestion is welcome.
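    As one concrete option, matplotlib (with numpy) can turn position arrays into 3D trajectory images, and a video is just those frames stitched together. A small Python sketch with made-up orbit data; your real simulation output would replace it:

        import numpy as np
        import matplotlib.pyplot as plt
        from mpl_toolkits.mplot3d import Axes3D  # registers the 3D projection on older matplotlib

        # positions: body name -> (n_steps, 3) array of x, y, z (made-up circular orbits here)
        t = np.linspace(0, 2 * np.pi, 200)
        positions = {
            "planet_a": np.column_stack([np.cos(t), np.sin(t), np.zeros_like(t)]),
            "planet_b": np.column_stack([2 * np.cos(t), 2 * np.sin(t), 0.1 * np.sin(t)]),
        }

        fig = plt.figure()
        ax = fig.add_subplot(111, projection="3d")
        for name, xyz in positions.items():
            ax.plot(xyz[:, 0], xyz[:, 1], xyz[:, 2], label=name)
        ax.legend()
        fig.savefig("orbits.png")  # save one frame; loop over time steps to build video frames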

    Read the article

  • Using MySQL as data source in Microsoft SQL Server Analysis Services

    - by coldilocks
    Hi, I have installed the latest .NET connector (http://www.mysql.com/downloads/connector/net/). I can add MySQL databases as data sources, and I can even browse through the data from Business Intelligence Studio. The problem is that I CANNOT create a data source view; if I do create one without tables, trying to add them after the fact gives me the same error. Specifically, it looks like the data source view wizard tries to submit queries against the MySQL database using square brackets/braces, and the query bombs. I get an error message like:

        You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '[my_db].[cheatType]' at line 2

    So, in summary, has anyone been able to create a data source view using MySQL tables and, if so, can they please show me how this can be done? Thanks for any help!

    Read the article

  • Getting an Android App to Show Up in the market for "Sony Internet TV" (Google TV)

    - by user1291659
    I'm having a bit of trouble getting my app to show up in the market under GoogleTV. I've searched google's official documentation and I don't believe the manifest lists any elements which would invalidate the program; the only hardware requirement specified is landscape mode, wakelock and external storage(neither which should cause it to be filtered for GTV according to the documentation) and I set the uses touchscreen elements "required" attribute to false. below is the AndroidManifest.xml for my project: <?xml version="1.0" encoding="utf-8"?> <manifest xmlns:android="http://schemas.android.com/apk/res/android" package="com.whateversoft" android:versionCode="2" android:versionName="0.1" > <uses-sdk android:minSdkVersion="8" /> <application android:icon="@drawable/ic_launcher" android:label="Color Shafted" android:theme="@style/Theme.NoBackground" android:debuggable="false"> <activity android:label="Color Shafted" android:name=".colorshafted.ColorShafted" android:configChanges = "keyboard|keyboardHidden|orientation" android:screenOrientation = "landscape"> <!-- Set as the default run activity --> <intent-filter > <action android:name="android.intent.action.MAIN" /> <category android:name="android.intent.category.LAUNCHER" /> </intent-filter> </activity> <activity android:label="Color Shafted Settings" android:name=".colorshafted.Settings" android:theme="@android:style/Theme" android:configChanges = "keyboard|keyboardHidden"> <!-- --> </activity> </application> <!-- DEFINE PERMISSIONS FOR CAPABILITIES --> <uses-permission android:name = "android.permission.WRITE_EXTERNAL_STORAGE"/> <uses-permission android:name = "android.permission.WAKE_LOCK"/> <uses-feature android:name="android.hardware.touchscreen" android:required="false" /> <!-- END OF PERMISSIONS FOR CAPABILITIES --> </manifest> I'm about to start promoting the app after the next major release so its been kind of a bummer since I can't seem to get this to work. Any help would be appreciated, thanks in advance : )

    Read the article

  • Unit testing and mocking email sender in Python with Google AppEngine

    - by CVertex
    I'm a newbie to Python and the App Engine. I have this code that sends an email based on request params after some auth logic. In my unit tests (I'm using GAEUnit), how do I confirm that an email with specific contents was sent? I.e., how do I mock the emailer with a fake emailer to verify send was called?

        class EmailHandler(webapp.RequestHandler):
            def bad_input(self):
                self.response.set_status(400)
                self.response.headers['Content-Type'] = 'text/plain'
                self.response.out.write("<html><body>bad input </body></html>")

            def get(self):
                to_addr = self.request.get("to")
                subj = self.request.get("subject")
                msg = self.request.get("body")
                if not mail.is_email_valid(to_addr):
                    # Return an error message...
                    # self.bad_input()
                    pass
                # authenticate here
                message = mail.EmailMessage()
                message.sender = "[email protected]"
                message.to = to_addr
                message.subject = subj
                message.body = msg
                message.send()
                self.response.headers['Content-Type'] = 'text/plain'
                self.response.out.write("<html><body>success!</body></html>")

    And the unit tests:

        import unittest
        from webtest import TestApp
        from google.appengine.ext import webapp
        from email import EmailHandler

        class SendingEmails(unittest.TestCase):
            def setUp(self):
                self.application = webapp.WSGIApplication([('/', EmailHandler)], debug=True)

            def test_success(self):
                app = TestApp(self.application)
                response = app.get('http://localhost:8080/[email protected]&body=blah_blah_blah&subject=mySubject')
                self.assertEqual('200 OK', response.status)
                self.assertTrue('success' in response)
                # somehow, assert email was sent
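    One way to assert on the send, without real mail going out, is to monkey-patch mail.EmailMessage.send for the duration of the test and record what it was called with. This is only one approach, not the SDK's official testing hook, and it assumes the test runs inside the SDK environment where google.appengine.api.mail is importable; a sketch:

        import unittest
        from webtest import TestApp
        from google.appengine.api import mail
        from google.appengine.ext import webapp
        from email import EmailHandler  # the handler module shown above

        class EmailSendTest(unittest.TestCase):
            def setUp(self):
                self.application = webapp.WSGIApplication([('/', EmailHandler)], debug=True)
                self.sent = []
                self._real_send = mail.EmailMessage.send
                # Record the message instead of sending it.
                mail.EmailMessage.send = lambda message: self.sent.append(message)

            def tearDown(self):
                mail.EmailMessage.send = self._real_send

            def test_email_contents(self):
                app = TestApp(self.application)
                app.get('/[email protected]&body=blah&subject=mySubject')
                self.assertEqual(1, len(self.sent))
                self.assertEqual('[email protected]', self.sent[0].to)
                self.assertEqual('mySubject', self.sent[0].subject)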

    Read the article

  • Flex 3: should I provide prepared data to my component or have it process the data before display?

    - by grapkulec
    I'm starting to learn a little Flex just for fun and maybe to prove that I still can learn something new :) I have some idea for a project and one of its parts is a tree component which could display data in different ways depending on configuration. The idea There is list of objects having properties like id, date, time, name, description. And sometimes list should be displayed like this: first level: date second level: time third level: name and sometimes like this: first level: year second level: month third level: day fourth level: time and name By level I mean level of nesting of course. So, we can have years, that have months, that have days, that have hours and so forth. The problem What could be the best way to do it? I mean, should I prepare data for different ways of nesting outside of component or even outside of flex? I can do it at web service level in C# where I plan to have database access layer and send to flex nice and ready to display XML or array of objects. But I wonder if that won't cause additional and maybe unneccessary network traffic. I tried to hack some code in my component to convert my data objects into XML or ArrayCollection but I don't know enough of Flex and got stuck on elimination of duplicates or getting specific data by some key value. Usually to do such things I have STL with maps, sets and vectors and I find Flex arrays and even Dictionary a little bit confusing (I've read language reference and googled without any significant luck). The question So, to sum things up: should I give my tree component data prepared just for chosen type of display or should I try to do it internally inside component (or some helper class written in ActionScript)?
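    Whichever side ends up doing the shaping, the grouping itself is cheap. Here is a language-agnostic illustration (written in Python rather than ActionScript, with hypothetical field names) of folding a flat record list into configurable nesting levels, which is essentially what either the web service or the component would do:

        from collections import defaultdict

        def nest(records, key_funcs):
            """Group flat records into nested dicts, one level per key function."""
            if not key_funcs:
                return records
            first, rest = key_funcs[0], key_funcs[1:]
            groups = defaultdict(list)
            for rec in records:
                groups[first(rec)].append(rec)
            return {key: nest(group, rest) for key, group in groups.items()}

        records = [  # hypothetical field names
            {"id": 1, "date": "2010-05-01", "time": "09:00", "name": "alpha"},
            {"id": 2, "date": "2010-05-01", "time": "10:30", "name": "beta"},
            {"id": 3, "date": "2010-05-02", "time": "09:00", "name": "gamma"},
        ]
        # "first level: date, second level: time, third level: name"
        tree = nest(records, [lambda r: r["date"], lambda r: r["time"], lambda r: r["name"]])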

    Read the article

  • Mock Object Data

    - by Nissan Fan
    I'd like to mock up object data, not the objects themselves. In other words, I would like to generate a collection of n objects and pass it into a function which generates random data strings and numbers for them. Is there anything that does this? Think of it as a Lorem Ipsum for object data. Constraints around numerical ranges etc. are not necessary, but would be a bonus.
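    If nothing off the shelf turns up, a small reflection-based filler is easy to sketch. This one is in Python (an assumption, since the question doesn't name a language); it overwrites string attributes with lorem words and numeric attributes with random values, and the Customer class is just a hypothetical sample:

        import random

        WORDS = "lorem ipsum dolor sit amet consectetur adipiscing elit".split()

        def lorem_fill(obj, n_words=5, num_range=(0, 1000)):
            """Overwrite string attributes with lorem text and numeric attributes with random values."""
            for name, value in vars(obj).items():
                if isinstance(value, str):
                    setattr(obj, name, " ".join(random.choice(WORDS) for _ in range(n_words)))
                elif isinstance(value, (int, float)):
                    setattr(obj, name, random.randint(*num_range))
            return obj

        class Customer(object):  # hypothetical sample class
            def __init__(self):
                self.name = ""
                self.city = ""
                self.orders = 0

        fakes = [lorem_fill(Customer()) for _ in range(10)]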

    Read the article

  • Google App Engine - Caching generated HTML

    - by Alexander
    I have written a Google App Engine application that programmatically generates a bunch of HTML code that is really the same output for each user who logs into my system, and I know that this is going to be inefficient when the code goes into production. So, I am trying to figure out the best way to cache the generated pages. The most probable option is to generate the pages and write them into the database, and then check the time of the database put operation for a given page against the time that the code was last updated. Then, if the code is newer than the last put to the database (for a particular HTML request), new HTML will be generated, served, and cached to the database. If the code is older than the last put to the database, then I will just get the HTML directly from the database and serve it (therefore avoiding all the CPU wastage of generating the HTML). I am not only looking to minimize load times, but to minimize CPU usage. However, one issue that I am having is that I can't figure out how to programmatically check when the version of code uploaded to App Engine was updated. I am open to any suggestions on this approach, or other approaches for caching generated HTML. Note that while memcache could help in this situation, I believe that it is not the final solution, since I really only need to re-generate HTML when the code is updated (as opposed to every time the memcache entry expires). Kind regards, and thank you in advance for any suggestions you may be able to offer. -Alex
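    On the "when was the code last updated" question: as I understand it, the Python runtime exposes the deployed version, including a deployment-specific suffix, in os.environ['CURRENT_VERSION_ID'], so folding that into the cache key makes an entry go stale exactly when a new version is deployed rather than on a timer — which also addresses the memcache-expiry concern (the same key scheme works for a datastore entity if you want durable storage instead). A sketch, worth verifying against the current docs:

        import os
        from google.appengine.api import memcache

        def cached_page(page_name, generate_html):
            """Return cached HTML for this code version, regenerating only after a new deployment."""
            key = "page:%s:%s" % (page_name, os.environ.get("CURRENT_VERSION_ID", "dev"))
            html = memcache.get(key)
            if html is None:
                html = generate_html()
                memcache.set(key, html)  # no expiry needed; the key changes on deploy
            return html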

    Read the article
