I am planning on making a Java Swing application and was wondering if Swing is still used or if it has been replaced with something else.
Thanks in advance!
I am going to develop a simple content management tool for a website. It needs to support news updates and some general announcements. We are planning to use a framework written in PHP, so can someone suggest a PHP framework? Thanks.
I am planning to learn JSON with .NET and am looking for a good book. I know there are lots of online resources, which I am referring to, but I would like to read a book; that way I can cover all the topics and learn more about JSON.
My custom callback class implements AsyncCallback
(like MyAsyncCallback implements AsyncCallback), and I am planning to use a single instance of MyAsyncCallback for multiple RPC method executions. Is this approach safe? Or do I have to create a new instance of MyAsyncCallback for every interaction from browser to server?
I am kind of tired of seeing so many anonymous AsyncCallback code blocks.
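For illustration, this is roughly the kind of shared, stateless callback I have in mind (a sketch only; the String payload type is just an example):

    import com.google.gwt.core.client.GWT;
    import com.google.gwt.user.client.rpc.AsyncCallback;

    // A single, stateless callback: it keeps no per-request fields,
    // so the idea is to reuse one instance across many RPC invocations.
    public class MyAsyncCallback implements AsyncCallback<String> {
        @Override
        public void onFailure(Throwable caught) {
            GWT.log("RPC failed: " + caught.getMessage());
        }

        @Override
        public void onSuccess(String result) {
            GWT.log("RPC returned: " + result);
        }
    }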
Thanks for your input
I'm an absolute newbie to web apps, so please forgive me if my question is naive or not up to the mark. I have been assigned a task to develop a web app which integrates the Twitter pull request with another application, using that application's APIs. Basically, I don't know what this means. Can anyone explain what is intended to be done? If there are any examples of such web apps, they would be most appreciated, so that I can get an idea of what actually needs to be done. I'm planning to do this using Ruby.
I have a question about how to structure communication between a (new) Firefox extension and existing C# code.
The Firefox extension will use configuration data and will produce other data, so it needs to get the config data from somewhere and save its output somewhere. The data is produced/consumed by existing C# code, so I need to decide how the extension should interact with the C# code.
Some pertinent factors:
It's only running on windows, in a relatively controlled corporate environment.
I have a windows service running on the machine, built in C#.
Storing the data in a local datastore (like sqlite) would be useful for other reasons.
The volume of data is low, e.g. 10kb of uncompressed xml every few minutes, and isn't very 'chatty'.
The data exchange can be asynchronous for the most part if not completely.
As with all projects, I have limited resources so want an option that's relatively easy.
It doesn't have to be ultra-high performance, but shouldn't add significant overhead.
I'm planning on building the extension in JavaScript (although I could be convinced otherwise if really necessary)
Some options I'm considering:
1. use an XPCOM to .NET/COM bridge
2. use a sqlite db: the extension would read from and save to it. The C# code would run in the service, populating the db and then processing the data created by the extension.
3. use TCP sockets to communicate between the extension and the service. Let the service manage a local data store.
My problem with (1) is that I think it will be tricky and not so easy, but I could be completely wrong. The main problem I see with (2) is the locking of sqlite: only a single process can write data at a time, so there'd be some blocking. However, it would generally be nice to have a local datastore, so this is an attractive option if the performance impact isn't too great. I don't know whether (3) would be particularly easy or hard, or what approach to take on the protocol: something custom or HTTP.
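To make option (3) a bit more concrete, here's a minimal sketch of the service side I'm imagining (the port number and the one-document-per-connection framing are assumptions, not decisions):

    using System;
    using System.IO;
    using System.Net;
    using System.Net.Sockets;
    using System.Text;

    // Minimal service-side listener for option (3): the extension connects to
    // localhost and writes one UTF-8 XML document per connection.
    class ExtensionListener
    {
        static void Main()
        {
            var listener = new TcpListener(IPAddress.Loopback, 9123); // arbitrary local port
            listener.Start();
            while (true)
            {
                using (TcpClient client = listener.AcceptTcpClient())
                using (var reader = new StreamReader(client.GetStream(), Encoding.UTF8))
                {
                    string xml = reader.ReadToEnd(); // one document per connection
                    Console.WriteLine("Received {0} characters of XML", xml.Length);
                    // hand the XML to the existing C# processing code here
                }
            }
        }
    }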
Any comments on these ideas or other suggestions?
UPDATE: I was planning on building the extension in JavaScript rather than C++
I'm planning on provisioning a web server and a database server in a server farm environment. They will be on the same network but not in the same domain; both are Windows Server 2008, and the database server is SQL Server 2008. My question is: what is the best way to secure data in transit between the servers? I've looked into IPsec and SSL but am not sure how to go about implementing either.
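For the SSL route specifically, my understanding is that SQL Server connection encryption is mostly driven from the client connection string plus a certificate on the database server; something like this on the web server side (server and database names are placeholders):

    using System.Data.SqlClient;

    class EncryptedConnectionExample
    {
        static void Main()
        {
            // Requests an encrypted channel to SQL Server; the database server
            // needs a certificate the client trusts for this to succeed.
            var builder = new SqlConnectionStringBuilder
            {
                DataSource = "dbserver01",   // placeholder server name
                InitialCatalog = "AppDb",    // placeholder database
                IntegratedSecurity = true,
                Encrypt = true,
                TrustServerCertificate = false
            };

            using (var connection = new SqlConnection(builder.ConnectionString))
            {
                connection.Open();
            }
        }
    }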
I am planning an application for time-table generation. I want to store data such as standard name, teacher name, and standard-wise subjects, and also relate the corresponding teacher to the subject and standard.
I was thinking of something like:
StandardMaster(stdid,stdname)
TeacherMaster(teacherid,teachername)
SubjectMaster(subid,subname,stdid)
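In rough SQL terms, that would be (the column types are my assumptions):

    CREATE TABLE StandardMaster (
        stdid   INT PRIMARY KEY,
        stdname VARCHAR(50)
    );

    CREATE TABLE TeacherMaster (
        teacherid   INT PRIMARY KEY,
        teachername VARCHAR(50)
    );

    CREATE TABLE SubjectMaster (
        subid   INT PRIMARY KEY,
        subname VARCHAR(50),
        stdid   INT REFERENCES StandardMaster(stdid)
    );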
What more do I need? Are any corrections needed in these three tables?
I was under the impression that the OpenID for a user remains constant and will not change, so I was planning to save it to my database for a given user.
I am allowing users to set up OpenID with my site on two different screens in my app.
However, to my surprise, I found different IDs for the same user, using the same Google account, on the two different screens.
How does OpenID actually work?
I am planning to use serialization to implement clone. I have to make my class ISerializable, but what about its superclasses and the classes of all referenced member variables? Do I need to make them all ISerializable?
If I use ISerializable, I have to implement GetObjectData(). What should I put inside that method? Is it OK to leave it empty?
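For reference, this is the kind of serialization-based clone I mean; in this sketch the types are made up and just use the [Serializable] attribute (so no GetObjectData is written by hand):

    using System;
    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    [Serializable]
    class Address
    {
        public string City;
    }

    [Serializable]
    class Person
    {
        public string Name;
        public Address Home; // every referenced type must be serializable too
    }

    static class DeepClone
    {
        // Serializes the object graph to memory and reads it back,
        // producing an independent copy of the whole graph.
        public static T Clone<T>(T source)
        {
            var formatter = new BinaryFormatter();
            using (var stream = new MemoryStream())
            {
                formatter.Serialize(stream, source);
                stream.Position = 0;
                return (T)formatter.Deserialize(stream);
            }
        }
    }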
I've always been fascinated by microcontrollers and I'm planning to do a few hobby projects just to satisfy my inner geek :)
I'm looking for ideas and motivation, so what did you develop using a microcontroller?
If possible please state the microcontroller and/or development environment and an estimate on hardware costs beyond the basic equipment (if applicable).
I'm interested in both successful and failed projects and any problems you encountered.
I'm planning on releasing two versions of my Android apps:
Free with ads
Paid without
What is the recommended way of handling this while leaving the database compatible with both versions?
The college I am currently studying at offers a degree in Computing (not Computer Engineering or IT, just Computing), and I am afraid I have made the wrong decision. I am planning to study Artificial Intelligence for my Masters, and I am not sure whether my qualification will be enough for this major.
Hi,
I am planning to have something like this for a website that is on Ruby on Rails: a user comes and enters a bunch of names in a text field, and a queue gets created from all the names. From there the website keeps asking for more details about each name in the queue until the queue is finished.
Is there any queue management gem available in Ruby, or do I just have to create an array and keep incrementing an index in a session variable to emulate queue behaviour?
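The array-in-session idea I have in mind would look roughly like this (a controller sketch; the parameter, action, and path names are made up):

    class NamesController < ApplicationController
      # User submits the text field: build the queue in the session.
      def create
        session[:pending_names] = params[:names].split(/[\s,]+/)
        redirect_to next_name_path
      end

      # Each following request shifts the next name off the queue.
      def next_name
        @current_name = session[:pending_names].shift
        redirect_to finished_path if @current_name.nil?   # queue is empty
      end
    end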
As far as I know, the WebSphere Application Server contains a Tomcat instance as its servlet container.
I'd like to know which version of Tomcat this is in WebSphere 7.0, since we are planning to use Tomcat for development but WAS for deployment. Obviously we would like to use the matching version.
I have read lots of programmers saying and writing that when programming in C/C++ there are lots of issues related to memory. I am planning to learn to program in C/C++. I have beginner knowledge of C/C++, and I want to see some short samples of why C/C++ can have issues with memory management. Please provide some samples.
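For example, are these the kinds of pitfalls people mean (deliberately buggy snippets)?

    #include <stdlib.h>
    #include <string.h>

    /* Leak: memory is allocated but never freed. */
    void leak_example(void)
    {
        char *buffer = malloc(100);
        if (buffer == NULL)
            return;
        strcpy(buffer, "hello");
        /* missing free(buffer): the 100 bytes are lost when we return */
    }

    /* Dangling pointer: returns the address of a local variable
       that stops existing when the function returns. */
    char *dangling_example(void)
    {
        char local[32] = "temporary";
        return local;
    }

    int main(void)
    {
        leak_example();
        char *p = dangling_example();
        (void)p; /* using p here would be undefined behaviour */
        return 0;
    }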
I am planning on storing scans from a mass spectrometer in a MySQL database and
would like to know whether storing and analyzing this amount of data is remotely
feasible. I know performance varies wildly depending on the environment, but I'm
looking for the rough order of magnitude: will queries take 5 days or 5
milliseconds?
Input format
Each input file contains a single run of the spectrometer; each run is comprised
of a set of scans, and each scan has an ordered array of datapoints. There is a
bit of metadata, but the majority of the file is comprised of arrays of 32- or
64-bit ints or floats.
Host system
|----------------+-------------------------------|
| OS | Windows 2008 64-bit |
| MySQL version | 5.5.24 (x86_64) |
| CPU | 2x Xeon E5420 (8 cores total) |
| RAM | 8GB |
| SSD filesystem | 500 GiB |
| HDD RAID | 12 TiB |
|----------------+-------------------------------|
There are some other services running on the server using negligible processor
time.
File statistics
|------------------+--------------|
| number of files | ~16,000 |
| total size | 1.3 TiB |
| min size | 0 bytes |
| max size | 12 GiB |
| mean | 800 MiB |
| median | 500 MiB |
| total datapoints | ~200 billion |
|------------------+--------------|
The total number of datapoints is a very rough estimate.
Proposed schema
I'm planning on doing things "right" (i.e. normalizing the data like crazy) and
so would have a runs table, a spectra table with a foreign key to runs,
and a datapoints table with a foreign key to spectra.
The 200 Billion datapoint question
I am going to be analyzing across multiple spectra and possibly even multiple
runs, resulting in queries which could touch millions of rows. Assuming I index
everything properly (which is a topic for another question) and am not trying to
shuffle hundreds of MiB across the network, is it remotely plausible for MySQL
to handle this?
UPDATE: additional info
The scan data will be coming from files in the XML-based
mzML format. The meat of this format is in the
<binaryDataArrayList> elements where the data is stored. Each scan produces >=
2 <binaryDataArray> elements which, taken together, form a 2-dimensional (or
more) array of the form [[123.456, 234.567, ...], ...].
These data are write-once, so update performance and transaction safety are not
concerns.
My naïve plan for a database schema is:
runs table
| column name | type |
|-------------+-------------|
| id | PRIMARY KEY |
| start_time | TIMESTAMP |
| name | VARCHAR |
|-------------+-------------|
spectra table
| column name | type |
|----------------+-------------|
| id | PRIMARY KEY |
| name | VARCHAR |
| index | INT |
| spectrum_type | INT |
| representation | INT |
| run_id | FOREIGN KEY |
|----------------+-------------|
datapoints table
| column name | type |
|-------------+-------------|
| id | PRIMARY KEY |
| spectrum_id | FOREIGN KEY |
| mz | DOUBLE |
| num_counts | DOUBLE |
| index | INT |
|-------------+-------------|
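Spelled out as rough MySQL DDL (the column types here are my guesses, not decisions):

    CREATE TABLE runs (
        id         BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
        start_time TIMESTAMP,
        name       VARCHAR(255)
    );

    CREATE TABLE spectra (
        id             BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
        name           VARCHAR(255),
        `index`        INT,               -- "index" is a reserved word, hence the backticks
        spectrum_type  INT,
        representation INT,
        run_id         BIGINT UNSIGNED,
        FOREIGN KEY (run_id) REFERENCES runs(id)
    );

    CREATE TABLE datapoints (
        id          BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
        spectrum_id BIGINT UNSIGNED,
        mz          DOUBLE,
        num_counts  DOUBLE,
        `index`     INT,
        FOREIGN KEY (spectrum_id) REFERENCES spectra(id)
    );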
Is this reasonable?
Hi all,
I have developed a Mac application which continuously interacts with a database on a Linux server. As the data grows, fetching it from the server has become costly in terms of time. So I am planning to store the required data locally, say in SQLite, and find some mechanism through which I can synchronize it with the database on the Linux server.
Can anyone suggest a way to accomplish this?
Thanks,
Miraaj
I am going to use forms authentication, but I want to be able to link the ASP.NET users to some tables in the DB. For example, if I have a class and students (as roles), I'll have a class-students table.
I'm planning to put in a Users table containing a simple int userid and the ASP.NET username, and to use that userid wherever I want to link to the users.
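In other words, something like this (table and column names are only illustrative):

    -- Maps an ASP.NET membership user to a compact integer key.
    CREATE TABLE Users (
        UserId   INT IDENTITY(1,1) PRIMARY KEY,
        UserName NVARCHAR(256) NOT NULL UNIQUE  -- same value as the ASP.NET username
    );

    -- Example of a domain table that links back to Users via the int key.
    CREATE TABLE ClassStudents (
        ClassId INT NOT NULL,
        UserId  INT NOT NULL REFERENCES Users(UserId),
        PRIMARY KEY (ClassId, UserId)
    );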
Does that sound good? Are there any other options for modeling this? Does it sound a bit convoluted?
I am planning to build an inverted index searching system with Cassandra as its storage backend, but I need some guidance on building a highly efficient searching daemon server. I know of a web server written in Python called Tornado. My questions are:
Is Python a good choice for developing such kind of app?
Is Nginx (or Sphinx) a good example whose architecture I can study to learn how to implement a highly efficient server?
Anything else I should learn to do this?
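For context, the index structure itself is conceptually small; here is a toy in-memory sketch (in the real system the postings would live in Cassandra, and the daemon around it is the hard part):

    from collections import defaultdict

    # Toy inverted index: term -> set of document ids containing that term.
    index = defaultdict(set)

    def add_document(doc_id, text):
        for term in text.lower().split():
            index[term].add(doc_id)

    def search(term):
        return index.get(term.lower(), set())

    add_document(1, "cassandra storage backend")
    add_document(2, "python tornado web server")
    print(search("tornado"))  # -> {2}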
Thank you~
I am planning to make a railway reservation project.
I am maintaining the following tables (a rough SQL sketch is at the end of this post):
trainTable (trainId, trainName, trainFrom, trainTo, trainDate, trainNoOfBoogies) ... PK (trainId)
Boogie (trainId, boogieId, boogieName, boogieNoOfseats) ... composite key (trainId, boogieId)
Seats (trainId, boogieId, seatId, seatStatus, seatType) ... composite key (trainId, boogieId, seatId)
user (userId, name, ... personal details)
userBooking (userId, trainId, boogieId, seatId)
Is this a good design? Please reply.
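The rough SQL sketch mentioned above (column types are my assumptions):

    CREATE TABLE trainTable (
        trainId          INT PRIMARY KEY,
        trainName        VARCHAR(100),
        trainFrom        VARCHAR(100),
        trainTo          VARCHAR(100),
        trainDate        DATE,
        trainNoOfBoogies INT
    );

    CREATE TABLE Boogie (
        trainId         INT,
        boogieId        INT,
        boogieName      VARCHAR(100),
        boogieNoOfseats INT,
        PRIMARY KEY (trainId, boogieId),
        FOREIGN KEY (trainId) REFERENCES trainTable(trainId)
    );

    CREATE TABLE Seats (
        trainId    INT,
        boogieId   INT,
        seatId     INT,
        seatStatus VARCHAR(20),
        seatType   VARCHAR(20),
        PRIMARY KEY (trainId, boogieId, seatId),
        FOREIGN KEY (trainId, boogieId) REFERENCES Boogie(trainId, boogieId)
    );

    CREATE TABLE user (
        userId INT PRIMARY KEY,
        name   VARCHAR(100)
        -- plus other personal details
    );

    CREATE TABLE userBooking (
        userId   INT,
        trainId  INT,
        boogieId INT,
        seatId   INT,
        FOREIGN KEY (userId) REFERENCES user(userId),
        FOREIGN KEY (trainId, boogieId, seatId) REFERENCES Seats(trainId, boogieId, seatId)
    );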
Soon I'll be launching my new site, and I was planning on using Gmail as the email server for things like registration and lost-password emails.
I'll be using Google Apps (free version) so I can have [email protected].
Besides the 500/day limit are there any other potential problems with using gmail for this service?
I'm planning to use the Autofac IoC container for my project, where I must implement auditing (who is doing what in the application). I have already read a lot of articles on this subject (auditing).
My intention was to use method interception to implement this functionality.
I know that Unity supports this, but I was wondering whether I can use Autofac for this scenario.
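From what I've read, Autofac can do interception through Castle DynamicProxy; this is the kind of setup I'm imagining (the service types are made up, and the extras package name may differ between Autofac versions, so treat this as an unverified sketch):

    using System;
    using Autofac;
    using Autofac.Extras.DynamicProxy;   // older releases shipped this as a "DynamicProxy2" package
    using Castle.DynamicProxy;

    // Hypothetical service whose calls should be audited.
    public interface IOrderService { void PlaceOrder(int orderId); }
    public class OrderService : IOrderService
    {
        public void PlaceOrder(int orderId) { /* real work here */ }
    }

    // Castle DynamicProxy interceptor that records who is doing what.
    public class AuditInterceptor : IInterceptor
    {
        public void Intercept(IInvocation invocation)
        {
            Console.WriteLine("{0} called {1}", Environment.UserName, invocation.Method.Name);
            invocation.Proceed();
        }
    }

    public static class ContainerConfig
    {
        public static IContainer Build()
        {
            var builder = new ContainerBuilder();
            builder.RegisterType<AuditInterceptor>();
            builder.RegisterType<OrderService>()
                   .As<IOrderService>()
                   .EnableInterfaceInterceptors()
                   .InterceptedBy(typeof(AuditInterceptor));
            return builder.Build();
        }
    }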
I was wondering if there's an implementation of an XML schema reader that, for an arbitrary node in an XML schema, provides the list of nodes which are supposed to be present as child nodes of the given node, the restrictions on those nodes, and so on.
I'm planning to program it for my own purposes, but I would like to know whether it has already been solved somewhere. I really only need a small subset of what I described above.
Thanks for tips!
How do you figure out the current size of a SharePoint web application? Better yet, the size of a site collection or a subsite?
I am planning to move a site collection from one farm to another, and I need to plan the storage capacity first.