Search Results

Search found 10751 results on 431 pages for 'day cq'.

Page 300/431 | < Previous Page | 296 297 298 299 300 301 302 303 304 305 306 307  | Next Page >

  • how to aggregate this data in R

    - by stevejb
    Hello, I have a data frame in R with the following structure:

        > testData
                    date exch.code comm.code     oi
        1     1997-12-30       CBT         1 468710
        2     1997-12-23       CBT         1 457165
        3     1997-12-19       CBT         1 461520
        4     1997-12-16       CBT         1 444190
        5     1997-12-09       CBT         1 446190
        6     1997-12-02       CBT         1 443085
        ....
        77827 2004-10-26      NYME       967  10038
        77828 2004-10-19      NYME       967   9910
        77829 2004-10-12      NYME       967  10195
        77830 2004-09-28      NYME       967   9970
        77831 2004-08-31      NYME       967   9155
        77832 2004-08-24      NYME       967   8655

    What I want to do is produce a table that shows, for a given date and commodity, the total oi across every exchange code. So the rows would be made up of unique(testData$date), the columns would be unique(testData$comm.code), and each cell would be the total oi over all exch.codes on a given day. Thanks,
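
    The question is R-specific, but the crosstab it describes (one row per date, one column per commodity code, each cell summing oi over all exchange codes) can be sketched with pandas in Python; the frame below is a tiny hypothetical stand-in for testData, with underscores in place of dots in the column names.

        import pandas as pd

        # Hypothetical rows in the same shape as testData.
        test_data = pd.DataFrame({
            "date": ["1997-12-30", "1997-12-30", "1997-12-23"],
            "exch_code": ["CBT", "NYME", "CBT"],
            "comm_code": [1, 967, 1],
            "oi": [468710, 10038, 457165],
        })

        # One row per date, one column per comm_code, each cell the
        # total oi summed across all exch_code values for that day.
        table = pd.pivot_table(test_data, index="date", columns="comm_code",
                               values="oi", aggfunc="sum")
        print(table)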

    Read the article

  • Database table design vs. ease of use.

    - by Gastoni
    I have a table with 3 fields: color, fruit, date. I can pick 1 fruit and 1 color, but I can do this only once each day. Examples:

        red, apple, monday
        red, mango, monday
        blue, apple, monday
        blue, mango, monday
        red, apple, tuesday

    The two ways in which I could build the table are:

    1. Have color, fruit and date form a composite primary key (PK). This makes it easy to insert data into the table because all the validation needed is done by the database.

        PK color
        PK fruit
        PK date

    2. Have an id column set as the PK and then all the other fields. Many say that's the way it should be, because composite PKs are evil. For example, CakePHP does not support them.

        PK id
           color
           fruit
           date

    Both have advantages. Which would be the 'better' approach?

    Read the article

  • precise geolocalization via IP

    - by meo
    I tried the iPad the other day, and was amazed at the precision of its geolocalization by IP. Actually, there is this action against hunger in the world that shows you very precisely where the people who have taken part in this petition are located: http://www.1billionhungry.org/meodai/impact/ I would like to integrate that into one of my projects. I took a look at the source but I could not figure out how they did it. Can someone help me out? Is there a web service for that? Is the Google Maps API doing this, or are they using another service? PS: It's not just the country/region of your IP/ISP IP that the service gives back; it's a pretty precise position.

    Read the article

  • How many pages can I add to a new website without ending up in the Google sandbox?

    - by François
    Hello there, I'm about to launch a new website which has 15,000 pages (an ecommerce store). Of course, I will not publish all of these pages at the same time, but I'm looking for information on how many pages I should start with without ending up in the sandbox. For example, could I start with a 50-page website, or is that too much? Then, do you have an idea (I know there are no precise rules here) of the frequency/volume of pages I could add later? (Is 50 pages a day OK?) Thank you very much for your advice, and sorry for my English :-)

    Read the article

  • What's the best way to write a maintainable web scraping app?

    - by Benj
    I wrote a Perl script a while ago which logged into my online banking and emailed me my balance and a mini-statement every day. I found it very useful for keeping track of my finances. The only problem is that I wrote it using just Perl and curl, and it was quite complicated and hard to maintain. After a few instances of my bank changing their webpage, I got fed up with debugging it to keep it up to date. So what's the best way of writing such a program so that it's easy to maintain? I'd like to write a nice, well-engineered version in either Perl or Java which will be easy to update when the bank inevitably fiddles with its website.

    Read the article

  • penetration of web 2.0 features amongst users?

    - by user151841
    I have a survey web app that is public facing. I want to set up automated testing with Selenium, but Selenium can't capture the JavaScript alerts we're currently using on the site. I'm thinking about changing our user-facing error notifications to some Web 2.0 JavaScript library so that they are accessible to Selenium. However, I'm not sure how many of our users would be able to experience them properly. How backwards-compatible do I need to be in the present day? I have collected a database of our users' actual user-agent strings. I asked here how I could group them into meaningful data about which browsers our users are actually using.

    Read the article

  • Best ASP.NET Background Service Implementation

    - by Jason N. Gaylord
    What's the best implementation for more than one background service in an ASP.NET application?

    Timer callback:

        Timer timer = new Timer(new TimerCallback(MyWorkCallback), HttpContext, 5000, 5000);

    Thread or ThreadPool:

        Thread thread = new Thread(Work);
        thread.IsBackground = true;
        thread.Start();

    BackgroundWorker:

        BackgroundWorker worker = new BackgroundWorker();
        worker.DoWork += new DoWorkEventHandler(DoMyWork);
        worker.RunWorkerCompleted += new RunWorkerCompletedEventHandler(DoMyWork_Completed);
        worker.RunWorkerAsync();

    Caching, like http://www.codeproject.com/KB/aspnet/ASPNETService.aspx (located in Jeff Atwood's post here).

    I need to run multiple background "services" at a given time. One service may run every 5 minutes while another may run once a day. There will never be more than 10 services running at a time.

    Read the article

  • How do I get the default gateway in Linux given the destination?

    - by Suezy
    Good day! I'm trying to get the default gateway, using the destination 0.0.0.0. I used this command:

        netstat -rn | grep 0.0.0.0

    and it returns this list:

        Destination     Gateway         Genmask         Flags  MSS Window  irtt Iface
        10.9.9.17       0.0.0.0         255.255.255.255 UH       0 0          0 tun0
        133.88.0.0      0.0.0.0         255.255.0.0     U        0 0          0 eth0
        0.0.0.0         133.88.31.70    0.0.0.0         UG       0 0          0 eth0

    My goal here is to ping the default gateway for destination 0.0.0.0, which is "133.88.31.70", but this command returns the whole list because of the grep. Question is: how do I get only the default gateway? I will need it in my bash script to identify whether the network connection is up or not. Any answers will be much appreciated. =)

    Read the article

  • Can't register credit card with Microsoft Windows Azure

    - by user1083268
    I get the following error when trying to sign up for the Microsoft Azure 90-day free trial: We can't authorize the payment method. Please make sure the information is correct, or use another payment method. If you continue to get this message, please contact your financial institution. I've tried three different cards, two credit and one debit. Those cards are issued from two different banks. I've also tried the cards on two separate accounts. Someone from my work also confirmed that he could not sign up for the free trial either. Has anyone else had this problem? I haven't really seen much help searching Google and the support staff doesn't seem interested in helping people sign up for free accounts.

    Read the article

  • making sure "expiration_date - X" falls on a valid "date_of_price" (if not, use the next valid date_of_price)

    - by bobbyh
    I have two tables. The first table has two columns: ID and date_of_price. The date_of_price field skips weekend days and bank holidays when markets are closed.

        table: trading_dates

        ID  date_of_price
        1   8/7/2008
        2   8/8/2008
        3   8/11/2008
        4   8/12/2008

    The second table also has two columns: ID and expiration_date. The expiration_date field is the one day in each month when options expire.

        table: expiration_dates

        ID  expiration_date
        1   9/20/2008
        2   10/18/2008
        3   11/22/2008

    I would like to do a query that subtracts a certain number of days from the expiration dates, with the caveat that the resulting date must be a valid date_of_price. If not, then the result should be the next valid date_of_price. For instance, say we are subtracting 41 days from the expiration_date. 41 days prior to 9/20/2008 is 8/10/2008. Since 8/10/2008 is not a valid date_of_price, we have to skip 8/10/2008. The query should return 8/11/2008, which is the next valid date_of_price. Any advice would be appreciated! :-)
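
    The query itself is database-specific, but the date logic described above can be sketched in a few lines of Python (using the sample dates from the question): subtract the offset from the expiration date, then take the first trading date on or after the result.

        from bisect import bisect_left
        from datetime import date, timedelta

        # Sorted dates_of_price and one expiration_date from the sample data.
        trading_dates = [date(2008, 8, 7), date(2008, 8, 8),
                         date(2008, 8, 11), date(2008, 8, 12)]
        expiration = date(2008, 9, 20)

        # 41 days before 9/20/2008 is 8/10/2008, which is not a trading day...
        target = expiration - timedelta(days=41)

        # ...so bisect to the first date_of_price on or after the target.
        idx = bisect_left(trading_dates, target)
        print(trading_dates[idx])  # 2008-08-11

    In SQL terms this is usually expressed as the minimum date_of_price greater than or equal to expiration_date minus the offset, though the exact syntax depends on the database in use.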

    Read the article

  • After HTTP GET request, the resulting string is cut-off (incomplete)

    - by Jayomat
    Hi all, I'm making an HTTP GET request like this:

        try {
            HttpClient client = new DefaultHttpClient();
            String getURL = "http://busspur02.aseag.de/bs.exe?SID=5FC39&ScreenX=1440&ScreenY=900&CMD=CR&Karten=true&DatumT="+day+"&DatumM="+month+"&DatumJ="+year+"&ZeitH="+hour+"&ZeitM="+min+"&Intervall=60&Suchen=(S)uchen&GT0=Aachen&T0=H&HT0="+start_from+"&GT1=Aachen&T0=H&HT1="+destination+"";
            HttpGet get = new HttpGet(getURL);
            HttpResponse responseGet = client.execute(get);
            HttpEntity resEntityGet = responseGet.getEntity();
            if (resEntityGet != null) {
                // do something with the response
                Log.i("GET RESPONSE", EntityUtils.toString(resEntityGet));
            }
        ........

    It all works well; the only problem is that the output from Log.i is cut off. It's not the complete HTML page. If I make the same request in a browser, I get about three times the output compared to making the request in the emulator with the code above. What's wrong?

    Read the article

  • How to discard time intervals with Time Series / XYPlots using JFreeChart?

    - by Alex Arnon
    Hi All, I am building a set of chart displays, one of which is a month's display of daily trading - that is, one data point per day (closing). Since there is no trading during weekends and holidays, I need to discard these data points. Not only that, but data points should still appear adjacent to each other, regardless of any gaps in time. This can be seen in any such chart, e.g. in the 3-month graph for Nasdaq on Yahoo Finance - see how weekends are skipped. My question is: how should one correctly implement this in JFreeChart? Thanks in advance!

    Read the article

  • Getting users' latest tweets with Django

    - by Hanpan
    I want to create a function which grabs every user's latest tweet from a specific group. So, if a user is in the 'authors' group, I want to grab their latest tweet and then cache the result for the day, so we only do the crazy leg work once.

        def latest_tweets(self):
            g = Group.objects.get(name='author')
            users = []
            for u in g.user_set.all():
                acc = u.get_profile().twitter_account
                users.append('http://twitter.com/statuses/user_timeline/' + acc + '.rss')
            return users

    is where I am so far, but I'm at a complete loss as to how to parse the RSS to get their latest tweet. Can anyone help me out here? If there is a better way to do this, any suggestions are welcome! I'm sure someone will suggest using django-twitter or other such libraries, but I'd like to do this manually if possible. Cheers
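
    One possible way to pull the newest entry out of each of those RSS URLs is the third-party feedparser package (an assumption here, not something the question mentions); a minimal sketch:

        import feedparser  # third-party: pip install feedparser

        def latest_tweet(rss_url):
            """Return the title of the newest entry in an RSS feed, or None."""
            feed = feedparser.parse(rss_url)
            if not feed.entries:
                return None
            # Entries are typically listed newest first in these feeds.
            return feed.entries[0].title

        # e.g. latest_tweet('http://twitter.com/statuses/user_timeline/someuser.rss')

    The per-day caching the question mentions could then wrap this lookup with Django's cache framework, keyed on the account name.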

    Read the article

  • C#: reflection alternative for switch on enum in order to select namespace/class

    - by Am
    Hi, I have an interface named IHarvester. There are 3 implementations of that interface, each under its own namespace: Google, Yahoo, Bing. A HarvesterManager uses the given harvester. It knows the interface and all 3 implementations. I want some way of letting the class user say which harvester it wants to use, and then select that implementation in the code without a switch-case. Can reflection save the day? Here are the code bits:

        // repeat for each harvester
        namespace Harvester.Google
        {
            public abstract class Fetcher : BaseHarvester, IInfoHarvester {...}
        }

        public enum HarvestingSource
        {
            Google,
            Yahoo,
            Bing,
        }

        class HarvesterManager
        {
            public HarvestingSource PreferedSource { get; set; }

            public HarvestSomthing()
            {
                switch (PreferedSource) .... // awful...
            }
        }

    Thanks.

    Read the article

  • MySQL - multiple count statements

    - by darudude
    I'm trying to do a lookup on our demographic table to display some stats. However, since our demographic table is quite big, I want to do it in one query. There are 2 fields that are important: sex and last_login. I want to be able to get the total number of logins for various date ranges (<1 day ago, 1-7 days ago, 7-30 days ago, etc.), grouped by sex. Right now I know how to do it for one date range. For example, less than 1 day ago:

        SELECT sex, count(*) peeps
        FROM player_account_demo
        WHERE last_demo_update > 1275868800
        GROUP BY sex

    Which returns:

        sex      peeps
        UNKNOWN  22
        MALE     43
        FEMALE   86

    However, I'd have to do this once for each range. Is there a way to get all 3 ranges in there? I'd want my end result to look something like this:

        sex  peeps<1day  peeps1-7days  peeps7-30days

    Thanks!
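
    A small Python sketch of the desired result shape, just to pin down the bucketing logic (the rows and the "now" timestamp are hypothetical; in MySQL the same effect is usually achieved with conditional aggregation, i.e. one conditional SUM per date range in a single GROUP BY sex query):

        from collections import defaultdict
        import time

        DAY = 86400  # seconds
        now = int(time.time())

        # Hypothetical (sex, last_login) rows from player_account_demo.
        rows = [("MALE", now - 3600), ("FEMALE", now - 3 * DAY),
                ("FEMALE", now - 10 * DAY), ("UNKNOWN", now - 12 * 3600)]

        counts = defaultdict(lambda: [0, 0, 0])  # [<1 day, 1-7 days, 7-30 days]
        for sex, last_login in rows:
            age = now - last_login
            if age < DAY:
                counts[sex][0] += 1
            elif age < 7 * DAY:
                counts[sex][1] += 1
            elif age < 30 * DAY:
                counts[sex][2] += 1

        for sex, (d1, d7, d30) in sorted(counts.items()):
            print(sex, d1, d7, d30)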

    Read the article

  • Batch backup a hard drive without modifying access times in C#

    - by johnathan-doena
    I'm trying to write a simple program that will back up my flash drive. I want it to work automatically and silently in the background, and I also want it to be as quick as possible. The thing is, resetting all the access times is useless to me and something I want to avoid. I know I can read the access times and set them back, but I bet it will fail one day in the future. It would be much simpler to read the files without ever changing them. Also, what is the fastest way to do this? What differences would there be between, say, a flash drive and an external hard drive? I am writing this in C#, as it is the simplest way to do it and it will probably last through more generations of Windows.

    Read the article

  • How can I implement a jQuery AJAX form which requests information from a web API via a PHP request?

    - by Jacob Schweitzer
    I'm trying to implement a simple API request to the SEOmoz Linkscape API. It works if I only use the PHP file, but I want to do an AJAX request using jQuery to make it faster and more user friendly. Here is the JavaScript code I'm trying to use:

        $(document).ready(function() {
            $('#submit').click(function() {
                var url = $('#url').val();
                $.ajax({
                    type: "POST",
                    url: "api_sample.php",
                    data: url,
                    cache: false,
                    success: function(html) {
                        $("#results").append(html);
                    }
                });
            });
        });

    And here is the part of the PHP file where it takes the value:

        $objectURL = $_POST['url'];

    I've been working on it all day and can't seem to find the answer. I know the PHP code works, as it returns a valid JSON object and displays correctly when I do it that way. I just can't get the AJAX call to show anything at all!

    Read the article

  • Windows Workflow runs very slowly on my dev machine

    - by Joon
    I am developing an app that uses WF, hosted in IIS as WCF services, as a business layer. This runs quickly on any machine running Windows Server 2008 R2, but very slowly on our dev machines running Windows XP SP3. Yesterday, the workflows were as fast on my dev machine as they are on the server for the whole day. Today, they are back to running slowly again (I rebooted overnight). Has anyone else experienced this problem with workflows running slowly on IIS in XP? What did you do to fix it?

    Read the article

  • Does a SELECT happen all at once, or progressively?

    - by AmbroseChapel
    I have a process which finds a list of files to be deleted using a SELECT with WHERE delete = 'Y'. I set this process running the other day, but it takes a while because it actually does the file deletions too. And in the middle of its long operation, I was using the application and deleted one more file. At that point I realised I didn't know whether that file would be deleted, because I didn't know if the SELECT would have found all the files at the start, or if it was finding them progressively and would get to my newly-deleted file eventually.

    Read the article

  • Adding a relative week number column to MySQL results

    - by Anthony
    I have a table with 3 columns: user, value, and date. The main query returns the values for a specific user based on a date range:

        SELECT date, value FROM values
        WHERE user = '$user' AND date BETWEEN $start AND $end

    What I would like is for the results to also have a column indicating the week number relative to the date range. So if the date range is 1/1/2010 - 1/20/2010, then any results from the first Sun - Sat of that range are week 1, the next Sun - Sat are week 2, etc. If the date range starts on a Saturday, then only results from that one day would be week 1. If the date range starts on Thursday but the first result is on the following Monday, it would be week 2, and there are no week 1 results. Is this something fairly simple to add to the query? The only ideas I can come up with would be based on the week number for the year or the week number based on the results themselves (where in that second example above, the first result always gets week 1).
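
    The SQL for this depends on which MySQL date functions are used, but the week-numbering rule described above (weeks run Sunday through Saturday, numbered from the Sunday on or before the start of the range) can be sketched in Python to make the intended computed column concrete:

        from datetime import date, timedelta

        def relative_week(start, d):
            """Week number of d within a range starting at start, where week 1
            begins on the Sunday on or before start."""
            # Python's weekday(): Monday=0 ... Sunday=6.
            days_since_sunday = (start.weekday() + 1) % 7
            first_sunday = start - timedelta(days=days_since_sunday)
            return (d - first_sunday).days // 7 + 1

        # Range 1/1/2010 - 1/20/2010: 1/1/2010 is a Friday, so week 1 runs
        # from Sunday 12/27/2009 through Saturday 1/2/2010.
        print(relative_week(date(2010, 1, 1), date(2010, 1, 2)))  # 1
        print(relative_week(date(2010, 1, 1), date(2010, 1, 4)))  # 2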

    Read the article

  • Multiple RadUpload Control in One Page

    - by user159771
    I have an aspx page that uses a master page. In the aspx page, I load a user control containing 2 RadUpload controls (Rad1 and Rad2). The user can choose to upload the file using either the first RadUpload or the second RadUpload, and certain validation is applied to each RadUpload. The weird thing is that when I upload a file using Rad2 (the second RadUpload), RadUpload.UploadedFiles for the first RadUpload is populated (count = 1). Instead of the file being registered as uploaded by Rad2, it is detected as if it were uploaded from Rad1, so my validation fails. Has anyone encountered this problem before? This is a very weird thing and I've spent almost a day and a half fixing this without knowing what happened to the control.

    Read the article

  • Loop through multiple tables to execute same query

    - by pcvnes
    Hi, I have a database in which a table is created per day to log process instances. The tables are named MESSAGE_LOG_YYYYMMDD. Currently I want to sequentially execute the same query against all those tables. I wrote the PL/SQL below, but got stuck at line 10. How can I successfully execute the SQL statement against all the tables here?

        DECLARE
            CURSOR all_tables IS
                SELECT table_name
                FROM all_tables
                WHERE TABLE_NAME like 'MESSAGE_LOG_2%'
                ORDER BY TABLE_NAME;
        BEGIN
            FOR msglog IN all_tables
            LOOP
                SELECT count(*) FROM TABLE msglog.TABLE_NAME;
            END LOOP;
        END;
        /

    Cheers, Peter

    Read the article

  • Cookies ignored on Apache/Magento installation

    - by Laizer
    I'm running a Magento-based web store on Linux/Apache. So that user logins are maintained, I've set my cookie lifetime to be close to two years. The cookies are sent out with the right times; I can see them in my browsers. When I visit the site from a previously logged-in browser after about a day, the user is logged out. I can still see the cookies, with their extended life, present in the browser. Where should I start looking to get to the bottom of this issue?

    Read the article

  • Interesting Row_Number() bug

    - by Joel Coehoorn
    I was playing with the Stack Exchange Data Explorer and ran this query: http://odata.stackexchange.com/stackoverflow/q/2828/rising-stars-top-50-users-ordered-on-rep-per-day Notice down in the results, rows 11 and 12 have the same value and so are mis-numbered, even though the row_number() function takes the same order by parameter as the query. I know the correct fix here is to specify an additional tie-breaker column in the order by clauses, but I'm more curious as to why/how the row_number() function returned different results on the same data? If it makes a difference anywhere, this runs on Azure.

    Read the article

  • Is recursion preferred compared to iteration in the multicore era?

    - by prM
    Or, to put it another way, do multicore CPUs process recursion faster than iteration? Or does it simply depend on how a given language runs on the machine? For example, C executes function calls at a large cost compared to doing simple iterations. I had this question because one day I told one of my friends that recursion isn't any amazing magic that can speed up programs, and he told me that with multicore CPUs recursion can be faster than iteration. EDIT: If we consider the situation recursion is best suited to (data structures, function calls), is it even possible for recursion to be faster?

    Read the article
