Search Results

Search found 34962 results on 1399 pages for 'drop database'.

Page 303/1399 | < Previous Page | 299 300 301 302 303 304 305 306 307 308 309 310  | Next Page >

  • Methods to achieve first, previous, next and last navigation

    - by Koroviev
    I'm new to programming and have a question about navigation. What are the various methods that can achieve "First", "Previous", "Next" and "Last" navigation, as seen on article- or comic-based sites? (Admittedly the "First" link isn't confusing, since it can stay static, but what about the others?) For example, on the 50th page the links would lead to the 49th and 51st pages (if the 51st exists; if not, the link would do nothing until such a page is created). In most examples of this I see URLs ending with something like ".php?id=50", but I'm not certain how it's achieved. Is it done with a database? Any help will be very much appreciated, thanks.
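    A minimal sketch of the usual approach, assuming pages are stored with sequential integer ids (the ".php?id=50" pattern) and that the handler simply clamps the links to the known range; the names here are hypothetical:

        # Hypothetical helper: given the current page id and the highest id in
        # the database, compute the first/previous/next/last link targets.
        def navigation_links(current_id, max_id, url="/comic.php?id={}"):
            return {
                "first": url.format(1),
                "last": url.format(max_id),
                "prev": url.format(current_id - 1) if current_id > 1 else None,
                "next": url.format(current_id + 1) if current_id < max_id else None,
            }  # None means "render the link as inactive"

        # e.g. on page 50 of 120: prev -> id=49, next -> id=51
        print(navigation_links(50, 120))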

    Read the article

  • algorithm advice for finding maximum items within a time period

    - by darren
    Hi everyone. I have a database schema that is similar to the following:

        | User | Event       | Date       |
        |------|-------------|------------|
        | 111  | Walked dog  | 2009-10-1  |
        | 222  | Walked dog  | 2009-10-2  |
        | 333  | Fed Fish    | 2009-10-5  |
        | 222  | Did Laundry | 2009-10-6  |
        | 111  | Fed Fish    | 2009-10-7  |
        | 111  | Walked dog  | 2009-10-18 |
        | 222  | Walked dog  | 2009-10-19 |
        | 111  | Fed Fish    | 2009-10-21 |

    I would like to produce a query that returns the maximum number of times a user performs some action within a time period. For example, given a time period of 5 days, what is the maximum number of times user 111 walked the dog? The most obvious solution would be to start at some zero point and move forward each day, summing up 5-day periods along the way, then taking the maximum total out of all the 5-day windows. That approach seems incredibly costly, however. I would appreciate any suggestions you may have.
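    A sketch of the sliding-window idea in application code rather than SQL: because the dates are sorted, two indices that each move forward only once give the answer in linear time instead of re-summing every window (data taken from the table above, pulled into Python for illustration):

        from datetime import date, timedelta

        # Dates on which user 111 walked the dog or fed the fish, sorted ascending.
        events = [date(2009, 10, 1), date(2009, 10, 7),
                  date(2009, 10, 18), date(2009, 10, 21)]

        def max_in_window(dates, days=5):
            """Maximum number of events falling inside any `days`-long window."""
            best, start = 0, 0
            window = timedelta(days=days)
            for end in range(len(dates)):
                # Slide the left edge forward until the window fits.
                while dates[end] - dates[start] >= window:
                    start += 1
                best = max(best, end - start + 1)
            return best

        print(max_in_window(events))  # 2: Oct 18 and Oct 21 share one 5-day window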

    Read the article

  • Joining links together in a dictionary

    - by ptabatt
    Hi guys, I'm a student here, new to Python and programming in general. I have a dictionary, links, which maps tuples to numbers. How can I turn the relative URL in the second tuple into a complete one with the urljoin() function? What I'm trying to do is get complete links so I can run a recursive function, search(), which takes a complete URL as an argument, finds all the links in each URL and stores the number of links mapped to the links in a database. So far, I have:

        links = {('href', 'http://reed.cs.depaul.edu/lperkovic/csc242/test2.html'): 1,
                 ('href', 'test3.html'): 1}

    I want http://reed.cs.depaul.edu/lperkovic/csc242/test3.html...
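    A minimal sketch of the urljoin() call itself (standard library; shown with the Python 3 import path, it lives in urlparse on Python 2):

        from urllib.parse import urljoin  # Python 2: from urlparse import urljoin

        base = 'http://reed.cs.depaul.edu/lperkovic/csc242/test2.html'
        print(urljoin(base, 'test3.html'))
        # -> http://reed.cs.depaul.edu/lperkovic/csc242/test3.html

        # Applied to the dictionary from the question: rewrite relative hrefs
        # against the page they were found on (base is assumed known).
        links = {('href', base): 1, ('href', 'test3.html'): 1}
        complete = {(attr, urljoin(base, url)): n for (attr, url), n in links.items()}
        print(complete)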

    Read the article

  • Search SQL Question Between Two Related Tables

    - by mTuran
    Hi, I am writing a search engine of sorts for my web application and I have a problem. I have two tables; the first is the projects table:

        PROJECTS
            id                   int(11)       NO  PRI  NULL  auto_increment
            employer_id          int(11)       NO  MUL  NULL
            project_title        varchar(100)  NO  MUL  NULL
            project_description  text          NO       NULL
            project_budget       int(11)       NO       NULL
            project_allowedtime  int(11)       NO       NULL
            project_deadline     datetime      NO       NULL
            total_bids           int(11)       NO       NULL
            average_bid          int(11)       NO       NULL
            created              datetime      NO  MUL  NULL
            active               tinyint(1)    NO  MUL  NULL

        PROJECTS_SKILLS
            project_id           int(11)       NO  MUL  NULL
            skill_id             int(11)       NO  MUL  NULL

    For example, I want to ask the database for a query where:

    1) the skills are 5 and 7;
    2) results are ordered by created;
    3) the project title contains the word "php";
    4) the returned rows contain the projects.* columns;
    5) projects are distinct (I don't want the same project returned twice).

    Please write an SQL query that ensures these conditions. Thank you.
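    A runnable sketch of one way to express that query, using SQLite in Python as a stand-in for MySQL (table and column names come from the question; "skills are 5 and 7" is assumed to mean a project matches if it has either skill):

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE projects (id INTEGER PRIMARY KEY, project_title TEXT, created TEXT);
            CREATE TABLE projects_skills (project_id INTEGER, skill_id INTEGER);
            INSERT INTO projects VALUES (1, 'php coder needed', '2010-05-01'),
                                        (2, 'logo design',      '2010-05-02');
            INSERT INTO projects_skills VALUES (1, 5), (1, 7), (2, 9);
        """)

        sql = """
            SELECT DISTINCT p.*
            FROM projects AS p
            JOIN projects_skills AS ps ON ps.project_id = p.id
            WHERE ps.skill_id IN (5, 7)
              AND p.project_title LIKE '%' || ? || '%'  -- CONCAT('%', %s, '%') in MySQL
            ORDER BY p.created DESC
        """
        for row in conn.execute(sql, ("php",)):
            print(row)  # only project 1 comes back, and only once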

    Read the article

  • Morfik - suitability for medium-scale web enterprise applications

    - by MaikB
    I'm investigating technologies with which to develop a medium-scale (up to 100 or 200 simultaneous users) database-driven web application, and someone suggested Morfik. However, outside of the Morfik company I can find practically zero community support - no active blogs, no tutorials, no videos, no books - and this is of some concern (especially when compared to the support for C# / ASP.NET / nHibernate etc.). Deciding between Morfik (untried and not used widely, AFAIK) and the other technologies I mentioned (tried, tested, used widely) is becoming a critical issue for my company. Has anyone had success using Morfik in this kind of circumstance? What kind of performance did you achieve?

    Read the article

  • Correct way to safely store token/secret/etc from OAuth?

    - by viatropos
    I just started looking into OAuth and it looks really nice. I have OAuth with Twitter working in Ruby right now. Now I'm wondering: what is the recommended, safe way to store the responses in my local database and session? What should I store? Where should I store it? The example twitter-oauth-with-rails app stores a user.id in the session, and the user table holds the token and secret. But that seems like it'd be really easy to hack and get the secret by just passing in a slew of test user ids, no?
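    One commonly suggested precaution (an assumption here, not something required by Twitter or OAuth) is to encrypt the token and secret before they ever reach the users table, so a leaked row or a guessed user id doesn't expose usable credentials. A minimal sketch with the cryptography package; the column and key names are hypothetical:

        from cryptography.fernet import Fernet

        # In practice the key lives outside the database (env var or key store),
        # never in the same table as the encrypted values.
        app_key = Fernet.generate_key()
        fernet = Fernet(app_key)

        def encrypt_credentials(token, secret):
            """Values to write into the (hypothetical) oauth_token / oauth_secret columns."""
            return fernet.encrypt(token.encode()), fernet.encrypt(secret.encode())

        def decrypt_credentials(token_blob, secret_blob):
            return fernet.decrypt(token_blob).decode(), fernet.decrypt(secret_blob).decode()

        enc_token, enc_secret = encrypt_credentials("twitter-token", "twitter-secret")
        print(decrypt_credentials(enc_token, enc_secret))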

    Read the article

  • Rails 3: Validate combined values

    - by Cimm
    In Rails 2.x you can use validations to make sure you have a unique combined value, like this:

        validates_uniqueness_of :husband, :scope => :wife

    In the corresponding migration it could look like this:

        add_index :family, [:husband, :wife], :unique => true

    This would make sure the husband/wife combination is unique in the database. Now, in Rails 3 the validation syntax has changed and the scope attribute seems to be gone. It now looks like:

        validates :husband, :presence => true

    Any idea how I can achieve the combined validation in Rails 3? The Rails 2.x validations still work in Rails 3, so I can still use the first example, but it looks so "old". Are there better ways?

    Read the article

  • Testing a (big) collection retrieved from a db

    - by Bas
    I'm currently doing integration testing against a live database and I have the following statement:

        var date = DateTime.Parse("01-01-2010 20:30:00");
        var result = datacontext.Repository<IObject>().Where(r => r.DateTime > date).First();
        Assert.IsFalse(result.Finished);

    I need to test that the results retrieved by the statement, where the given date is less than the date of the object, have Finished set to False. I do not know how many results I get back, and currently I'm taking the first object of the list and checking whether that object has Finished set to false. I know testing only the first item of the list is not valid testing. As a solution I could iterate through the list and check Finished on every item, but putting logic in a test rather goes against the concept of writing 'good' tests. So my question is: does anyone have a good solution for how to properly test the results of this list?
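    The idea usually suggested for this situation is to assert the property over the whole result set rather than over one element, which is a single declarative check, not test logic. A sketch of that shape in Python (the C#/LINQ equivalent would be an All() assertion); the names mirror the question and are hypothetical:

        from datetime import datetime

        def fetch_after(repository, date):
            """Stand-in for the repository query in the question."""
            return [r for r in repository if r.date_time > date]

        def test_everything_after_cutoff_is_unfinished(repository):
            cutoff = datetime(2010, 1, 1, 20, 30)
            results = fetch_after(repository, cutoff)
            assert results, "the query should return at least one row"
            # One assertion over the entire collection, no branching logic:
            assert all(not r.finished for r in results)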

    Read the article

  • Silverlight Isolated Storage and loading big files

    - by Thomas Joulin
    In a Windows Phone 7 application, I would like to query a big XML file (a list of cities) stored using isolated storage. If I do it that way, will the whole file (about 5 MB) be loaded into memory? If so, what other solution do I have? Edit: more details. I want to use AutoCompleteBox (http://www.jeff.wilcox.name/2008/10/introducing-autocompletebox/), but instead of using a web service (this is fixed data, no need to be online) I want to query a file/database/isolated storage... I have a fixed list of cities. I said in the comments it's 40k, but it now seems closer to 1k rows.

    Read the article

  • PHP and MS Access: Number of Records returned by SELECT query

    - by VarunGupta
    I am running the following PHP code to interact with an MS Access database:

        $odbc_con = new COM("ADODB.Connection");
        $constr = "DRIVER={Microsoft Access Driver (*.mdb)}; DBQ=" . $db_path . ";";
        $odbc_con -> open($constr);
        $rs_select = $odbc_con -> execute ("SELECT * FROM Main");

    Using ($rs_select -> RecordCount) gives -1, though the query is returning a non-zero number of records. (a) What can be the reason? (b) Is there any way out? I have also tried using count($rs_select -> GetRows()). This satisfies the need but looks inefficient, as it involves copying all of the records into an array first.

    Read the article

  • Copying data from STDOUT to a remote machine using SFTP

    - by freddie
    In order to back up large database partitions to a remote machine using SFTP, I'd like to use the database's dump command and send its output directly over SFTP to a remote location. This is useful when dumping large data sets and you don't have enough local disk space to create the backup file first and then copy it to the remote location. I've tried using Python + paramiko, which provides this functionality, but the performance is much worse than using the native OpenSSH sftp binary to transfer files. Does anyone have any idea how to do this, either with the native sftp client on Linux or with some library like paramiko (but one that performs close to the native sftp client)?
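    A sketch of the streaming approach with paramiko, writing the dump's stdout straight to a remote file; SFTPFile.set_pipelined() is the main knob that narrows the gap with the native client. Host, credentials, database name and paths are hypothetical:

        import subprocess
        import paramiko

        transport = paramiko.Transport(("backup.example.com", 22))
        transport.connect(username="backup", password="secret")
        sftp = paramiko.SFTPClient.from_transport(transport)

        dump = subprocess.Popen(["mysqldump", "mydb"], stdout=subprocess.PIPE)
        with sftp.open("/backups/mydb.sql", "wb") as remote:
            remote.set_pipelined(True)            # don't wait for an ACK per write
            while True:
                chunk = dump.stdout.read(1 << 20)  # 1 MiB at a time
                if not chunk:
                    break
                remote.write(chunk)
        dump.wait()
        sftp.close()
        transport.close()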

    Read the article

  • PHP or C# script to parse CSV table values to fill in one-to-many table

    - by Yaaqov
    I'm looking for an example of how to split out comma-delimited data in a field of one table, and fill in a second table with those individual elements, in order to make a one-to-many relational database schema. This is probably really simple, but let me give an example. I'll start with everything in one table, Widgets, which has a "states" field to contain states that have that widget:

        Table: WIDGET
        ===============================
        | id | unit | states          |
        ===============================
        | 1  | abc  | AL,AK,CA        |
        -------------------------------
        | 2  | lmn  | VA,NC,SC,GA,FL  |
        -------------------------------
        | 3  | xyz  | KY              |
        ===============================

    Now, what I'd like to create via code is a second table to be joined to WIDGET, called Widget_ST, that has widget id, widget state id, and widget state name fields, for example:

        Table: WIDGET_ST
        ==============================
        | w_id | w_st_id | w_st_name |
        ------------------------------
        | 1    | 1       | AL        |
        | 1    | 2       | AK        |
        | 1    | 3       | CA        |
        | 2    | 1       | VA        |
        | 2    | 2       | NC        |
        | 2    | 1       | SC        |
        | 2    | 2       | GA        |
        | 2    | 1       | FL        |
        | 3    | 1       | KY        |
        ==============================

    I am learning C# and PHP, so responses in either language would be great. Thanks.
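    The question asks for C# or PHP; purely as a sketch of the approach itself, here is the split-and-insert step in Python against an in-memory SQLite copy of the WIDGET table (per-widget state ids are numbered 1..n here, which is one reading of the example):

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE widget (id INTEGER PRIMARY KEY, unit TEXT, states TEXT);
            CREATE TABLE widget_st (w_id INTEGER, w_st_id INTEGER, w_st_name TEXT);
            INSERT INTO widget VALUES (1, 'abc', 'AL,AK,CA'),
                                      (2, 'lmn', 'VA,NC,SC,GA,FL'),
                                      (3, 'xyz', 'KY');
        """)

        # Split each comma-delimited states value into one WIDGET_ST row per state.
        rows = conn.execute("SELECT id, states FROM widget").fetchall()
        for w_id, states in rows:
            for n, name in enumerate(states.split(","), start=1):
                conn.execute("INSERT INTO widget_st VALUES (?, ?, ?)",
                             (w_id, n, name.strip()))
        conn.commit()

        for row in conn.execute("SELECT * FROM widget_st ORDER BY w_id, w_st_id"):
            print(row)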

    Read the article

  • What programming language is used to design google algorithm?

    - by AKN
    It is known that Google has the best searching and indexing algorithms. They also have good relevancy, and they are quick at pulling in the latest results. All that's fine. What programming language (C, C++, Java, etc.) and database (Oracle, MySQL, etc.) have they used to achieve this, given that they have to manipulate large volumes of data quickly and effectively? Though I'm not looking for their in-depth architecture (in case that violates their company policies), an overview of all such things would be useful. Anybody, please add your valuable suggestions and insight on this.

    Read the article

  • Data not displayed first time in android after copying the file from assets to data folder

    - by Thinkcomplete
    Description: I have used the code tip from http://www.reigndesign.com/blog/using-your-own-sqlite-database-in-android-applications/ to copy a pre-filled data file to the target, and handled this in an AsyncTask. Problem: on first start the application throws an error and shuts down; starting it again, without any change, it works perfectly fine. So the error only appears the first time, right after the file is copied; after that there are no issues. Please help. Code attached:

        private class CopyDatabase extends AsyncTask<String, Void, Boolean> {
            private final ProgressDialog dialog = new ProgressDialog(BabyNames.this);

            protected void onPreExecute() {
                this.dialog.setMessage("Loading...");
                this.dialog.show();
            }

            @Override
            protected Boolean doInBackground(String... params) {
                // TODO Auto-generated method stub
                try {
                    namesDBSQLHelper.createDatabase();
                    return null;
                } catch (IOException ioe) {
                    ioe.printStackTrace();
                }
                return null;
            }

            protected void onPostExecute(final Boolean success) {
                if (this.dialog.isShowing()) {
                    this.dialog.dismiss();
                }
            }
        }

    Read the article

  • search engine (solr/sphinx) question

    - by noname
    I want to make my threads' content searchable with a full-text search engine like Solr, but I wonder one thing: should I index just thread.title, thread.body and post.body, or should I also index the username, created date, number of posts, views, country, region and city that belong to the thread? I mean, when a user searches for a thread he will get hits back containing the thread title, two lines of the body, which user posted it, the creation date, tags, and so on. Should I index all of that information too? But then it would be pretty much the whole database. Or should I just index the first three columns I mentioned for full-text search? And another question: when a user posts a new thread, do I have to immediately tell Solr to add that row? If I don't, how would it be searchable?
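    A sketch of the second part (pushing a thread into Solr as soon as it is posted), using the pysolr client as one possible choice; the core URL and field names are hypothetical, and only a handful of fields are sent rather than the whole row:

        import pysolr

        solr = pysolr.Solr("http://localhost:8983/solr/threads")

        def on_thread_created(thread):
            """Call this right after the thread row is written to the database."""
            solr.add([{
                "id": str(thread["id"]),
                "title": thread["title"],
                "body": thread["body"],
                "username": thread["username"],        # stored for display, not free text
                "created": thread["created"].isoformat(),
            }])
            solr.commit()  # or rely on Solr's autoCommit / commitWithin instead

        # on_thread_created({"id": 42, "title": "...", "body": "...",
        #                    "username": "noname", "created": datetime.utcnow()})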

    Read the article

  • Should I use integer primary IDs?

    - by arthurprs
    For example, I always generate an auto-increment field for the users table, but I also specify a UNIQUE index on the usernames. There are situations where I first need to get the userId for a given username and then execute the desired query, or use a JOIN in the desired query. It's two trips to the database or a JOIN vs. a varchar index. Should I use integer primary IDs? Is there a real performance benefit of INT over small VARCHAR indexes?
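    For reference, the surrogate-key pattern described above can stay at one round trip by folding the username lookup into the query itself. A small sketch (SQLite syntax; the posts table is hypothetical):

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE users (
                id       INTEGER PRIMARY KEY,         -- surrogate integer key
                username VARCHAR(30) NOT NULL UNIQUE  -- natural key, still indexed
            );
            CREATE TABLE posts (
                id      INTEGER PRIMARY KEY,
                user_id INTEGER NOT NULL REFERENCES users(id),
                body    TEXT
            );
            INSERT INTO users VALUES (1, 'arthurprs');
            INSERT INTO posts VALUES (1, 1, 'hello');
        """)

        # One trip: resolve the username and fetch the rows in the same statement.
        rows = conn.execute("""
            SELECT p.*
            FROM posts AS p
            JOIN users AS u ON u.id = p.user_id
            WHERE u.username = ?
        """, ("arthurprs",)).fetchall()
        print(rows)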

    Read the article

  • Connecting to hosted MySQL server with Java

    - by Infiniti Fizz
    Hi, I've recently been trying to connect to a hosted MySQL server using Java but can't get it to work. I can connect to a local MySQL server via localhost using:

        connect = DriverManager.getConnection("jdbc:mysql://localhost/lego?" + "user=******&password=*******");

    (replacing the asterisks with my username and password). I can connect to the hosted MySQL database fine with PHP using:

        mysql_connect('mysql.hosts.co.uk','******','**********');
        mysql_select_db('test');

    My problem is, I cannot connect via Java. I catch an Exception when the connection doesn't work, and it is always printed out. Any ideas why it isn't working? Am I doing something wrong? Thanks for your time, InfinitiFizz

    Read the article

  • Invoicing vs Quoting or Estimating

    - by FreshCode
    If invoices can be voided, should they be used as quotations? I have an Invoices table that is created from inventory associated with a Job or Order. I could have a Quotes table as a halfway house between inventory and invoices, but it feels like I would have duplicate data structures and logic just to handle an "Is this a quote?" bit. From a business perspective, quotes are different from invoices: a quote is sent prior to an undertaking and an invoice is sent once it is complete and payment is due. But how should I represent this in my repository and model? What is an elegant way to store and manage quotes and invoices in a database? Edit: indicated Job === Order for this particular instance.

    Read the article

  • Design review - do you think I'm doing this the right way? First commercial project for me!

    - by Sergio Tapia
    I'm tasked with designing an application that will allow a person to scan a legal document, associate it with a name, and save it to a database. Now, inside the organization there are many departments, and each department can have many sub-departments. The problem is that some larger organizations will have many departments and smaller ones will have only one or two. I've thought about creating a Department table and a SubDepartment table to create the associations, etc. That way it's extensible, and users can dynamically create departments to fit my program to their organizational scheme. Am I approaching this the right way? As I said, this is my first commercial application, so I want to do it right and make a name for myself for delivering things on time and writing good code for others to expand upon.
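    One common alternative worth weighing against the two-table design (offered here as an assumption, not a verdict): a single self-referencing departments table handles arbitrary nesting depth, so an organization with one department and one with dozens use the same schema. A sketch:

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE departments (
                id        INTEGER PRIMARY KEY,
                org_id    INTEGER NOT NULL,
                parent_id INTEGER REFERENCES departments(id),  -- NULL = top level
                name      TEXT NOT NULL
            );
            INSERT INTO departments VALUES (1, 1, NULL, 'Legal'),
                                           (2, 1, 1,    'Contracts'),
                                           (3, 1, 1,    'Litigation');
        """)

        # Sub-departments of 'Legal' (id 1) for organization 1.
        for row in conn.execute(
                "SELECT id, name FROM departments WHERE org_id = ? AND parent_id = ?",
                (1, 1)):
            print(row)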

    Read the article

  • Best way to gather, then import data into drupal?

    - by Frank
    I am building my first database-driven website with Drupal and I have a few questions. I am currently populating a Google Docs spreadsheet with all of the data I want to eventually be able to query from the website (after it's imported). Is this the best way to start? If not, what would you recommend? My plan is to populate the spreadsheet, then import it as a CSV into the MySQL database via CCK nodes. I've seen two ways to do this:

    http://drupal.org/node/133705 (importing data into CCK nodes)
    http://drupal.org/node/237574 (inserting data using a spreadsheet/CSV instead of SQL insert statements)

    Basically, my question is: what is the best way to gather and then import data into Drupal? Thanks in advance for any help or suggestions.

    Read the article

  • Killing the mysqld process

    - by Josh K
    I have a table with ~800k rows. I ran:

        update users set hash = SHA1(CONCAT({about eight fields})) where 1;

    Now I have a hung Sequel Pro process and I'm not sure about the mysqld process. This is really two questions:

    1. What harm can possibly come from killing these programs? I'm working on a separate database, so no damage should come to other databases on the system, right?
    2. Assume you had to update a table like this. What would be a quicker / more reliable way to do the update without writing a separate script?

    I just checked with phpMyAdmin and it appears as though the query is complete. I still have Sequel Pro using 100% of both my cores, though...
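    On the second question, a commonly used approach (an assumption here, not the only answer) is to update in primary-key ranges so each statement is short, commits are small, and an interrupted run is easy to resume. A sketch using the MySQLdb driver; the column names inside CONCAT are hypothetical placeholders for the eight fields:

        import MySQLdb  # any DB-API driver would look much the same

        conn = MySQLdb.connect(host="localhost", user="me", passwd="...", db="mydb")
        cur = conn.cursor()

        cur.execute("SELECT MIN(id), MAX(id) FROM users")
        lo, hi = cur.fetchone()

        BATCH = 10000
        for start in range(lo, hi + 1, BATCH):
            cur.execute(
                "UPDATE users SET hash = SHA1(CONCAT(first_name, last_name)) "
                "WHERE id BETWEEN %s AND %s",
                (start, start + BATCH - 1),
            )
            conn.commit()  # small transactions: progress is visible and resumable
        conn.close()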

    Read the article

  • Timeout LinqToSql inserting millions of records

    - by Bas
    I'm inserting approximately 3 million records into a database using this solution. Eventually, when the application has been inserting records for a while (my last run lasted around 4 hours), it fails with the following SqlException: "Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding." What's the best way to handle this exception? Is there a way to prevent it from happening, or should I catch the exception? Thanks in advance!

    Read the article

  • How can I synchronize database access between a write-thread and a read-thread?

    - by Runcible
    My program has two threads:

    1. A main execution thread that handles user input and queues up database writes
    2. A utility thread that wakes up every second and flushes the writes to the database

    Inside the main thread, I occasionally need to make reads on the database. When this happens, performance is not important, but correctness is. (In a perfect world, I would be reading from a cache, not making a round-trip to the database - but let's put that aside for the sake of discussion.) How do I make sure that the main thread sees a correct / quiescent database? A standard mutex won't work, since I run the risk of having the main thread grab the mutex before the data gets flushed to the database. This would be a big race condition. What I really want is some sort of mutex that lets the main thread of execution proceed only AFTER the mutex has been grabbed and released once. Does such a thing exist? What's the best way to solve this problem?
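    What is being described is essentially a condition variable over the pending-write queue: the reader waits until the flusher has drained everything queued so far. A minimal sketch of that idea in Python (function names and the flush/read callbacks are hypothetical):

        import threading
        import time

        pending = []                      # writes queued by the main thread
        cond = threading.Condition()

        def queue_write(item):            # main thread
            with cond:
                pending.append(item)

        def flusher_loop(flush_to_db):    # utility thread body
            while True:
                time.sleep(1.0)
                with cond:
                    if pending:
                        flush_to_db(list(pending))
                        pending.clear()
                    cond.notify_all()     # tell any waiting reader the queue is drained

        def read_quiescent(read_from_db):  # main thread, when a read is needed
            with cond:
                while pending:             # wait until queued writes have been flushed
                    cond.wait()
                return read_from_db()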

    Read the article

  • Who to contact regarding DBD::Advantage & bugs

    - by WarheadsSE
    I am in search of who specifically to contact at Sybase regarding Advantage Database Server's DBI driver, specifically DBD::Advantage. The only reference I can find is to one 'lancesc' in the README, but there are no references to a contact email, CPAN author, etc. Inadvertently, I happened upon one StackOverflow user lancesc here. Would anyone happen to know who to contact regarding this? I do wish this was on CPAN. I've found a small bug regarding column quoting in the SQL parser that they'd likely prefer to be made aware of. There are also several questions I have for them regarding failing functionality.

    Read the article

  • Revisions: algorithm and data structure

    - by SODA
    Hi, I need ideas for structuring and processing data with revisions. For example, I have a database of objects (e.g. cars). Each object has a number of properties, which can be arbitrary, so there's no set schema to describe these objects. These objects are probably saved as key-value pairs. Now I need to change a property of an object. I don't want to completely rewrite it - I want to be able to go back and see the history of changes to these properties. That's why I want to add the new property value and keep the old one (so I guess a timestamp would do the job of telling which value is the latest). At the same time, I want to be able to get info about any object in a snap, with only the latest version of each of its properties. Any ideas on what would be the best approach? At least please point me in the right direction. Thanks!
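    A minimal in-memory sketch of the append-only property-history idea described above (a hypothetical structure, not tied to any particular database): every write appends a (timestamp, value) pair, and the current snapshot is just the newest value of each property.

        import time
        from collections import defaultdict

        # history[object_id][property_name] = list of (timestamp, value), append-only
        history = defaultdict(lambda: defaultdict(list))

        def set_property(obj_id, name, value):
            history[obj_id][name].append((time.time(), value))

        def latest_snapshot(obj_id):
            """Current view of the object: newest value of every property."""
            return {name: versions[-1][1] for name, versions in history[obj_id].items()}

        def property_history(obj_id, name):
            return list(history[obj_id][name])

        set_property("car-1", "color", "red")
        set_property("car-1", "color", "blue")
        print(latest_snapshot("car-1"))   # {'color': 'blue'}
        print(property_history("car-1", "color"))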

    Read the article
