Search Results

Search found 31499 results on 1260 pages for 'database theory'.

  • Virtual directories as DB queries

    I have a site, e.g. site.com. I would like users to be able to access it in their locale at site.com/somecity. This is similar to Craigslist, but they do it with subdomains, e.g. sfbay.craigslist.org. I'm using the Apache HTTP server and MySQL for the DB. If you can provide a brief explanation and perhaps links to more thorough discussions, I would be quite interested in learning. I'm developing a web app and wonder whether I should spend some time reading up on Apache, or focus more of my time on server-side programming. Thanks!

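    Whichever layer handles the routing (Apache can rewrite /somecity to a single entry script, or the application framework can do it), the per-request work usually comes down to resolving the path segment against a table. A hypothetical sketch; the cities table and its columns are made up for illustration:

        -- Hypothetical table of supported cities/locales:
        CREATE TABLE cities (
            id   INT AUTO_INCREMENT PRIMARY KEY,
            slug VARCHAR(64)  NOT NULL UNIQUE,   -- e.g. 'somecity'
            name VARCHAR(128) NOT NULL
        );

        -- Resolve the /somecity path segment on each request:
        SELECT id, name FROM cities WHERE slug = 'somecity';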

  • SQL Server: how to set job priority

    - by Buzz
    Is there any way to set one job's priority higher than the other's? In my case there are two jobs that work on the same set of tables: JOB-A runs every 12 hours and JOB-B runs every 10 minutes. I think that at the times when they run simultaneously, JOB-B gets into a deadlock and fails. I googled the topic and found suggestions that the SQL Server Resource Governor is helpful in such cases. Does anyone know how to resolve this?

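    Not from the article, but a common workaround when two jobs deadlock over the same tables is to serialize them with an application lock, so the short job waits (or skips a run) while the long one holds the shared resource. A minimal T-SQL sketch; the resource name 'shared-tables' and the timeout are arbitrary:

        -- At the start of each job step:
        DECLARE @rc int;
        EXEC @rc = sp_getapplock
                 @Resource    = 'shared-tables',
                 @LockMode    = 'Exclusive',
                 @LockOwner   = 'Session',
                 @LockTimeout = 60000;          -- wait up to 60 seconds

        IF @rc < 0
            RAISERROR('Could not acquire the shared-tables lock; skipping this run.', 16, 1);
        ELSE
        BEGIN
            -- ... the job's real work against the shared tables ...
            EXEC sp_releaseapplock @Resource = 'shared-tables', @LockOwner = 'Session';
        END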

  • Fastest way to store/retrieve a dictionary - SQL, text file...?

    - by AP257
    Hi all, this is a really, really super dumb question, so I apologise, but I'd be grateful for some advice. I've got a text file of words and word frequencies. It's very large; theoretically we're talking millions of rows. I just want to retrieve values from the file, and do it as quickly and efficiently as possible (for a web app, in Django). My question is: what is the best way to store and retrieve the values? Should I import them into SQL? Or keep the file and use grep? Or put them into a JSON dictionary? Or some other way? Sorry for the dumb question; I would be very grateful for advice!

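    One plausible route, sketched below (the table and column names are made up), is to load the file once into an indexed table so that each lookup is a single indexed read rather than a scan of the whole file:

        -- Hypothetical schema for the word/frequency pairs:
        CREATE TABLE word_frequency (
            word      VARCHAR(100) NOT NULL PRIMARY KEY,
            frequency INTEGER      NOT NULL
        );

        -- A single-word lookup then hits the primary-key index directly:
        SELECT frequency
        FROM   word_frequency
        WHERE  word = 'example';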

  • Fluent nHibernate - How to map a non-key column on a junction table?

    - by The Matt
    Taking an example that is provided on the Fluent nHibernate website, I need to extend it slightly: I need to add a 'Quantity' column to the StoreProduct table. How would I map this using nHibernate? An example mapping is provided for the given scenario above, but I'm not sure how I would get the Quantity column to map to a property on the Product class:

        public class StoreMap : ClassMap<Store>
        {
            public StoreMap()
            {
                Id(x => x.Id);
                Map(x => x.Name);
                HasMany(x => x.Employee)
                    .Inverse()
                    .Cascade.All();
                HasManyToMany(x => x.Products)
                    .Cascade.All()
                    .Table("StoreProduct");
            }
        }

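    For context, the junction table in question would look roughly like the sketch below (the column names are assumptions). Once a junction table carries data of its own, such as Quantity, the usual NHibernate advice is to map StoreProduct as an entity in its own right, with many-to-one references to Store and Product, instead of a HasManyToMany:

        -- Hypothetical shape of the junction table once Quantity is added:
        CREATE TABLE StoreProduct (
            Store_id   BIGINT NOT NULL,
            Product_id BIGINT NOT NULL,
            Quantity   INT    NOT NULL DEFAULT 1,
            PRIMARY KEY (Store_id, Product_id)
        );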

  • Walking through an SQLite Table

    - by galford13x
    I would like to implement or use functionality that allows stepping through a table in SQLite. If I have a table Products that has 100k rows, I would like to retrieve perhaps 10k rows at a time, something similar to how a web page would list data and have < Previous .. Next > links to walk through it. Are there SELECT statements that can make this simple? I have seen and tried using ROWID in conjunction with LIMIT, which seems OK if I'm not ordering the data:

        -- This seems to work if not ordering.
        SELECT * FROM Products WHERE ROWID BETWEEN x AND y;

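    When the pages do need an ordering, a common sketch (assuming an indexed ordering column such as Name) is keyset pagination: remember the last value of the previous page and seek past it, which stays fast even deep into the table:

        -- First page of 10,000 rows, ordered by Name:
        SELECT * FROM Products
        ORDER BY Name
        LIMIT 10000;

        -- Next page: seek past the last Name seen on the previous page.
        SELECT * FROM Products
        WHERE Name > :last_name_on_previous_page
        ORDER BY Name
        LIMIT 10000;

        -- LIMIT/OFFSET also works, but OFFSET re-reads all the skipped rows:
        SELECT * FROM Products ORDER BY Name LIMIT 10000 OFFSET 20000;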

  • Fastest way to do a weighted tag search in SQL Server

    - by Hasan Khan
    My table is as follows:

        ObjectID  bigint
        Tag       nvarchar(50)
        Weight    float
        Type      tinyint

    I want to search for all objects that have the tags 'big' or 'large', and I want the ObjectIDs in order of the sum of their weights (so objects having both tags will be on top):

        select objectid,
               row_number() over (order by sum(weight) desc) as rowid
        from tags
        where tag in ('big', 'large') and type = 0
        group by objectid

    The reason for row_number() is that I want paging over the results. The query in its current form is very slow; it takes a minute to execute over 16 million tags. What should I do to make it faster? I have a non-clustered index on (objectid, tag, type). Any suggestions?

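    Not a guaranteed fix, but one change that often helps with this shape of query is an index whose leading columns match the WHERE clause, so the tag/type filter becomes an index seek instead of a scan of the existing (objectid, tag, type) index; including Weight makes it covering, so no lookups are needed:

        -- Index keyed by the filtered columns, covering the aggregated one:
        CREATE NONCLUSTERED INDEX IX_tags_tag_type
            ON tags (tag, type)
            INCLUDE (objectid, weight);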

  • Synchronizing MySQL databases

    - by Wasim
    Hi all, I have a maintenance problem synchronizing my MySQL databases. These are the databases I have:

    Development DB: here I make my current development changes.
    Staging DB: I need to apply all the changes I made in development to it before use; currently I maintain migration scripts for structure and data.
    Production DB: the production environment; it has to get exactly the same changes as staging.

    My problem is syncing the structure, and some of the data. This is really very hard work to maintain. Are there any techniques or tools for doing this with MySQL? What is replication, is it good for my situation, and how do I use it? Thanks in advance.

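    Replication, in short, is MySQL continuously streaming changes from one server (the source) to replicas that apply them automatically. It keeps a copy in sync going forward, but it is not a schema-migration tool, so for promoting changes from development to staging to production the migration scripts remain the usual approach. A minimal sketch of pointing a replica at a source, with placeholder host, credentials and binlog coordinates (this is the older CHANGE MASTER TO syntax; recent MySQL versions call it CHANGE REPLICATION SOURCE TO):

        -- Run on the replica:
        CHANGE MASTER TO
            MASTER_HOST     = 'source-db.example.com',
            MASTER_USER     = 'repl',
            MASTER_PASSWORD = '********',
            MASTER_LOG_FILE = 'mysql-bin.000001',
            MASTER_LOG_POS  = 4;
        START SLAVE;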

  • How to query range of data in DB2 with highest performance?

    - by Fuangwith S.
    Usually, I need to retrieve data from a table in some range; for example, a separate page for each search result. In MySQL I use the LIMIT keyword, but I don't know the equivalent in DB2. For now I use this query to retrieve a range of data:

        SELECT *
        FROM (
            SELECT SMALLINT(RANK() OVER(ORDER BY NAME DESC)) AS RUNNING_NO,
                   DATA_KEY_VALUE,
                   SHOW_PRIORITY
            FROM EMPLOYEE
            WHERE NAME LIKE 'DEL%'
            ORDER BY NAME DESC
            FETCH FIRST 20 ROWS ONLY
        ) AS TMP
        ORDER BY TMP.RUNNING_NO ASC
        FETCH FIRST 10 ROWS ONLY

    but I know it's bad style. So how should I write the query for the best performance?

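    A common DB2 paging pattern, sketched below (the bounds 11 and 20 stand for "the second page of 10 rows"), is to number the rows once with ROW_NUMBER() and filter on that number directly, instead of nesting two FETCH FIRST clauses:

        -- Rows 11-20 of the result, ordered by NAME descending:
        SELECT NAME, DATA_KEY_VALUE, SHOW_PRIORITY
        FROM (
            SELECT ROW_NUMBER() OVER (ORDER BY NAME DESC) AS RN,
                   NAME, DATA_KEY_VALUE, SHOW_PRIORITY
            FROM EMPLOYEE
            WHERE NAME LIKE 'DEL%'
        ) AS TMP
        WHERE RN BETWEEN 11 AND 20
        ORDER BY RN;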

  • How to find the entity with the greatest primary key?

    - by simpatico
    I've an entity LearningUnit that has an int primary key. Actually, it has nothing more. The entity Concept has the following relationship with it:

        @ManyToOne
        @Size(min=1, max=7)
        private LearningUnit learningUnit;

    In a constructor of Concept I need to retrieve the LearningUnit with the greatest primary key. If no LearningUnit exists yet, I instantiate one. I then set this.learningUnit to the retrieved/instantiated one. Finally, I call the empty constructor of Concept in a try-catch block, to have the entity manager do the cardinality check. I expect an exception to be thrown in the case that another 7 Concepts already refer to the same LearningUnit; in that case, I can instantiate a new LearningUnit with a new, greater primary key. Please also point out, if any, clear pitfalls in my outlined algorithm above.

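    The lookup itself can be a single aggregate query; a sketch below, assuming the default JPA mapping of LearningUnit to a table of the same name with an id column (in JPQL the equivalent is select max(l.id) from LearningUnit l):

        -- Greatest primary key currently in use; 0 if the table is still empty:
        SELECT COALESCE(MAX(id), 0) AS max_id FROM LearningUnit;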

  • Mysql- "FLUSH TABLES WITH READ LOCK" started automatically

    - by ming yeow
    I would like to understand how this happened. I was running a query that would take a long time, but should not lock up any table. However, my DBs were practically down; it seems they were being locked up by FLUSH TABLES WITH READ LOCK:

        03:21:31  select type_id, count(*) from guid_target_infos group by type_id
        02:38:11  select type_id, count(*) from guid_infos group by type_id
        02:24:29  FLUSH TABLES WITH READ LOCK

    But I did not start this command. Can someone tell me why it was started automatically?

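    FLUSH TABLES WITH READ LOCK is not something MySQL issues on its own; it is almost always sent by backup or snapshot tooling (for example mysqldump with --lock-all-tables or --master-data), though which tool is to blame here is only a guess. To see which connection is holding it while it happens, something like:

        -- Show every connection, its user, host and current statement:
        SHOW FULL PROCESSLIST;

        -- The same information, filterable (MySQL 5.1+):
        SELECT id, user, host, time, state, info
        FROM   information_schema.processlist
        ORDER  BY time DESC;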

  • Alter multiple tables' columns length

    - by gdoron
    So, we just found out that 254 tables in our Oracle DBMS have one column named "Foo" with the wrong length: NUMBER(10) instead of NUMBER(3). That Foo column is part of the PK of the tables, and those tables have other tables with foreign keys pointing to it. What I did for one table is:

        1. Backed up the table into a temp table.
        2. Disabled the foreign keys to the table.
        3. Disabled the PK containing the Foo column.
        4. Nulled the Foo column for all the rows.
        5. Restored all of the above.

    But now we found out it's not just a couple of tables but 254. Is there an easy way (or at least an easier one than this) to alter the columns' length? P.S. I have DBA permissions.

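    One way to avoid repeating the manual procedure 254 times, sketched below, is to generate the ALTER statements from the data dictionary; this still assumes the constraints are disabled and the Foo column has been emptied first, since Oracle will not shrink the precision of a populated NUMBER column:

        -- Generate one ALTER per affected table; review the output, then run it.
        SELECT 'ALTER TABLE ' || table_name || ' MODIFY (FOO NUMBER(3));' AS ddl
        FROM   user_tab_columns
        WHERE  column_name = 'FOO'
          AND  data_precision = 10;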

  • Can't create a MySQL query that generates 4 rows for each row in the table it references.

    - by UkraineTrain
    I need to create a MySQL query that generates 4 rows for each row in the table it references. I need some of the information in those rows to repeat and some to be different. In the table, each row stands for one day. I need to break the day up into 6-hour increments, hence the four rows for each entry. I need to create one column which for each day will have the values '12AM', '6AM', '12PM' and '6PM', and another column with the corresponding numeric values calculated for those entries. Thanks a lot in advance; I will really appreciate any help on this.

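    The usual trick, sketched below (the days table, its id column and the slot values are assumptions, since the real schema isn't shown), is to cross join the day table against a fixed four-row derived table, one row per 6-hour slot:

        -- Every day row is multiplied by the four time slots:
        SELECT d.*,
               s.slot_label,
               s.slot_start_hour
        FROM days AS d
        CROSS JOIN (
            SELECT '12AM' AS slot_label,  0 AS slot_start_hour
            UNION ALL SELECT '6AM',  6
            UNION ALL SELECT '12PM', 12
            UNION ALL SELECT '6PM',  18
        ) AS s
        ORDER BY d.id, s.slot_start_hour;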

  • Efficient update of SQLite table with many records

    - by blackrim
    I am trying to use SQLite (sqlite3) for a project to store hundreds of thousands of records (I would like SQLite so users of the program don't have to run a [My]SQL server). I sometimes have to update hundreds of thousands of records to enter left/right values (they are hierarchical), but I have found the standard

        update table set left_value = 4, right_value = 5 where id = 12340;

    to be very slow. I have tried surrounding every thousand or so with

        begin;
        ....
        update table set left_value = 4, right_value = 5 where id = 12340;
        ....
        commit;

    but again, very slow. Odd, because when I populate it with a few hundred thousand rows (with inserts), it finishes in seconds. I am currently testing the speed in Python (the slowness shows up both at the command line and from Python) before I move it to the C++ implementation, but right now this is way too slow and I need to find a new solution unless I am doing something wrong. Thoughts? (I would take an open source alternative to SQLite that is portable as well.)

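    A sketch of the two changes that most often fix this pattern (assuming id is not already the INTEGER PRIMARY KEY, and that some durability can be traded away during the bulk run): make sure the WHERE column is indexed so each UPDATE is a seek instead of a full-table scan, and relax the journal/fsync settings for the duration of the load. The table name my_table is a placeholder:

        -- Without an index, every "UPDATE ... WHERE id = ?" scans the whole table.
        CREATE INDEX IF NOT EXISTS idx_my_table_id ON my_table (id);

        -- Trade durability for speed during the bulk update only:
        PRAGMA synchronous  = OFF;
        PRAGMA journal_mode = MEMORY;

        BEGIN;
        UPDATE my_table SET left_value = 4, right_value = 5 WHERE id = 12340;
        -- ... thousands more updates ...
        COMMIT;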

  • PHP reporting error. DB verify how to?

    - by iamfab
    Error reporting gives:

        Notice: Undefined variable: random_chars in wamp\www\php_sandbox\idgen.php on line 21
        Call Stack:
        #  Time    Memory  Function   Location
        1  0.0045  678928  {main}( )  ..\idgen.php:0

        GPB7446

    How do I fix this error? I'm using this code as an automatic unique ID generator. How do I connect to the DB to verify the code is truly unique before allowing it to be assigned to a user creating a new account? Thanks.

        <?php
        $characters = array(
            "A","B","E","F","G","H","J","K","M","N","P","R","S","T","W","X","Y","Z");
        $keys = array();
        while (count($keys) < 3) {
            $x = mt_rand(0, count($characters) - 1);
            if (!in_array($x, $keys)) {
                $keys[] = $x;
            }
        }
        foreach ($keys as $key) {
            $random_chars .= $characters[$key];
        }
        $randNum = rand(2327, 9987);
        $randLet = rand(2327, 9987);
        echo $random_chars . $randNum;
        ?>

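    The notice itself just means $random_chars is appended to before it has been initialised; declaring $random_chars = ''; before the foreach loop silences it. For the uniqueness check, a sketch of the database side (the users table and account_code column are made up): look the generated code up before assigning it, and back it with a UNIQUE constraint so two accounts created at the same moment cannot end up with the same code:

        -- Enforce uniqueness at the database level:
        ALTER TABLE users ADD CONSTRAINT uq_users_account_code UNIQUE (account_code);

        -- Check whether a freshly generated code is already taken:
        SELECT COUNT(*) FROM users WHERE account_code = 'GPB7446';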

  • Hosting an Access DB

    - by Mitciv
    Hey, so I'm inexperienced in hosting DBs and I've always had the luxury of someone else getting the DB set up. I was going to help a friend out with getting a web page set up; I've got experience in ASP.NET MVC, so I'm going with that. They want to set up a search page to query a DB and display the results. My question is about getting the DB set up and hosted. They currently just have the Access DB on a local computer, and there is basically only one table that would need to be queried for the search. What is the best approach to making this table/DB accessible? They would like to keep the main copy of the DB on the local machine, so copying the entire DB over to the hosted site would be time consuming; could the lone table needed be copied to the host on its own? Should I try to convince them to make changes on the hosted DB and just make copies of that for their local machines? Any suggestions are welcome; again, I'm a total noob when it comes to hosting databases. Thanks.

  • How to undo SQL changes using installer

    - by Sunil Agarwal
    I have an installer that installs procedures, scripts, views, etc. in SQL Server 2005/2008. Now I want to add a condition to the installer: if there is any error while installing, I want to undo all the changes made in SQL Server. I tried storing the procedures, views, etc. that I am changing while installing and reverting them back if I get any error, but I am not able to make it work the way I want. Can someone guide me if they have done the same thing? To be specific, I am using the WiX installer. Also, if someone has tried SMO, that would be of great help.

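    If the whole deployment can run in a single session, a sketch of the usual server-side pattern is to wrap the scripts in one transaction and roll back on any error; most DDL in SQL Server (CREATE PROCEDURE, CREATE VIEW, etc.) takes part in the transaction. The view name below is only an example:

        BEGIN TRY
            BEGIN TRANSACTION;

            -- CREATE PROCEDURE/VIEW must start its own batch, so run each
            -- script through EXEC (or execute the files one by one), e.g.:
            EXEC (N'CREATE VIEW dbo.SampleView AS SELECT 1 AS One');

            COMMIT TRANSACTION;
        END TRY
        BEGIN CATCH
            IF @@TRANCOUNT > 0
                ROLLBACK TRANSACTION;

            DECLARE @msg nvarchar(2048);
            SET @msg = ERROR_MESSAGE();
            RAISERROR(@msg, 16, 1);   -- surface the failure to the installer
        END CATCH;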

  • How to transform a local image into a web-accessible image

    - by hguser
    Hi: Generally, if we want to display an image in a web page, we give the URI of the image resource, like http://host:port/image/xxx.jpg. Now, there are some images in my file system, and I save their absolute paths in the DB, like this:

        id  name  address  image
        1   xxxx  xxxx     C:/images/xxx.jpg

    Now, when the entity is retrieved, its image should be displayed in the page. How do I make that work? What I thought of was copying the image under the web server's directory, then building its URL so the page can render it. But I wonder whether this is a good idea. Is there any other way?

  • mysql circular dependency in foreign key constraints

    - by Flavius
    Given the schema: what I need is for every user_identities.belongs_to to reference users.id. At the same time, every user has a primary_identity, as shown in the picture. However, when I try to add this reference with ON DELETE NO ACTION ON UPDATE NO ACTION, MySQL says:

        #1452 - Cannot add or update a child row: a foreign key constraint fails
        (yap.#sql-a3b_1bf, CONSTRAINT #sql-a3b_1bf_ibfk_1 FOREIGN KEY (belongs_to)
        REFERENCES users (id) ON DELETE NO ACTION ON UPDATE NO ACTION)

    I suspect this is due to the circular dependency, but how could I solve it (and maintain referential integrity)?

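    Error 1452 usually means existing rows already violate the constraint being added, rather than MySQL rejecting the circularity as such. A sketch of the usual ways to break the chicken-and-egg cycle (the column types and the user_identities.id column are assumptions): make one side of the circle nullable and insert in two steps, or temporarily disable foreign key checks while fixing up existing data:

        -- Option 1: allow the circular column to be NULL and insert in two steps.
        ALTER TABLE users MODIFY primary_identity INT NULL;

        INSERT INTO users (id) VALUES (1);
        INSERT INTO user_identities (id, belongs_to) VALUES (10, 1);
        UPDATE users SET primary_identity = 10 WHERE id = 1;

        -- Option 2: disable FK checks just while loading/repairing existing rows.
        SET FOREIGN_KEY_CHECKS = 0;
        -- ... bulk load, then ALTER TABLE ... ADD CONSTRAINT ...
        SET FOREIGN_KEY_CHECKS = 1;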

  • Question about the BENCHMARK function in MySQL (incredible results)

    - by xRobot
    I have 2 tables: author, with 3 million rows, and book, with 20 thousand rows. So I have benchmarked this query with a join:

        SELECT BENCHMARK(100000000, 'SELECT book.title, author.name
                                     FROM `book`, `author`
                                     WHERE book.id = author.book_id ')

    And this is the result:

        Query took 0.7438 sec

    Only 0.7438 seconds for 100 million runs of a query with a join??? Am I making a mistake somewhere, or is this the right result?

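    The catch (the standard explanation, not something from the article) is that BENCHMARK(count, expr) repeatedly evaluates a scalar expression; a quoted SELECT is just a string literal, so the join is never actually executed, which is why 100 million "runs" finish in under a second. To measure the join itself, run the actual statement:

        -- BENCHMARK is meant for scalar expressions, e.g.:
        SELECT BENCHMARK(1000000, MD5('some string'));

        -- To time the join, execute (and profile) the real query:
        SELECT book.title, author.name
        FROM `book`
        JOIN `author` ON author.book_id = book.id;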