Search Results

Search found 5233 results on 210 pages for 'records'.

Page 139/210 | < Previous Page | 135 136 137 138 139 140 141 142 143 144 145 146  | Next Page >

  • GridView: check duplicates without using SQL

    - by Tomasusa
    I have this code: foreach (GridViewRow dr in gvCategories.Rows) { if (dr.Cells[0].Text == txtEnterCategory.Text.Trim()) isError = true; } When debugging, dr.Cells[0].Text is always "", even though there are records. How can I use a loop to check each row and find out whether a record already exists in the GridView, without using SQL?

    Read the article

  • Can you add identity to an existing column in SQL Server 2008?

    - by bmutch
    In all my searching I see that for pre-2008 you essentially have to copy the existing table to a new table in order to change to an identity column. Does this apply to 2008 also? Thanks. The most concise solution I have found so far:

        CREATE TABLE Test ( id int identity(1,1), somecolumn varchar(10) );
        INSERT INTO Test VALUES ('Hello');
        INSERT INTO Test VALUES ('World');

        -- copy the table: same schema, but no identity
        CREATE TABLE Test2 ( id int NOT NULL, somecolumn varchar(10) );
        ALTER TABLE Test SWITCH TO Test2;

        -- drop the original (now empty) table
        DROP TABLE Test;

        -- rename new table to old table's name
        EXEC sp_rename 'Test2', 'Test';

        -- see same records
        SELECT * FROM Test;

    Read the article

  • Subsonic 3.0 - .NET - Error: Cannot create an instance of an interface

    - by George
    Hi, I am new to SubSonic. I have configured the SubSonic 3.0 T4 templates and generated classes for my project. I have a GridView and an ObjectDataSource; the ObjectDataSource connects to one of the classes generated by SubSonic, and I have set it up with the Fetch, Insert, Update and Delete methods, then set the grid's data source to the ObjectDataSource. The GridView successfully shows all the records, but at the time of an Update, Insert or Delete it throws the exception "Cannot create an instance of an interface". I am also not able to debug into the SubSonic code, maybe because of the partial classes. Can anyone please let me know what is happening in the background? Or can someone point me to a sample that uses SubSonic 3.0 with grid add, edit and delete? That would be really helpful for me... Please... :) Thanks, George

    Read the article

  • Transactional isolation level needed for safely incrementing ids

    - by Knut Arne Vedaa
    I'm writing a small piece of software that inserts records into a database used by a commercial application. The unique primary keys (ids) in the relevant table(s) are sequential, but do not seem to be set to "auto increment". Thus, I assume, I will have to find the largest id, increment it and use that value for the record I'm inserting. In pseudo-code for brevity: id = select max(id) from some_table id++ insert into some_table values(id, othervalues...) Now, if another thread starts the same transaction before the first one finishes its insert, you get two identical ids and a failure when trying to insert the last one. You could check for that failure and retry, but a simpler solution might be setting an isolation level on the transaction. For this, would I need SERIALIZABLE or would a lower level do? Additionally, is this, generally, a sound way of solving the problem? Are there any other ways of doing it?
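
    A minimal sketch of one way to make this safe, written in T-SQL since the question does not name the database engine (the table name comes from the pseudo-code above; the second column is illustrative). Taking UPDLOCK and HOLDLOCK on the MAX read keeps a second transaction from reading the same value until the first one commits, which is the guarantee SERIALIZABLE would also give for this statement:

        BEGIN TRANSACTION;

        DECLARE @id int;

        -- the range lock blocks a concurrent MAX(id) until this transaction commits
        SELECT @id = ISNULL(MAX(id), 0) + 1
        FROM some_table WITH (UPDLOCK, HOLDLOCK);

        INSERT INTO some_table (id, othervalue)   -- othervalue is a placeholder column
        VALUES (@id, 'example');

        COMMIT TRANSACTION;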

    Read the article

  • Get location (lat/long) without GPS, just like the My Location feature of Google Maps

    - by Suriyan Suresh
    I want to get my location (lat/long) without GPS, just like the My Location feature in Google Maps. I have Google Maps on my mobile (a Sony Ericsson G502, without GPS) and it works fine without GPS in India. 1. How does Google find my position? 2. When I search for cell IDs in the OpenCellID database, it has very few records for India, yet Google Maps works fine on my mobile in India. 3. Does Google use the OpenCellID database or its own? If Google uses its own, do we have access to that database?

    Read the article

  • svn track brand new code base

    - by Fire Crow
    I'm at a company where we keep receiving new codebases from a third-party vendor, and we'd like to track the changes in Subversion. Is there a way to replace a branch with the new code and track the changes? Currently we just delete all files in the branch, then add the new files and commit. We'd like to track the files, but I haven't found a tool that easily deals with all the .svn directories found in the subfolders. Does anyone know a tool that will replace an svn directory with a new code base and create the respective modify, add and delete records as if the code base had been organically modified?

    Read the article

  • How to deny payment via PayPal IPN?

    - by Nick
    Hello all, I need to create dynamic 'Pay Now' buttons on my site, and PayPal says the way to do this is via an HTML FORM with preset variables for the price, currency, and item of the purchase. I use PayPal IPN to notify me when a payment has complete. However, what's to stop someone from modifying the query parameters of the Pay Now button to change the price? Some people have told me to redirect the button through a PHP file that sends you to a PayPal payment page with the parameters in place, but the price could just as easily be manipulated in the Web browser's address bar. My question is, how can I deny a payment if the information I receive from PayPal's IPN service is invalid (if the price doesn't match our records)? I'm quite confused and couldn't find any documentation on what I'm looking for. Hopefully, you guys can help. Thanks!

    Read the article

  • Need to work out database structure

    - by jim smith
    Hi, just need a little kickstart with this. I have MySQL/PHP and 5,000 products, and for each product I need to store some data for 30 companies: a) prices, b) stock qty. I also need to store the data historically on a daily basis. So, the table... It makes sense that the rows will be the products, because there are 5,000 of them, and if I put the companies as the columns I can store the prices, but what about the stock quantities? I could create two columns for each company, one for prices and one for qty, then make the table name the date for that day... so there would be a new table for every day with 5,000 products in it? Is this the correct way? Some idea of how I'll be retrieving data: the top 5 lowest prices (and the company) by product for a certain date; the price and stock changes in the past 7 days by product.
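
    One possible layout, sketched in SQL (MySQL flavour, since that is what the question uses; all table and column names are made up): one row per product, per company, per day, rather than a table per day or a pair of columns per company.

        CREATE TABLE daily_price_stock (
            snapshot_date DATE          NOT NULL,
            company_id    INT           NOT NULL,
            product_id    INT           NOT NULL,
            price         DECIMAL(10,2) NOT NULL,
            stock_qty     INT           NOT NULL,
            PRIMARY KEY (snapshot_date, company_id, product_id)
        );

        -- top 5 lowest prices (and the company) for one product on one date
        SELECT company_id, price
        FROM daily_price_stock
        WHERE product_id = 123
          AND snapshot_date = '2010-05-01'
        ORDER BY price
        LIMIT 5;

    With this shape, the 7-day price and stock changes become a query over a snapshot_date range instead of a join across daily tables.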

    Read the article

  • Facebook Graph - does the user "like" my page?

    - by Ican Zilb
    I am trying to upgrade my iPhone app to use the new Facebook Graph API. One thing I cannot find is how to find out whether the current user connected from my app to Facebook is a fan of my Facebook page (i.e., in the new paradigm, whether the user likes my page). In the REST API there was an isFan function, but there is none in the Graph API. I can get all the items the user likes and search whether one of them is my page, but surely there must be an easier way than going through thousands of records each time I must check whether he is a fan, right? If someone has already figured out how to do that from the new docs, I'll really appreciate it if you share it with me.

    Read the article

  • SQL rows to columns conversion

    - by Thihara
    Hi, I have a table ClassAttendance and I'm using MSSQL 2005:

        studentID   attendanceDate   status
        1004        2010-03-17       0
        1005        2010-03-17       1
        1006        2010-03-17       0
        1007        2010-03-17       0
        1004        2010-03-19       0
        1005        2010-03-19       1
        1006        2010-03-19       0
        1007        2010-03-19       0
        1004        2010-03-20       1

    As you can see, studentID is a foreign key to a table called StudentData, and attendanceDate has an unknown number of rows. Can I get output like the one below with a query? I need the dates in one month to be columns, and the values of those date columns should come from the status column. The number of date records per studentID is the same; it is the number of dates in the attendanceDate field that is unknown.

        studentID   2010-03-17   2010-03-19   2010-03-20
        1004        0            0            1
        etc.

    This is for creating a report, so I need to do it in a query. Please help if you can.
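
    A sketch of the PIVOT approach on SQL Server 2005 for a fixed set of dates; since the dates are not known in advance, a real query would have to build the column list with dynamic SQL:

        SELECT studentID, [2010-03-17], [2010-03-19], [2010-03-20]
        FROM (SELECT studentID,
                     CONVERT(char(10), attendanceDate, 120) AS attDate,  -- yyyy-mm-dd
                     status
              FROM ClassAttendance) AS src
        PIVOT (MAX(status)
               FOR attDate IN ([2010-03-17], [2010-03-19], [2010-03-20])) AS p
        ORDER BY studentID;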

    Read the article

  • Two Tables Serving as one Model in Rails

    - by matsko
    Is it possible in Rails to set up one model which depends on a join of two tables? This would mean that for the model record to be found/updated/destroyed, there would need to be records in both database tables, linked together in a join. The model would just be all the columns of both tables wrapped together, which could then be used for forms and so on. This way, when the model gets created/updated, it is just one form variable hash that gets applied to the model. Is this possible in Rails 2 or 3?

    Read the article

  • A table that has relation to itself issue

    - by Mostafa
    Hi, I've defined a table with this schema:

        CREATE TABLE [dbo].[Codings](
            [Id] [int] IDENTITY(1,1) NOT NULL,
            [ParentId] [int] NULL,
            [CodeId] [int] NOT NULL,
            [Title] [nvarchar](50) COLLATE Arabic_CI_AI NOT NULL,
            CONSTRAINT [PK_Codings] PRIMARY KEY CLUSTERED ( [Id] ASC )
                WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
        ) ON [PRIMARY]

    and filled it with data like this:

        Id   ParentId   CodeId   Title
        1    NULL       0        Gender
        2    1          1        Male
        3    1          2        Female
        4    NULL       0        Educational Level
        5    4          1        BS
        6    4          2        MS
        7    4          3        PHD

    Now I'm looking for a solution so that when I delete a record that is a parent (like Id = 1 or 4), all its children are deleted automatically (all records whose ParentId is 1 or 4). I supposed I could do it via a relation between Id and ParentId (and set cascade for the delete rule), but when I do that in SSMS, the Delete Rule and Update Rule in the properties window are disabled. My question is, what can I do to accomplish this? Thank you
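
    SQL Server does not allow ON DELETE CASCADE on a self-referencing foreign key (it reports possible cycles or multiple cascade paths), which is most likely why the rule is greyed out in the designer. One workaround, sketched below for SQL Server 2005 and later, is to delete the whole subtree manually with a recursive CTE (@RootId is the parent being removed):

        DECLARE @RootId int;
        SET @RootId = 1;

        -- collect the row itself plus every descendant, then delete them in one statement
        WITH ToDelete AS
        (
            SELECT Id FROM dbo.Codings WHERE Id = @RootId
            UNION ALL
            SELECT c.Id
            FROM dbo.Codings AS c
            JOIN ToDelete AS d ON c.ParentId = d.Id
        )
        DELETE FROM dbo.Codings
        WHERE Id IN (SELECT Id FROM ToDelete);

    An INSTEAD OF DELETE trigger that does the same subtree removal is another option if the delete has to stay a plain DELETE statement.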

    Read the article

  • VB working with SQL DB - end of row count, keeps looping

    - by Tramd
    I'm adding to a combo box an ID and a name that I'm pulling from a database. My problem is that for some reason my loop doesn't end once it reaches the end of the records in the database table. Here's my code: For intcount = 0 To dtOrders.Rows.Count - 1 cmbSearch.Items.Add(dtOrders.Rows(intcount)("EmployeeID").ToString & " " & dtOrders.Rows(intcount)("EmployeeLastName").ToString & ", " & dtOrders.Rows(intcount)("EmployeeFirstName").ToString) Next Shouldn't the .Rows.Count - 1 stop it once it reaches the last record? It loops through 4 times.

    Read the article

  • Monitor database table for external changes from within Rails application

    - by jhwist
    I'm integrating some non-Rails-model tables in my Rails application. Everything works out very nicely; the way I set up the model is: class Change < ActiveRecord::Base establish_connection(ActiveRecord::Base.configurations["otherdb_#{RAILS_ENV}"]) set_table_name "change" end This way I can use the Change model for all existing records with find etc. Now I'd like to run some sort of notification when a record is added to the table. Since the model never gets created via Change.new and Change.save, using ActiveRecord::Observer is not an option. Is there any way I can get some of my Rails code executed whenever a new record is added? I looked at delayed_job but can't quite get my head around how to set that up. I imagine it revolves around a cron job that selects all rows that were created since the job last ran and then calls the respective Rails code for each row.

    Read the article

  • Alternative databases to use when putting IIS Logs into a database using LogParser

    - by Robin Day
    We have run some scripts that use LogParser to dump our IIS logs into a SQL Server database. We can then query this to get simple stats on hits, usage etc. It's also good when linking it to error log databases and performance counter databases to compare usage with errors, and so on. Having implemented this for just one system, and for only the last 2-3 weeks, we already have a 5 GB database with around 10 million records. This is making any queries against this database quite slow and will no doubt cause storage issues if we continue to log as we are. Can anyone suggest any alternative databases that we could use for this data that would be more efficient for such logs? I'd be particularly interested in any experience of Google's BigTable or Amazon's SimpleDB. Are either of these suitable for reporting queries? COUNTs, GROUP BYs, PIVOTs?

    Read the article

  • How to control the order of assignment for a new identity column in MSSQL?

    - by alpav
    I have a table with a CreateDate datetime field (default getdate()) that does not have any identity column. I would like to add an identity(1,1) field that reflects the same order for the existing records as the CreateDate field (an ORDER BY on either would give the same results). How can I do that? I guess if I create a clustered key on the CreateDate field and then add the identity column it will work (not sure if that's guaranteed); is there a good/better way? I am interested in SQL 2005, but I guess the answer will be the same for SQL 2008 and SQL 2000.
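
    One approach, sketched in T-SQL (MyTable and OtherCol are placeholder names): copy the rows into a new table that already has the identity column, using INSERT ... SELECT ... ORDER BY CreateDate, which SQL Server documents as assigning the identity values in the ORDER BY order, then swap the tables.

        CREATE TABLE dbo.MyTable_New
        (
            Id         int IDENTITY(1,1) NOT NULL,
            CreateDate datetime NOT NULL DEFAULT (getdate()),
            OtherCol   varchar(50) NULL     -- placeholder for the remaining columns
        );

        INSERT INTO dbo.MyTable_New (CreateDate, OtherCol)
        SELECT CreateDate, OtherCol
        FROM dbo.MyTable
        ORDER BY CreateDate;                -- identity values follow this order

        -- then drop dbo.MyTable and rename dbo.MyTable_New into its place with sp_rename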

    Read the article

  • bad performance from too many caught errors?

    - by Christopher Klein
    I have a large project in C# (.NET 2.0) which contains very large chunks of code generated by SubSonic. Is a try-catch like this causing a horrible performance hit? for (int x = 0; x < identifiers.Count; x++) { decimal target = 0; try { target = Convert.ToDecimal(assets[x + identifiers.Count * 2]); // target % } catch { targetEmpty = true; } } What is happening is that if the given field being passed in is not something that can be converted to a decimal, it sets a flag which is then used further along in the record to determine something else. The problem is that the application is literally throwing tens of thousands of exceptions as I parse through 30k records. The process as a whole takes almost 10 minutes, and my overall task is to improve that time somewhat; this seemed like low-hanging fruit if it is a bad design idea. Any thoughts would be helpful (be kind, it's been a miserable day). Thanks, Chris

    Read the article

  • The width of a list that contains an embedded matrix grows unexpectedly

    - by Greg Lorenz
    I have a report in Reporting Services 2005 that includes a list with an embedded matrix, and I am attempting to put a border on the list. However, when I run the report in Visual Studio, the matrix grows past the end of the page, and therefore the border grows with it, causing it to spill onto the second page. It appears that there was supposedly a fix for this in Reporting Services 2000 Service Pack 1, but I am still experiencing the issue in 2005. The list has a details group that limits the records on a row to 4 using the following expression: =ceiling(rownumber("list1")/4), and the matrix has a column group that should recycle those based on the row number determined by the list, using the following expression: =rowNumber("list1_Details_Group") I have also attempted to put the list in a rectangle in hopes of stopping the matrix from growing, to no avail. How do I effectively stop the matrix from growing past the space allowed by the list control?

    Read the article

  • Django Grouping Query

    - by Matt
    I have the following (simplified) models:

        class Donation(models.Model):
            entry_date = models.DateTimeField()

        class Category(models.Model):
            name = models.CharField()

        class Item(models.Model):
            donation = models.ForeignKey(Donation)
            category = models.ForeignKey(Category)

    I'm trying to display the total number of items, per category, grouped by the donation year. I've tried this: Donation.objects.extra(select={'year': "django_date_trunc('year', %s.entry_date)" % Donation._meta.db_table}).values('year', 'item__category__name').annotate(items=Sum('item__quantity')) but I get a FieldError on item__category__name. I've also tried: Item.objects.extra(select={"year": "django_date_trunc('year', entry_date)"}, tables=["donations_donation"]).values("year", "category__name").annotate(items=Sum("quantity")).order_by() which generally gets me what I want, but the item quantity count is multiplied by the number of donation records. Any ideas? Basically I want to display this:

        2010
        - Category 1: 10 items
        - Category 2: 17 items
        2009
        - Category 1: 5 items
        - Category 3: 8 items

    Read the article

  • Logging to a table in an MS SQL trigger

    - by Martin
    I am writing an MS SQL 2005 trigger. I want to do some logging during trigger execution, using INSERT statements into my log table. When an error occurs during execution, I want to raise an error and cancel the action that caused the trigger to fire, but without losing the log records. What is the best way to achieve this? Right now my trigger logs everything except the situation where there is an error, because of the ROLLBACK. The RAISERROR statement is needed in order to inform the calling program about the error. My error handling code currently looks like:

        if (@err = 1)
        begin
            INSERT INTO dbo.log(date, entry)
            SELECT getdate(), 'ERROR: ' + out from #output
            RAISERROR (@msg, 16, 1)
            rollback transaction
            return
        end
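
    A sketch of one way to keep the log rows: copy them into a table variable before rolling back, since table variables are not undone by ROLLBACK, and only write them to the real log table afterwards (same object names as the snippet above):

        if (@err = 1)
        begin
            DECLARE @saved TABLE (date datetime, entry nvarchar(4000));

            -- capture the log rows while #output is still populated
            INSERT INTO @saved (date, entry)
            SELECT getdate(), 'ERROR: ' + out FROM #output;

            -- undo the data change that fired the trigger
            ROLLBACK TRANSACTION;

            -- the table variable survived the rollback, so the log rows can be persisted now
            INSERT INTO dbo.log (date, entry)
            SELECT date, entry FROM @saved;

            RAISERROR (@msg, 16, 1);
            return;
        end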

    Read the article

  • Algorithm for Search page

    - by Geetha
    Hi all, I am creating a search page where we can find a product by entering text, e.g. "Brings on the night". My query brings back the records which contain at least one word from this text. Needs: 1. The first rows should be the records that contain the given sentence exactly. 2. Then the next most closely matching records, etc. How do I achieve this? Is there an algorithm for this? It would be very helpful if anyone shared their ideas. Geetha
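
    If the products live in SQL Server with a full-text index on the searched column, one sketch is to let full-text ranking do the word matching and push exact phrase matches to the top (Products, ProductId and Title are placeholder names; the full-text index is an assumption):

        DECLARE @q nvarchar(200);
        SET @q = N'Brings on the night';

        SELECT p.ProductId, p.Title, ft.[RANK]
        FROM FREETEXTTABLE(Products, Title, @q) AS ft
        JOIN Products AS p ON p.ProductId = ft.[KEY]
        ORDER BY
            CASE WHEN p.Title LIKE '%' + @q + '%' THEN 0 ELSE 1 END,  -- exact phrase first
            ft.[RANK] DESC;                                           -- then best word match

    Without full-text indexing the same idea can be approximated by scoring a LIKE match per word, but that gets slow quickly on larger tables.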

    Read the article

  • Catching constraint violations in JPA 2.0.

    - by Dennetik
    Consider the following entity class, used with, for example, EclipseLink 2.0.2, where the link attribute is not the primary key but is unique nonetheless: @Entity public class Profile { @Id private Long id; @Column(unique = true) private String link; // Some more attributes and getter and setter methods } When I insert records with a duplicate value for the link attribute, EclipseLink does not throw an EntityExistsException but a DatabaseException, with a message explaining that the unique constraint was violated. This doesn't seem very useful, as there would not be a simple, database-independent way to catch this exception. What would be the advised way to deal with this? A few things that I have considered are: checking the error code on the DatabaseException (though I fear that this error code is the native error code for the database); checking for the existence of a Profile with the specific value for link beforehand (this would obviously result in an enormous number of superfluous queries).

    Read the article

  • Can a .csv file be used as a data source in Visual Studio 2008?

    - by Kevin
    I'm pretty new to C# and Visual Studio. I'm writing a small program that will read a .csv file and then write the records read to a SQL Server database table. I can manually parse the .csv file, but I was wondering if it is possible to somehow "describe" the .csv file to Visual Studio so that I can use it as a data source? I should mention that the first two lines in the .csv file contain header information and the following lines are the actual comma-delimited data. Also, I should mention that this program is a stand-alone console program with no user interface.

    Read the article

  • Core Data table record count

    - by user339633
    I have an entity called Person, and it has a relationship called participatingGames to another entity called GameParticipant. I (apparently) can retrieve the number of matches in the GameParticipant entity using this simple code in the Person object I created from the entity in the model: [self.participatingGames count]; However, I'd just like to retrieve the number of Person records, and one might guess the syntax for this would be just as simple. I have lots of books, including those by Jeff LaMarche, but those sources and what I find around here make me wonder if I need to set up a fetchedResultsController just to know the count of some entity. My background is in SQL, so of course it seems odd that what would take 15 seconds to code in any other environment seems like such a well-guarded secret in Core Data. I'm using iPhone SDK 3.1.4 under OS X 10.5.8. Suggestions?

    Read the article

  • Scraping paginated items from a website using scrapy

    - by Mridang Agarwalla
    I'm using Scrapy to scrape items from a site, and I'm not able to implement this scraping pattern. The site I'm trying to scrape is a forum, and I scrape it once a day. Each page has a table containing posts. New posts are added to the top of the table, and as more posts are made to the site, the older posts move further into the pages due to pagination. This is a very simple scenario, and we will assume that the order of the posts never changes. I would like to scrape this site and collect all the "new" records until the last scraped post from yesterday is encountered. I have configured my spider to paginate endlessly, and when it encounters yesterday's last scraped post, it should stop. How can I implement this? (My Scrapy installation works with my Django installation using django-dynamic-scraper.)

    Read the article
