Search Results

Search found 18542 results on 742 pages for 'nhibernate query'.

Page 130/742 | < Previous Page | 126 127 128 129 130 131 132 133 134 135 136 137  | Next Page >

  • setfirstresult & setmaxresult in child collection

    - by Miguel Marques
    I have an entity, let's call it Entity, with a child collection Children. I have a screen that shows the Entity information together with a list of the Children collection, but that collection can get very big, so I was thinking about paging: fetch the first 20 elements, and lazily load the next page only when the user explicitly presses the next button. So I created a function in the Entity repository with this signature: IEnumerable<Child> GetChildren(Entity entity, int actualPage, int numberOfRecordsPerPage). I need to use SetFirstResult and SetMaxResults, not on the aggregate root Entity, but on the child collection. When I use those two settings, though, they always refer to the entity type of the HQL/Criteria query. The other alternative would be to create an HQL/Criteria query for the Child type, set the first and max results, and then filter to the ones that are in the Entity's Children collection (with a subquery), but I wasn't able to write that filter. If it were a bidirectional association (Child referencing the parent Entity) it would be easier. Any suggestions?

    Read the article
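
    A sketch of one way to handle the paging above, assuming an ISession field named session and that the Entity mapping exposes its Children collection (the Entity.Id property name is also an assumption): query the Child rows through a join on the parent, so SetFirstResult/SetMaxResults apply to Child rather than to the aggregate root.

        using System.Collections.Generic;
        using NHibernate;

        public class EntityRepository
        {
            private readonly ISession session;

            public EntityRepository(ISession session)
            {
                this.session = session;
            }

            public IEnumerable<Child> GetChildren(Entity entity, int actualPage, int numberOfRecordsPerPage)
            {
                // Page the children directly; the parent is only used as a filter,
                // so the paging applies to Child, not to Entity.
                return session.CreateQuery(
                        "select c from Entity e join e.Children c where e.Id = :id")
                    .SetParameter("id", entity.Id)
                    .SetFirstResult(actualPage * numberOfRecordsPerPage)
                    .SetMaxResults(numberOfRecordsPerPage)
                    .List<Child>();
            }
        }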

  • Dates that intersect

    - by MikeAbyss
    Hi everyone, I've been researching this problem for a while now and I can't seem to come to a solution; hopefully someone here can help. I'm currently working in Microsoft SQL Server Management Studio. Previously, the old query would just return the results that fall between two dates. Here's the previous query: SELECT e.Name, o.StartDate, o.EndDate FROM dbo.Name e, dbo.Date o WHERE e.Name = o.Name AND o.StartDate <= '2010-09-28 23:59:59' AND o.EndDate >= '2010-09-28 00:00:00' AND e.Name LIKE 'A'. Example of the table produced after the query runs (the real table has a lot more rows, obviously):

        Name  Start                End
        A     2010-09-28 07:00:00  2010-09-28 17:00:00
        A     2010-09-28 13:45:00  2010-09-28 18:00:00
        A     2010-09-28 08:00:00  2010-09-28 16:00:00
        A     2010-09-28 07:00:00  2010-09-28 15:30:00

    However, we need to change this so that the query does the following: find the dates that intersect for a day x, and find the dates that don't intersect for a day x. I found a really useful site on this: http://bloggingabout.net/blogs/egiardina/archive/2008/01/30/check-intersection-of-two-date-ranges-in-sql.aspx. However, there the date range to compare against is an input; mine, on the other hand, has to find all the dates that intersect/don't intersect. Thanks for the help, everyone.

    Read the article
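
    A sketch of the overlap test, using the table and column names from the question above: two ranges overlap when each starts before the other ends, so an EXISTS against the same table finds rows that intersect at least one other row on the day (switch it to NOT EXISTS for the rows that intersect nothing). The self-exclusion below assumes identical start/end pairs are the same row; a primary key column would be a cleaner way to exclude the row itself.

        SELECT o.Name, o.StartDate, o.EndDate
        FROM dbo.Date o
        WHERE o.Name = 'A'
          AND o.StartDate <= '2010-09-28 23:59:59'
          AND o.EndDate   >= '2010-09-28 00:00:00'
          AND EXISTS (
                SELECT 1
                FROM dbo.Date x
                WHERE x.Name = o.Name
                  AND x.StartDate <= o.EndDate        -- x starts before o ends
                  AND x.EndDate   >= o.StartDate      -- and x ends after o starts
                  AND NOT (x.StartDate = o.StartDate AND x.EndDate = o.EndDate)  -- skip the row itself
              );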

  • Question about the BENCHMARK function in MySQL (incredible results)

    - by xRobot
    I have 2 tables: author, with 3 million rows, and book, with 20 thousand rows. So I benchmarked this query with a join: SELECT BENCHMARK(100000000, 'SELECT book.title, author.name FROM `book`, `author` WHERE book.id = author.book_id'). And this is the result: Query took 0.7438 sec. Only 0.7438 seconds for 100 million queries with a join??? Am I making some mistake, or is this the right result?

    Read the article
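
    The result is less surprising than it looks: BENCHMARK() repeatedly evaluates a scalar expression, and a quoted SELECT is just a string literal, so the join above is never actually executed. A sketch of what is and isn't being measured:

        -- Times copying a string constant 100 million times, not running the join.
        SELECT BENCHMARK(100000000, 'SELECT book.title FROM book');

        -- Times a real expression evaluation.
        SELECT BENCHMARK(1000000, MD5('test'));

        -- To time the actual join, run it directly (or use the profiler):
        SET profiling = 1;
        SELECT book.title, author.name FROM book, author WHERE book.id = author.book_id;
        SHOW PROFILES;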

  • MySQL Update query with left join and group by

    - by Rob
    I am trying to create an update query and making little progress in getting the right syntax. The following query works: SELECT t.Index1, t.Index2, COUNT( m.EventType ) FROM Table t LEFT JOIN MEvents m ON (m.Index1 = t.Index1 AND m.Index2 = t.Index2 AND (m.EventType = 'A' OR m.EventType = 'B') ) WHERE (t.SpecialEventCount IS NULL) GROUP BY t.Index1, t.Index2. It creates a list of triplets Index1, Index2, EventCount, and it only does this for cases where t.SpecialEventCount is NULL. The update query I am trying to write should set SpecialEventCount to that count, i.e. COUNT(m.EventType) in the query above. This number could be 0 or any positive number (hence the left join). Index1 and Index2 together are unique in Table t, and they are used to identify events in MEvents. How do I modify the select query to become an update query? I.e. something like UPDATE Table SET SpecialEventCount=COUNT(m.EventType)..... but I am confused about what to put where and have failed with numerous different guesses.

    Read the article
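
    A sketch of the multi-table UPDATE, reusing the table and column names from the question above: aggregate in a derived table first, then join it to the target table and assign the count (COALESCE keeps the zero case from the left join).

        UPDATE `Table` t
        LEFT JOIN (
            SELECT m.Index1, m.Index2, COUNT(*) AS EventCount
            FROM MEvents m
            WHERE m.EventType IN ('A', 'B')
            GROUP BY m.Index1, m.Index2
        ) c ON c.Index1 = t.Index1 AND c.Index2 = t.Index2
        SET t.SpecialEventCount = COALESCE(c.EventCount, 0)
        WHERE t.SpecialEventCount IS NULL;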

  • Session does not give the right records?

    - by Jugal
    I want to keep one session, but when I roll back a transaction, the transaction gets IsActive=false, so I cannot commit or roll back later statements using that same transaction; I then need to create a new transaction. What is going wrong here?

        var session = NHibernateHelper.OpenSession();   /* returns a new session */
        var transaction1 = session.BeginTransaction();
        var list1 = session.Query<Make>().ToList();     /* returns 4 records */
        session.Delete(list1[2]);

        /* After Rollback the transaction is IsActive=false, so I cannot commit or
         * roll back through this transaction in the future; I need a new transaction. */
        transaction1.Rollback();

        var transaction2 = session.BeginTransaction();

        /* This returns 3 records. Why am I not getting the object that was deleted
         * and then rolled back? */
        var list2 = session.Query<Make>().ToList();

    Does anyone have an idea what is going wrong here? I am not getting back the deleted object whose delete was rolled back.

    Read the article
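
    One explanation that fits the symptoms above: rolling back the transaction undoes the database work, but the session's first-level cache still carries the in-memory delete, and after a rollback the session is no longer guaranteed to be in sync with the database. The usual guidance is to discard the session after a rollback, or at least clear it. A sketch that keeps the question's Query<Make>() usage (it assumes NHibernate.Linq is imported, as in the question):

        transaction1.Rollback();
        session.Clear();                        // drop the pending delete and cached state

        using (var transaction2 = session.BeginTransaction())
        {
            var list2 = session.Query<Make>().ToList();   // sees all 4 rows again
            transaction2.Commit();
        }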

  • SQL RDBMS : one query or multiple calls

    - by None None
    After looking around the internet, I decided to create DAOs that return objects (POJOs) to the calling business logic function/method. For example: a Customer object with an Address reference would be split in the RDBMS into two tables, CUSTOMER and ADDRESS. The CustomerDAO would be in charge of joining the data from the two tables, creating both an Address POJO and a Customer POJO, adding the address to the customer object, and finally returning the full Customer POJO. Simple. However, now I am at a point where I need to join three or four tables, each representing an attribute or list of attributes of the resulting POJO. The SQL will include a GROUP BY, but I will still end up with multiple rows for the same POJO, because some of the tables join across a one-to-many relationship. My app code will now have to loop through all the rows, trying to figure out whether a row is the same object with different attributes or should become a new POJO. Should I continue to build my DAOs this way, or break the POJO creation up into multiple DB calls to make the code easier to understand and maintain?

    Read the article

  • mysql join with conditional

    - by Conor H
    Hi there, I am currently working on a MySQL query that involves this table:

        TBL lesson_fee
          - fee_type_id (PRI)
          - lesson_type_id (PRI)
          - lesson_fee_amount

    This table contains the fees for a particular 'lesson type', and there are different 'fee names' (fee_type), which means there can be many entries in this table for one 'lesson type'. In my query I join this table onto the rest of the query via the 'lesson_type' table using: lesson_fee INNER JOIN (other joins here) ON lesson_fee.lesson_type_id = lesson_type.lesson_type_id. The problem is that this currently returns duplicate data in the result: one row for every duplicate entry in the 'lesson_fee' table. I am also joining the 'fee_type' table using this 'fee_type_id'. Is there a way of telling MySQL: "join the lesson_fee rows that have this lesson_fee.lesson_type_id and whose fee_type_id = client.fee_type_id"? UPDATE - the query: SELECT lesson_booking.lesson_booking_id, lesson_fee.lesson_fee_amount FROM fee_type INNER JOIN (lesson_fee INNER JOIN (color_code INNER JOIN (employee INNER JOIN (horse_owned INNER JOIN (lesson_type INNER JOIN (timetable INNER JOIN (lesson_booking INNER JOIN CLIENT ON client.client_id = lesson_booking.client_id) ON lesson_booking.timetable_id = timetable.timetable_id) ON lesson_type.lesson_type_id = timetable.lesson_type_id) ON horse_owned.horse_owned_id = lesson_booking.horse_owned_id) ON employee.employee_id = timetable.employee_id) ON employee.color_code_id = color_code.color_code_id) ON lesson_fee.lesson_type_id = lesson_type.lesson_type_id) ON lesson_fee.fee_type_id = client.fee_type_id WHERE booking_date = '2010-04-06' ORDER BY lesson_booking_id ASC. Also, how do I keep the formatting (indentation) of my query?

    Read the article
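
    A sketch of the same query written with flat joins, using the table and column names from the UPDATE above (booking_date is assumed to live on lesson_booking, and the tables that contribute nothing to the SELECT are dropped): putting both fee conditions into the lesson_fee ON clause is what stops the duplicate rows, and the flat form is also much easier to indent consistently.

        SELECT lb.lesson_booking_id, lf.lesson_fee_amount
        FROM lesson_booking lb
        INNER JOIN client c       ON c.client_id = lb.client_id
        INNER JOIN timetable t    ON t.timetable_id = lb.timetable_id
        INNER JOIN lesson_type lt ON lt.lesson_type_id = t.lesson_type_id
        INNER JOIN lesson_fee lf  ON lf.lesson_type_id = lt.lesson_type_id
                                 AND lf.fee_type_id    = c.fee_type_id
        WHERE lb.booking_date = '2010-04-06'
        ORDER BY lb.lesson_booking_id ASC;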

  • MySQL multiple tables

    - by Chris Harrison
    I've been looking into JOINs, subqueries, and other ways of doing this, but I can't work out the best approach. I have a table (ps_category_product) with columns id_product, id_category, and I want to run a query on it like: SELECT id_product FROM ps_category_product WHERE id_category='$this_cat'. BUT, I only want it to return IDs that are also returned by a query on another table (ps_product, with columns id_product, active): SELECT id_product FROM ps_product WHERE active='1'. Can anyone help me get these two queries working together?

    Read the article
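
    Two equivalent sketches using the table names from the question above - a join, or an IN subquery; both return only the products in the category that are also active in ps_product.

        SELECT cp.id_product
        FROM ps_category_product cp
        INNER JOIN ps_product p ON p.id_product = cp.id_product
        WHERE cp.id_category = '$this_cat'
          AND p.active = '1';

        -- or, with a subquery:
        SELECT id_product
        FROM ps_category_product
        WHERE id_category = '$this_cat'
          AND id_product IN (SELECT id_product FROM ps_product WHERE active = '1');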

  • Creating ASP.NET 3.5 Admin Pages using a Query String in a Master Page

    This is the continuation of part one, which ran on Tuesday of last week. After deleting the <style type="text/css"> styles in the head section of MasterPage.master, you will need to create an external CSS file to solve the relative URL problems in the website. This will ensure that the background images and links work according to the master page design. This article will explain how to do this and more.

    Read the article

  • sql select from a large number of IDs

    - by Claudiu
    I have a table, Foo. I run a query on Foo to get the ids of a subset of Foo. I then want to run a more complicated set of queries, but only on those IDs. Is there an efficient way to do this? The best I can think of is creating a query such as: SELECT ... --complicated stuff WHERE ... --more stuff AND id IN (1, 2, 3, 9, 413, 4324, ..., 939393). That is, I construct a huge "IN" clause. Is this efficient? Is there a more efficient way of doing this, or is the only way to JOIN with the initial query that gets the IDs? If it helps, I'm using SQLObject to connect to a PostgreSQL database, and I have access to the cursor that executed the query to get all the IDs.

    Read the article
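
    A sketch of the usual alternatives to a giant IN list in PostgreSQL, using the question's foo table (the selected columns stand in for the "complicated stuff"): load the ids into a temporary table the planner can join against, or bind them as a single array parameter.

        -- Option 1: a temp table, filled once with the ids from the first query
        CREATE TEMP TABLE wanted_ids (id integer PRIMARY KEY);
        INSERT INTO wanted_ids (id) VALUES (1), (2), (3), (9), (413), (4324);

        SELECT f.*
        FROM foo f
        JOIN wanted_ids w ON w.id = f.id;

        -- Option 2: PostgreSQL accepts one array parameter instead of an IN list
        -- SELECT f.* FROM foo f WHERE f.id = ANY(:ids);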

  • SSMS: The Query Window Keyboard Shortcuts

    Simple-Talk's free wallchart of the most important SSMS keyboard shortcuts aims to help find all those curiously forgettable key combinations within SQL Server Management Studio that unlock the hidden magic that is available for editing and executing queries.

    Read the article

  • OperationalError "unable to open database file" processing query results with SQLAlchemy and SQLite3

    - by Peter
    I'm running into this little problem that I hope is just a dumb user error. It looks like some sort of size limit on a query to a SQLite database. I managed to reproduce the issue with an in-memory DB and the simple script shown below. I can make it work by reducing the number of records in the DB, by reducing the size of each record, or by dropping the order_by() call. I am using Python 2.5.5 and SQLAlchemy 0.6.0 in a Cygwin environment. Thanks!

        #!/usr/bin/python
        from sqlalchemy.orm import sessionmaker
        import sqlalchemy
        import sqlalchemy.orm

        class Person(object):
            def __init__(self, name):
                self.name = name

        engine = sqlalchemy.create_engine('sqlite:///:memory:')
        Session = sessionmaker(bind=engine)
        metadata = sqlalchemy.schema.MetaData(bind=engine)
        person_table = sqlalchemy.Table('person', metadata,
            sqlalchemy.Column('id', sqlalchemy.types.Integer, primary_key=True),
            sqlalchemy.Column('name', sqlalchemy.types.String))
        metadata.create_all(engine)
        sqlalchemy.orm.mapper(Person, person_table)

        session = Session()
        session.add_all([Person("012345678901234567890123456789012") for i in range(5000)])
        session.commit()

        persons = session.query(Person).order_by(Person.name).all()
        print "count =", len(persons)
        session.close()

    The all() call on the query result fails with the OperationalError exception:

        Traceback (most recent call last):
          File "./stress.py", line 27, in <module>
            persons = session.query(Person).order_by(Person.name).all()
          File "/usr/lib/python2.5/site-packages/sqlalchemy/orm/query.py", line 1343, in all
            return list(self)
          File "/usr/lib/python2.5/site-packages/sqlalchemy/orm/query.py", line 1451, in __iter__
            return self._execute_and_instances(context)
          File "/usr/lib/python2.5/site-packages/sqlalchemy/orm/query.py", line 1456, in _execute_and_instances
            mapper=self._mapper_zero_or_none())
          File "/usr/lib/python2.5/site-packages/sqlalchemy/orm/session.py", line 737, in execute
            clause, params or {})
          File "/usr/lib/python2.5/site-packages/sqlalchemy/engine/base.py", line 1109, in execute
            return Connection.executors[c](self, object, multiparams, params)
          File "/usr/lib/python2.5/site-packages/sqlalchemy/engine/base.py", line 1186, in _execute_clauseelement
            return self.__execute_context(context)
          File "/usr/lib/python2.5/site-packages/sqlalchemy/engine/base.py", line 1215, in __execute_context
            context.parameters[0], context=context)
          File "/usr/lib/python2.5/site-packages/sqlalchemy/engine/base.py", line 1284, in _cursor_execute
            self._handle_dbapi_exception(e, statement, parameters, cursor, context)
          File "/usr/lib/python2.5/site-packages/sqlalchemy/engine/base.py", line 1282, in _cursor_execute
            self.dialect.do_execute(cursor, statement, parameters, context=context)
          File "/usr/lib/python2.5/site-packages/sqlalchemy/engine/default.py", line 277, in do_execute
            cursor.execute(statement, parameters)
        sqlalchemy.exc.OperationalError: (OperationalError) unable to open database file
          u'SELECT person.id AS person_id, person.name AS person_name \nFROM person ORDER BY person.name' ()

    Read the article

  • 301 Redirect and query strings

    - by icelizard
    I am looking to create a 301 redirect based purely on a query string; see below. Old URL: olddomain.com/?pc=/product/9999. New URL: newurl.php?var=yup. My normal way of doing this would be: redirect 301 pc=/product/9999 newurl.php?var=yup. But this time I am trying to match a URL that only contains the domain and a query string... What is the best way of doing this? Thanks

    Read the article
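
    A sketch of one way to do this, assuming Apache with mod_rewrite (the plain "redirect 301" directive from mod_alias never sees the query string, so it cannot match on it). The RewriteCond inspects the query string, and because the substitution supplies its own query string, the old one is dropped from the target:

        RewriteEngine On
        RewriteCond %{QUERY_STRING} ^pc=/product/9999$
        RewriteRule ^/?$ /newurl.php?var=yup [R=301,L]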

  • MySQL: how to enable Slow Query Log?

    - by Continuation
    Can you give me an example of how to enable MySQL's slow query log? According to the docs: "As of MySQL 5.1.29, use --slow_query_log[={0|1}] to enable or disable the slow query log, and optionally --slow_query_log_file=file_name to specify a log file name. The --log-slow-queries option is deprecated." So how do I use that option? Can I put it in my.cnf? An example would be greatly appreciated. Thank you very much.

    Read the article
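
    A sketch of a my.cnf fragment for MySQL 5.1.29 or later (the file path and the threshold are placeholders to adjust); on 5.1+ the same setting can also be flipped at runtime with SET GLOBAL slow_query_log = 'ON';

        [mysqld]
        slow_query_log      = 1
        slow_query_log_file = /var/log/mysql/mysql-slow.log
        long_query_time     = 2     # log statements taking longer than 2 seconds
        # log_queries_not_using_indexes = 1   # optional, can be noisy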

  • HP Proliant Servers - WMI query for system health

    - by Mike McClelland
    Hi, I want to query lots of HP servers to determine their overall health. I don't want to use any packages, or even SNMP - I want to query the server health from WMI and understand if a box is Green/Amber/Red - just like the HP Management Home Page. This MUST be possible - but I can't find any documentation... Oh yes, and the servers are running Windows Server 2003/8. Help!! Mike

    Read the article
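
    If the HP Insight Management WBEM providers/agents are installed on the servers, they register their own WMI namespace that the Management Home Page itself reads from. A PowerShell sketch for exploring it - the root\hpq namespace is an assumption based on the HP providers, and the health class name below is a placeholder to replace with whatever the listing shows:

        # Enumerate the classes the HP providers expose on a server
        Get-WmiObject -Namespace root\hpq -List -ComputerName SERVER01 | Sort-Object Name

        # Then query the health-related class found in that list (placeholder class name)
        Get-WmiObject -Namespace root\hpq -Class SomeHpHealthClass -ComputerName SERVER01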

  • Slow Query log for just one database

    - by Jason
    Can I enable the slow query log for just one database? What I've done so far is pull the entire log into Excel and run a pivot report to work out which database is the slowest, and I've then made some changes to that application in the hope of reducing the slow query occurrence. Rather than running my pivot report again, which took a bit of time to cleanse the data, I'd rather just output slow queries from that one database. Is that possible?

    Read the article

  • How do I make a LDAP query-based dynamic distribution group in Exchange 2010

    - by blsub6
    I see that there were ways in Exchange 2003 and Exchange 2007 to just put in an LDAP query and have it populate the group for you. Is there any way to do that in Exchange 2010? I know there are dynamic distribution groups, but I don't want to create the group based on one of the pre-set queries and I don't want to mess around with "custom attributes". I just want to put an LDAP query in there and have it run to populate the distribution group.

    Read the article
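
    A sketch from the Exchange Management Shell, with placeholder names (group name, OU, and department are illustrative): dynamic distribution groups created in Exchange 2010 take an OPATH -RecipientFilter rather than a raw LDAP query, so an existing LDAP filter has to be translated into OPATH by hand.

        New-DynamicDistributionGroup -Name "Sales Staff" `
            -RecipientContainer "contoso.com/Users" `
            -RecipientFilter { (RecipientType -eq 'UserMailbox') -and (Department -eq 'Sales') }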

  • Replace a SQL Server query with another before execution

    - by Kiranu
    I am trying to work with a legacy application on SQL Server which at some point runs the following query: SELECT serverproperty('EngineEdition') AS sqledition. The server replies with 2 (which is the correct edition), but the application closes, since it demands to run on SQL Server Express, which reports 4. We don't have access to the code and the developer is long gone. Is there a way to configure SQL Server so that when this query is received it simply returns 4 instead of the real value of the property? Thanks

    Read the article

  • Mapping a child collection without indexing based on database primary key or using bag

    - by Colin Bowern
    I have an existing parent-child relationship I am trying to map in Fluent NHibernate: [RatingCollection] -- [Rating]. RatingCollection has: ID (database-generated ID), Code, Name. Rating has: ID (database-generated ID), RatingCollectionID, Code, Name. I have been trying to figure out which permutation of HasMany makes sense here. What I have right now:

        HasMany<Rating>(x => x.Ratings)
            .WithTableName("Rating")
            .KeyColumnNames.Add("RatingCollectionId")
            .Component(c =>
            {
                c.Map(x => x.Code);
                c.Map(x => x.Name);
            });

    It works from a CRUD perspective, but because it's a bag it ends up deleting the rating contents any time I do a simple update/insert on the Ratings property. What I want is an indexed collection that does not use the database-generated ID (which is in the six-digit range right now). Any thoughts on how I could get a zero-based indexed collection (so I can write entity.Ratings[0].Name = "foo") that would let me modify the collection without deleting/reinserting it all when persisting?

    Read the article
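
    A sketch of one direction to try, not verified against the Fluent NHibernate version used in the question (older releases name some of these calls WithTableName/WithColumn instead of Table/Column): map the collection as a list with its own index column, which means adding an int position column to the Rating table - SortOrder here is an assumed name - so NHibernate addresses elements by position instead of deleting and reinserting the whole bag.

        HasMany<Rating>(x => x.Ratings)
            .Table("Rating")
            .KeyColumn("RatingCollectionId")
            .AsList(index => index.Column("SortOrder"))   // zero-based position column
            .Component(c =>
            {
                c.Map(x => x.Code);
                c.Map(x => x.Name);
            });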

  • Modeling a Generic Relationship in a Database

    - by StevenH
    This is most likely one for all you sexy DBAs out there: how would I efficiently model a relational database where a field in an "Event" table defines a "SportType"? This "SportType" field can hold a link to different sports tables, e.g. "FootballEvent", "RugbyEvent", "CricketEvent" and "F1Event". Each of these sports tables has different fields specific to that sport. My goal is to be able to generically add sport types in the future as required, yet hold sport-specific event data (fields) as part of my Event entity. Is it possible to use an ORM such as NHibernate / Entity Framework to reflect such a relationship? I have thrown together a quick C# example to express my intent at a higher level:

        public class Event<T> where T : new()
        {
            public T Fields { get; set; }

            public Event()
            {
                Fields = new T();
            }
        }

        public class FootballEvent
        {
            public Team CompetitorA { get; set; }
            public Team CompetitorB { get; set; }
        }

        public class TennisEvent
        {
            public Player CompetitorA { get; set; }
            public Player CompetitorB { get; set; }
        }

        public class F1RacingEvent
        {
            public List<Player> Drivers { get; set; }
            public List<Team> Teams { get; set; }
        }

        public class Team
        {
            public IEnumerable<Player> Squad { get; set; }
        }

        public class Player
        {
            public string Name { get; set; }
            public DateTime DOB { get; set; }
        }

    Read the article
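
    One common way to express this without the generic Event<T> is a plain inheritance hierarchy, which both NHibernate and Entity Framework can map with table-per-subclass (joined) inheritance: a shared Event table plus one table per sport holding its extra fields, so a new sport becomes a new subclass and table. A sketch (property names are illustrative, reusing the Team/Player types from the question):

        public abstract class Event
        {
            public virtual int Id { get; set; }
            public virtual DateTime Start { get; set; }     // fields shared by every sport
        }

        public class FootballEvent : Event
        {
            public virtual Team CompetitorA { get; set; }
            public virtual Team CompetitorB { get; set; }
        }

        public class F1RacingEvent : Event
        {
            public virtual IList<Player> Drivers { get; set; }
            public virtual IList<Team> Teams { get; set; }
        }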

  • Lazy Loading Association and Casting

    - by Zuber
    I am using NHibernate 2.0.1 and .NET, and I am facing issues with lazy loading an association. I have a BusinessObject class that has associations to other BusinessObjects in it, and this can go deeper. The following function on BusinessObject reads the values of a collection in the BusinessObject:

        public virtual object GetFieldValue(string fieldName)
        {
            var fieldItems = fieldName.Split(AppConstants.DotChar);
            var objectToRead = this;
            for (var i = 0; i < fieldItems.Length - 1; i++)
            {
                objectToRead = (BusinessObject) objectToRead.GetFieldValue(fieldItems[i]);
            }
            //if (objectToRead._data == null) return objectToRead.SystemId + " Error: _data was null";
            return objectToRead.FieldValue(fieldName.LastItem());
        }

    The FieldValue function is described below:

        private object FieldValue(string fieldName)
        {
            return _data.Contains(fieldName) ? _data[fieldName] : null;
        }

    The BusinessObject has a dictionary _data which stores the field values. Assume the fieldName is BusinessDriver.Description and the BusinessObject that has this field is StrategyBusinessDriver. This code breaks the field name into two parts - BusinessDriver and Description. The first iteration reads the BusinessDriver object from StrategyBusinessDriver. It is cast to the BusinessObject type so that I can call GetFieldValue on it again to read the next field, i.e. Description on BusinessDriver. The problem is that when I read BusinessDriver in the first iteration and cast it, I get the IDs and all the other details of the BusinessObject, but the field dictionary _data and the other collections are not fetched. They should be fetched lazily when I read the _data of the BusinessObject; however, this does not happen and I get an error that _data is null. Is there something coded wrongly because of which the collection is not fetched lazily? Please ask for more clarification if needed. Thanks in advance. UPDATE: It works when I don't use lazy loading.

    Read the article

  • MonoRail: Testing, Route Extensions, Folder Structures

    - by Kezzer
    I've got a few questions related to the use of MonoRail.

    Testing: Does everyone tend to use NUnit for their testing? I haven't worked enough with testing to know whether it's a good framework to use; I'm looking to test my applications a lot more than before and wanted to know if there are any general guidelines. Are you supposed to copy the controller over to a test area, rename it with "test" in the name, and re-run it? How do you ensure your test project and main project coincide with one another? Is it just a case of copying everything over again, or are there tools available to do it for you?

    Route extensions: MonoRail tends to use <action>.rails; can you omit the .rails part if you configure your routing correctly? Why does this seem to be the standard?

    Folder structures: I haven't found anywhere that really spells out a standard folder structure. Sure, you have Controllers, Models, and Views, but your Models folder should contain your data access objects as well. I've seen some projects use something like Models -> DaoClasses -> Entities. But what about custom structures used to get data out of views? And if you're using NHibernate, where's a good place to put the mappings? I know it's entirely dependent on the developer, but I haven't really seen any standard approach. Cheers

    Read the article

< Previous Page | 126 127 128 129 130 131 132 133 134 135 136 137  | Next Page >