Search Results

Search found 14448 results on 578 pages for 'schema org'.

Page 401/578 | < Previous Page | 397 398 399 400 401 402 403 404 405 406 407 408  | Next Page >

  • Hierarchical/Nested Database Structure for Comments

    - by Stephen Melrose
    Hi, I'm trying to figure out the best approach for a database schema for comments. The problem I'm having is that the comments system will need to allow nested/hierarchical comments, and I'm not sure how to design this properly. My requirements are:

    - Comments can be made on comments, so I need to store the tree hierarchy.
    - I need to be able to query the comments in tree-hierarchy order, but efficiently, preferably in a single fast query, though I don't know if this is possible.
    - I'd need to make some weird queries, e.g. pull out the latest 5 root comments, and a maximum of 3 children for each one of those.

    I read an article on the MySQL website on this very subject, http://dev.mysql.com/tech-resources/articles/hierarchical-data.html. The "Nested Set Model" in theory sounds like it will do what I need, except I'm worried about querying the thing, and also about inserting. If this is the right approach, how would I do my third requirement above? And if I have 2000 comments and I add a new sub-comment on the first comment, that will be a LOT of updating to do, which doesn't seem right to me. Or is there a better approach for the type of data I'm wanting to store and query? Thank you
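    For reference, the simpler adjacency-list alternative also covers these requirements when the tree is only read a level or two deep. This is only a sketch, assuming MySQL 8.0+ (for CTEs and window functions) and made-up table and column names (comment, parent_id):

    CREATE TABLE comment (
        id         INT AUTO_INCREMENT PRIMARY KEY,
        parent_id  INT NULL,                  -- NULL marks a root comment
        body       TEXT NOT NULL,
        created_at DATETIME NOT NULL,
        FOREIGN KEY (parent_id) REFERENCES comment (id)
    );

    -- Latest 5 root comments plus at most 3 newest children of each
    WITH roots AS (
        SELECT id, parent_id, body, created_at
        FROM comment
        WHERE parent_id IS NULL
        ORDER BY created_at DESC
        LIMIT 5
    ),
    children AS (
        SELECT c.id, c.parent_id, c.body, c.created_at,
               ROW_NUMBER() OVER (PARTITION BY c.parent_id
                                  ORDER BY c.created_at DESC) AS rn
        FROM comment c
        JOIN roots r ON c.parent_id = r.id
    ),
    combined AS (
        SELECT id, parent_id, body, created_at FROM roots
        UNION ALL
        SELECT id, parent_id, body, created_at FROM children WHERE rn <= 3
    )
    SELECT id, parent_id, body, created_at
    FROM combined
    ORDER BY COALESCE(parent_id, id) DESC,   -- group each root with its children
             (parent_id IS NOT NULL),        -- root first within each group
             created_at DESC;

    With this layout, adding a sub-comment is a single INSERT regardless of table size; the trade-off against the nested set model in the linked article is that arbitrarily deep traversal needs recursive or repeated joins.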

    Read the article

  • Parameter cannot be null error after successful save

    - by tigermain
    I am getting the following NHibernate error when saving an entity (via: NHibernateSession.Save(entity);), despite it being persisted to the database fine: "Value cannot be null.\r\nParameter name: id". This is my hbm file:

    <?xml version="1.0" encoding="utf-8" ?>
    <hibernate-mapping xmlns="urn:nhibernate-mapping-2.2" namespace="JeanieMaster.Domain.Entities" assembly="JeanieMaster.Domain">
      <class name="ActionLog" table="ActionLog" schema="[DBSVR1].[mydatabase].[dbo]" select-before-update="false" optimistic-lock="none">
        <id name="Id" column="ActionLogId" type="Int32">
          <generator class="identity"/>
        </id>
        <property name="ActionId" type="Int32"/>
        <many-to-one name="User" class="JeanieUser" column="UserId" />
        <many-to-one name="ApplicationProvider" class="ApplicationProvider" column="ApplicationProviderId" />
        <many-to-one name="ContentProvider" class="ContentProvider" column="ContentProviderId" />
        <many-to-one name="SearchLog" class="SearchLog" column="SearchLogId" />
        <property name="Data" type="string"/>
        <property name="DateCreated" type="DateTime"/>
        <property name="ActionDuration" type="Double"/>
      </class>
    </hibernate-mapping>

    Read the article

  • How to export Oracle statistics

    - by A_M
    Hi, I am writing some new SQL queries and want to check the query plans that the Oracle query optimiser would come up with in production. My development database doesn't have anything like the data volumes of the production database. How can I export database statistics from a production database and re-import them into a development database? I don't have access to the production database, so I can't simply generate explain plans on production without going through a third party hosting organisation. This is painful. So I want a local database which is in some way representative of production on which I can try out different things. Also, this is for a legacy application. I'd like to "improve" the schema by adding appropriate indexes, constraints, etc. I need to do this in my development database first, before rolling out to test and production. If I add an index and re-generate statistics in development, then the statistics will be generated around the development data volumes, which makes it difficult to assess the impact of my changes on production. Does anyone have any tips on how to deal with this? Or is it just a case of fixing unexpected behaviour once we've discovered it on production? I do have a staging database with production volumes, but again I have to go through a third party to run queries against this, which is painful. So I'm looking for ways to cut out the middle man as much as possible. All this is using Oracle 9i. Thanks.
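    The usual route for this is DBMS_STATS, which is available in 9i. A sketch, assuming a schema called APP_OWNER and a statistics staging table called PROD_STATS (both names invented); the hosting organisation would run the production half and hand over the export file:

    -- On production:
    BEGIN
      DBMS_STATS.CREATE_STAT_TABLE(ownname => 'APP_OWNER', stattab => 'PROD_STATS');
      DBMS_STATS.EXPORT_SCHEMA_STATS(ownname => 'APP_OWNER', stattab => 'PROD_STATS');
    END;
    /
    -- Ship the APP_OWNER.PROD_STATS table across, e.g. with the 9i exp/imp utilities.

    -- On development, after importing the PROD_STATS table:
    BEGIN
      DBMS_STATS.IMPORT_SCHEMA_STATS(ownname => 'APP_OWNER', stattab => 'PROD_STATS');
    END;
    /

    Once the production statistics sit in the development data dictionary, EXPLAIN PLAN locally should approximate the plans the optimiser would pick in production, even though the underlying data volumes differ.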

    Read the article

  • MSSQL in ASP.NET application getting unstable after a certain period

    - by Barslett
    Hello, I have an ASP.NET 2.0 application that I made to keep track of disruption reports about our public transport system. The architecture is pretty straightforward: an MSSQL Express 2008 database, ADO.NET and a DataSet/DAL with a few methods to access the database. There is a set of .aspx pages for the UI in use by our dispatchers and on our public website, as well as a set of SOAP and REST webservices and an RSS feed. Everything worked just fine for more than a year, until two weeks ago. Now and then, it seems as if the database enters an unstable mode, and the application's responses are sometimes right, sometimes wrong. The typical error is that, apparently, an empty DataTable is returned to the public disruption overview or to the RSS generator. Thus, the user gets e.g. an empty GridView, but no exception is thrown AFAIK, and nothing is written to the Windows Application log. After a restart of the MSSQL Express service, the situation is back to normal. It has to be said that the situation first appeared a few days after I made a new minor upgrade of the application. The RSS generator was slightly rewritten, and I added a WCF REST service. But the DAL and the database schema were untouched... Any hints on how we could keep the database stable? It is a bit annoying not to have a clue ;-)

    Read the article

  • Ordering by a max or a min from another table

    - by Paul Tomblin
    I have a table that consists of a unique id, and a few other attributes. It holds "schedules". Then I have another table that holds a list of all the times each schedule has or will "fire". This isn't the exact schema, but it's close:

    create table schedule (
        id varchar(40) primary key,
        attr1 int,
        attr2 varchar(20)
    );

    create table schedule_times (
        id varchar(40) foreign key schedule(id),
        fire_date date
    );

    I want to query the schedule table, getting the attributes and the next and previous fire_dates, in Java, sometimes ordering on one of the attributes, but sometimes ordering on either the previous fire_date or the next fire_date. Ordering by the attributes is easy; I just stick an "order by" into the string while I'm building my prepared statement. I'm not even sure how to go about selecting the last fire_date and the next one in a single query - I know that I can find the next fire_date for a given id by doing a SELECT min(fire_date) FROM schedule_times WHERE id = ? AND fire_date > sysdate; and the similar thing for the previous fire_date using max() and fire_date < sysdate. I'm just drawing a blank on how to incorporate that into a single select from the schedule so I can get both next and previous fire_date in one shot, and also how to order by either of those attributes.
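    One way to get both dates in a single statement, sketched against the columns above (Oracle syntax, given the use of sysdate), is a pair of correlated scalar subqueries; their aliases can then be targeted by the dynamically built ORDER BY:

    SELECT s.id,
           s.attr1,
           s.attr2,
           (SELECT MAX(st.fire_date)            -- most recent past firing
              FROM schedule_times st
             WHERE st.id = s.id
               AND st.fire_date < SYSDATE) AS prev_fire_date,
           (SELECT MIN(st.fire_date)            -- next upcoming firing
              FROM schedule_times st
             WHERE st.id = s.id
               AND st.fire_date > SYSDATE) AS next_fire_date
      FROM schedule s
     ORDER BY next_fire_date;                   -- or prev_fire_date, or s.attr1

    An outer join to schedule_times with conditional MIN/MAX and a GROUP BY is an equivalent formulation that can perform better on large tables; either way the query stays a single statement with sortable aliases.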

    Read the article

  • Saving Abstract and Sub classes to database

    - by bretddog
    Hi, I have an abstract class "StrategyBase", and a set of sub classes, StrategyA/B/C etc. The sub classes use some of the properties of the base class, and have some individual properties. My question is how to save this to a database. I'm currently using SqlCE, and Linq-To-Sql by creating entity classes automatically with SqlMetal.exe. I've seen there are three solutions shown in this question, but I'm not able to see how these solutions will or won't work with SqlMetal/entity classes. Though it seems to me the "concrete table inheritance" would probably work without any manual modifying. What about the other two, would they be problematic? For "Single Table Inheritance", wouldn't all classes get all variables, even though they don't need them? And for the "Class table inheritance" solution I can't really see at all how that will map into the entity classes for a useful purpose. I may note that I extend these partial entity classes to make my business object classes. I may also consider moving to EntityFramework instead of SqlMetal/Linq2Sql, so it would also be nice to know if that makes any difference to which schema is easy to implement. One likely important thing to note is that I will constantly be developing new strategies, which makes me have to modify the program code, and probably the database schema, when adding a new strategy. Sorry the question is a bit "all over the place", but hopefully there are some clear advantages/disadvantages here that you may be able to advise on. Cheers!
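    For orientation, here is roughly what those three schemas look like in plain DDL (SQL Server-style, with invented property names), which is what SqlMetal or the Entity Framework designer would ultimately have to map:

    -- Single table inheritance: one wide table, a discriminator, nullable subclass columns
    CREATE TABLE Strategy (
        Id           INT IDENTITY PRIMARY KEY,
        StrategyType NVARCHAR(20) NOT NULL,    -- discriminator: 'A', 'B', 'C', ...
        SharedProp   INT NOT NULL,
        AOnlyProp    INT NULL,                 -- only populated for StrategyA rows
        BOnlyProp    NVARCHAR(50) NULL         -- only populated for StrategyB rows
    );

    -- Class table inheritance: shared columns once, one joined table per subclass
    CREATE TABLE StrategyBase (
        Id         INT IDENTITY PRIMARY KEY,
        SharedProp INT NOT NULL
    );
    CREATE TABLE StrategyA (
        Id        INT PRIMARY KEY REFERENCES StrategyBase (Id),
        AOnlyProp INT NOT NULL
    );

    -- Concrete table inheritance: each subclass gets its own self-contained table
    CREATE TABLE StrategyA_Concrete (
        Id         INT IDENTITY PRIMARY KEY,
        SharedProp INT NOT NULL,
        AOnlyProp  INT NOT NULL
    );

    Which one hurts least when new strategies keep appearing is part of the question: single table only needs new nullable columns per strategy, while the other two need a new table (and a new entity class) each time.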

    Read the article

  • Implementing a 1 to many relationship with SQLite

    - by Patrick
    I have the following schema implemented successfully in my application. The application connects desk unit channels to IO unit channels. The DeskUnits and IOUnits tables are basically just a list of desk/IO units and the number of channels on each. For example a desk could be 4 or 12 channel. CREATE TABLE DeskUnits (Name TEXT, NumChannels NUMERIC); CREATE TABLE IOUnits (Name TEXT, NumChannels NUMERIC); CREATE TABLE RoutingTable (DeskUnitName TEXT, DeskUnitChannel NUMERIC, IOUnitName TEXT, IOUnitChannel NUMERIC); The RoutingTable 'table' then connects each DeskUnit channel to an IOUnit channel. For example the DeskUnit called "Desk1" channel 1 may route to IOunit name "IOUnit1" channel 2, etc. So far I hope this is pretty straightforward and understandable. The problem is, however, this is a strictly 1 to 1 relationship. Any DeskUnit channel can route to only 1 IOUnit channel. Now, I need to implement a 1 to many relationship. Where any DeskUnit channel can connect to multiple IOUnit channels. I realise I may have to rearrange the tables completely, but I am not sure the best way to go about this. I am fairly new to SQLite and databases in general so any help would be appreciated. Thanks Patrick
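    One possible restructuring, shown only as a sketch with invented id columns (SQLite syntax): give the units surrogate keys and treat RoutingTable as a pure junction table, so the same desk channel can appear in any number of routing rows:

    CREATE TABLE DeskUnits (
        DeskUnitId  INTEGER PRIMARY KEY,
        Name        TEXT NOT NULL,
        NumChannels NUMERIC NOT NULL
    );

    CREATE TABLE IOUnits (
        IOUnitId    INTEGER PRIMARY KEY,
        Name        TEXT NOT NULL,
        NumChannels NUMERIC NOT NULL
    );

    -- One row per connection: a desk channel may occur in many rows (1-to-many)
    CREATE TABLE RoutingTable (
        DeskUnitId      INTEGER NOT NULL REFERENCES DeskUnits (DeskUnitId),
        DeskUnitChannel NUMERIC NOT NULL,
        IOUnitId        INTEGER NOT NULL REFERENCES IOUnits (IOUnitId),
        IOUnitChannel   NUMERIC NOT NULL,
        UNIQUE (DeskUnitId, DeskUnitChannel, IOUnitId, IOUnitChannel)
    );

    -- All IO channels that desk unit 1, channel 1 routes to:
    SELECT i.Name, r.IOUnitChannel
    FROM RoutingTable r
    JOIN IOUnits i ON i.IOUnitId = r.IOUnitId
    WHERE r.DeskUnitId = 1 AND r.DeskUnitChannel = 1;

    (Remember PRAGMA foreign_keys = ON if the REFERENCES clauses should actually be enforced by SQLite.)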

    Read the article

  • database setup for web application

    - by vbNewbie
    I have an application that requires a database, and I have already set up tables but am not sure they match the requirements of the app. The app is a crawler which fetches web URLs, then crawls and stores appropriate URLs and posts, all based on client requests, which are stored as projects. For each URL stored there is one post, each client has many projects, and each project has many types of requests. So we get a client with a request, assign them a project name, and then use the request to search for content and store the URL and post. A request could already exist and should not be duplicated, but should be associated with the right client, project, post, etc. Here is my schema now:

    - url table: urlId PK, queryId FK, url
    - post table: postId PK, urlId FK, post, date
    - request table: queryId PK, request
    - client table: clientId PK, clientName, projectId FK
    - project table: projectId PK, queryId FK, project

    Does this look right, or does anyone have suggestions? Of course my stored procedures and insert statements will have to be in depth.
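    Purely as a sketch of one alternative shape (MySQL-style DDL, all column names invented): since a project belongs to a client and a request is used by a project, the foreign keys can point down the chain, with a junction table so the same request text is stored once but reused across projects:

    CREATE TABLE client (
        clientId   INT AUTO_INCREMENT PRIMARY KEY,
        clientName VARCHAR(255) NOT NULL
    );

    CREATE TABLE project (
        projectId INT AUTO_INCREMENT PRIMARY KEY,
        clientId  INT NOT NULL,
        project   VARCHAR(255) NOT NULL,
        FOREIGN KEY (clientId) REFERENCES client (clientId)
    );

    CREATE TABLE request (
        queryId INT AUTO_INCREMENT PRIMARY KEY,
        request VARCHAR(255) NOT NULL,
        UNIQUE KEY uq_request (request)          -- no duplicated request text
    );

    -- which projects use which requests (a request can be reused across projects)
    CREATE TABLE project_request (
        projectId INT NOT NULL,
        queryId   INT NOT NULL,
        PRIMARY KEY (projectId, queryId),
        FOREIGN KEY (projectId) REFERENCES project (projectId),
        FOREIGN KEY (queryId)   REFERENCES request (queryId)
    );

    CREATE TABLE url (
        urlId   INT AUTO_INCREMENT PRIMARY KEY,
        queryId INT NOT NULL,
        url     VARCHAR(2048) NOT NULL,
        FOREIGN KEY (queryId) REFERENCES request (queryId)
    );

    CREATE TABLE post (
        postId   INT AUTO_INCREMENT PRIMARY KEY,
        urlId    INT NOT NULL UNIQUE,            -- one post per stored URL
        post     TEXT,
        postDate DATETIME,
        FOREIGN KEY (urlId) REFERENCES url (urlId)
    );

    Whether a request really needs to be shared across projects depends on the business rule; if not, the junction table collapses into a projectId column on request.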

    Read the article

  • Performance impact when using XML columns in a table with MS SQL 2008

    - by Sam Dahan
    I am using a simple table with 6 columns, 3 of which are of XML type, not schema-constrained. When the table reaches a size around 120,000 or 150,000 rows, I see a dramatic performance cost in doing any query on the table. For comparison, I have another table, which grows in size at about the same rate, but only contains scalar types (int, datetime, a few float columns). That table performs perfectly fine even after 200,000 rows. And by the way, I am not using XQuery on the XML columns; I am only using regular SQL query statements. Some specifics: both tables contain a DateTime field called SampleTime, and a statement like (it's in a stored procedure, but I show you the actual statement) SELECT MAX(sampleTime) SampleTime FROM dbo.MyRecords WHERE PlacementID=@somenumber takes 0 seconds on the table without XML columns, and anything from 13 to 20 seconds on the table with XML columns. That depends on which drive I set my database on. At the moment it sits on a different spindle (not C:) and it takes 13 seconds. Has anyone seen this behavior before, or have any hint at what I am doing wrong? I tried this with SQL Server 2008 Express and the full-blown SQL Server 2008; that made no difference. Oh, one last detail: I am doing this from a C# application, .NET 3.5, using SqlConnection, SqlReader, etc. I'd appreciate some insight into that, thanks! Sam
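    Not a diagnosis, but the usual first check for this pattern, sketched with the names from the question: give the query a narrow covering index so the MAX can be answered without touching the rows (and therefore without dragging the large XML values through the buffer pool):

    -- Narrow nonclustered index: MAX(SampleTime) per PlacementID becomes an index seek
    CREATE NONCLUSTERED INDEX IX_MyRecords_Placement_SampleTime
        ON dbo.MyRecords (PlacementID, SampleTime DESC);

    DECLARE @somenumber int = 42;   -- example value
    SELECT MAX(SampleTime) AS SampleTime
    FROM dbo.MyRecords
    WHERE PlacementID = @somenumber;

    If the plan still scans, comparing the actual execution plans of the two tables is the next step; large XML/LOB columns mostly hurt when the plan has to read whole rows.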

    Read the article

  • How to use Nhibernate Validator + NHib component + ddl

    - by mynkow
    I just configured my NHibValidator. My NHibernate creates the DB schema. When I set a max length of "20" on some property of a class, the length appears on the corresponding database column. I am doing this in the NHibValidator xml file. But the problem is that I have components and cannot figure out how to achieve this behaviour. The component is configured correctly in the Customer.hbm.xml file. EDIT: Well, I found that Hibernate Validator users had the same problem two years ago: http://opensource.atlassian.com/projects/hibernate/browse/HV-25. Is this an issue for NHibernate Validator, or is it fixed? If it is working, please tell me how.

    public class Customer
    {
        public virtual string Name { get; set; }
        public virtual Contact Contacts { get; }
    }

    public class Contact
    {
        public virtual string Address { get; set; }
    }

    <?xml version="1.0" encoding="utf-8" ?>
    <nhv-mapping xmlns="urn:nhibernate-validator-1.0" namespace="MyNamespace" assembly="MyAssembly">
      <class name="Customer">
        <property name="Name">
          <length max="20"/>
        </property>
        <property name="Contacts">
          <notNull/>
          <valid/>
        </property>
      </class>
    </nhv-mapping>

    <?xml version="1.0" encoding="utf-8" ?>
    <nhv-mapping xmlns="urn:nhibernate-validator-1.0" namespace="MyNamespace" assembly="MyAssembly">
      <class name="Contact">
        <property name="Address">
          <length max="50"/>
          <valid/>
        </property>
      </class>
    </nhv-mapping>

    Read the article

  • Getting weird issue with TO_NUMBER function in Oracle

    - by Fazal
    I have been getting an intermittent issue when executing the to_number function in the where clause on a varchar2 column if the number of records exceeds a certain number n. I used n as there is no exact number of records at which it happens. On one DB it happens after n was 1 million, on another when it was 0.1 million. E.g. I have a table with 10 million records, say table Country, which has a field1 varchar2 column containing numeric data, and an Id. If I do a query such as

    select * from country where to_number(field1) = 23 and id > 1 and id < 100000

    this works. But if I do the query

    select * from country where to_number(field1) = 23 and id > 1 and id < 100001

    it fails saying invalid number. Next I try the query

    select * from country where to_number(field1) = 23 and id > 2 and id < 100001

    and it works again. As I only got "invalid number" it was confusing, but in the log file it said:

    Memory Notification: Library Cache Object loaded into SGA
    Heap size 3823K exceeds notification threshold (2048K)
    KGL object name :with sqlplan as ( select c006 object_owner, c007 object_type,c008 object_name from htmldb_collections where COLLECTION_NAME='HTMLDB_QUERY_PLAN' and c007 in ('TABLE','INDEX','MATERIALIZED VIEW','INDEX (UNIQUE)')), ws_schemas as( select schema from wwv_flow_company_schemas where security_group_id = :flow_security_group_id), t as( select s.object_owner table_owner,s.object_name table_name, d.OBJECT_ID from sqlplan s,sys.dba_objects d

    It seems related to the SGA size, but Google did not give me much help on this. Does anyone have any idea about this issue with TO_NUMBER or Oracle functions on large data?
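    For what it's worth, the classic explanation for this symptom is that some row inside the larger id range has a field1 value that is not numeric, and Oracle is free to evaluate the to_number predicate before the id filter, so widening the range by a single row suddenly exposes the bad value. A defensive sketch (Oracle syntax, digits-only values assumed):

    -- Only apply TO_NUMBER to values that are purely digits, so the predicate
    -- cannot fail no matter which rows the optimizer evaluates first
    SELECT *
      FROM country
     WHERE id > 1
       AND id < 100001
       AND CASE WHEN TRANSLATE(field1, 'x0123456789', 'x') IS NULL
                THEN TO_NUMBER(field1)
           END = 23;

    Finding the offending rows is the same trick inverted: select the rows where the TRANSLATE expression is not null.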

    Read the article

  • Fluent NHibernate - Delete a related object when no explicit relationship exists in the model

    - by John Price
    I've got an application that keeps track of (for the sake of an example) what drinks are available at a given restaurant. My domain model looks like:

    class Restaurant
    {
        public IEnumerable<RestaurantDrink> GetRestaurantDrinks() { ... }
        //other various properties
    }

    class RestaurantDrink
    {
        public Restaurant Restaurant { get; set; }
        public Drink Drink { get; set; }
        public string DrinkVariety { get; set; } //"fountain drink", "bottled", etc.
        //other various properties
    }

    class Drink
    {
        public string Name { get; set; }
        public string Manufacturer { get; set; }
        //other various properties
    }

    My db schema is (I hope) about what you'd expect; "RestaurantDrinks" is essentially a mapping table between Restaurants and Drinks with some extra properties (like "DrinkVariety") tacked on. Using Fluent NHibernate to set up mappings, I've set up a "HasMany" relationship from Restaurants to RestaurantDrinks that causes the latter to be deleted when its parent Restaurant is deleted. My question is, given that "Drink" does not have any property on it that explicitly references RestaurantDrinks (the relationship only exists in the underlying database), can I set up a mapping that will cause RestaurantDrinks to be deleted if their associated Drink is deleted?

    Read the article

  • Alternatives to the Entity Framework for Serving/Consuming an OData Interface

    - by Egahn
    I'm researching how to set up an OData interface to our database. I would like to be able to pull/query data from our DB into Excel, as a start. Eventually I would like to have Excel run queries and pull data over HTTP from a remote client, including authentication, etc. I've set up a working (rickety) prototype so far, using the ADO.NET Entity Data Model wizard in Visual Studio, and VSTO to create a test Excel worksheet with a button to pull from that ADO.NET interface. This works OK so far, and I can query the DB using Linq through the entities/objects that are created by the ADO.NET EDM wizard. However, I have started to run into some problems with this approach. I've been finding the Entity Framework difficult to work with (and in fact, also difficult to research solutions to, as there's a lot of chaff out there regarding it and older versions of it). An example of this is my being unable to figure out how to set the SQL command timeout (as opposed to the HTTP request timeout) on the DataServiceContext object that the wizard generates for my schema, but that's not the point of my question. The real question I have is, if I want to use OData as my interface standard, am I stuck with the Entity Framework? Are there any other solutions out there (preferably open source) which can set up, serve and consume an OData interface, and are easier to work with and less bloated than the Entity Framework? I have seen mention of NHibernate as an alternative, but most of the comparison threads I've seen are a few years old. Are there any other alternatives out there now? Thanks very much!

    Read the article

  • ALTER dilemma: how to use ALTER to set primary key and other attributes

    - by Rachel
    I have the following table in the database and I need to alter it to the schema mentioned below. Initially I was dropping the current database and creating a new one using CREATE, but I am not supposed to do that; I should use ALTER, and I am not sure how I can use ALTER to add a primary key and other constraints. Any suggestions?

    Current:

    CREATE TABLE `details` ( `KEY` varchar(255) NOT NULL, `ID` bigint(20) NOT NULL, `CODE` varchar(255) NOT NULL, `C_ID` bigint(20) NOT NULL, `C_CODE` varchar(64) NOT NULL, `CCODE` varchar(255) NOT NULL, `TCODE` varchar(255) NOT NULL, `LCODE` varchar(255) NOT NULL, `CAMCODE` varchar(255) NOT NULL, `OFCODE` varchar(255) NOT NULL, `OFNAME` varchar(255) NOT NULL, `PRIORITY` bigint(20) NOT NULL, `STDATE` datetime NOT NULL, `ENDATE` datetime NOT NULL, `INT` varchar(255) NOT NULL, `PHONE` varchar(255) NOT NULL, `TV` varchar(255) NOT NULL, `MTV` varchar(255) NOT NULL, `TYPE` varchar(255) NOT NULL, `CREATED` datetime NOT NULL, `MAIN` varchar(255) NOT NULL ) ENGINE=MyISAM DEFAULT CHARSET=latin1;

    Desired:

    CREATE TABLE `details` ( `id` bigint(20) NOT NULL, `code` varchar(255) NOT NULL, `cid` bigint(20) NOT NULL, `ccode` varchar(64) NOT NULL, `c_code` varchar(255) NOT NULL, `tcode` varchar(255) NOT NULL, `lcode` varchar(255) NOT NULL, `camcode` varchar(255) NOT NULL, `ofcode` varchar(255) NOT NULL, `ofname` varchar(255) NOT NULL, `priority` bigint(20) NOT NULL, `stdate` datetime NOT NULL, `enddate` datetime NOT NULL, `list` varchar(255) NOT NULL, `name` varchar(255) NOT NULL, `created` datetime NOT NULL, `date` datetime NOT NULL, `ofshn` int(20) NOT NULL, `ofcl` int(20) NOT NULL, `ofr` int(20) NOT NULL, PRIMARY KEY (`code`,`ccode`,`list`) ) ENGINE=MyISAM DEFAULT CHARSET=latin1;

    Thanks!
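    A sketch of the main ALTER TABLE patterns involved (MySQL syntax; only a few representative columns shown, not the full migration):

    -- Drop the columns that disappear in the desired schema
    ALTER TABLE `details`
        DROP COLUMN `KEY`,
        DROP COLUMN `PHONE`;                  -- ...and the other removed columns

    -- Rename (and re-type) columns with CHANGE: old name, new name, new definition
    ALTER TABLE `details`
        CHANGE `ID` `id` bigint(20) NOT NULL,
        CHANGE `C_ID` `cid` bigint(20) NOT NULL,
        CHANGE `ENDATE` `enddate` datetime NOT NULL;

    -- Add the brand new columns
    ALTER TABLE `details`
        ADD COLUMN `list` varchar(255) NOT NULL,
        ADD COLUMN `ofshn` int(20) NOT NULL;

    -- Finally add the composite primary key
    ALTER TABLE `details`
        ADD PRIMARY KEY (`code`, `ccode`, `list`);

    The primary key has to come after the `list` column exists, and MySQL will reject it if the existing rows contain duplicate (`code`, `ccode`, `list`) combinations.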

    Read the article

  • How to check if two records have a self-referencing relation?

    - by Machine
    Consider the following schema with users and their collegues (friends).

    Users:

    User:
      columns:
        user_id:
          name: user_id as userId
          type: integer(8)
          unsigned: 1
          primary: true
          autoincrement: true
        first_name:
          name: first_name as firstName
          type: string(45)
          notnull: true
        last_name:
          name: last_name as lastName
          type: string(45)
          notnull: true
        email:
          type: string(45)
          notnull: true
          unique: true
      relations:
        Collegues:
          class: User
          local: invitor
          foreign: invitee
          refClass: CollegueStatus
          equal: true
          onDelete: CASCADE
          onUpdate: CASCADE

    Join table:

    CollegueStatus:
      columns:
        invitor:
          type: integer(8)
          unsigned: 1
          primary: true
        invitee:
          type: integer(8)
          unsigned: 1
          primary: true
        status:
          type: enum(8)
          values: [pending, accepted, denied]
          default: pending
          notnull: true

    Now, let's say I have two records, one for the user making the HTTP request (the logged-in user), and one record for a user he wants to send a message to. I want to check if these users are collegues. Questions:

    - Does Doctrine have any pre-built functionality to check if two records with self-relations are related?
    - If not, how would you write a method to check this?
    - Where would you put said method? (In the User class, the UserTable class, etc.)

    I could probably do something like this:

    public function (User $user1, User $user2)
    {
        // Ensure we load collegues if $user1 was fetched with DQL that
        // doesn't load this relation
        $collegues = $user1->get('Collegues');
        $areCollegues = false;
        foreach ($collegues as $collegue) {
            if ($collegue['userId'] === $user2['userId']) {
                $areCollegues = true;
                break;
            }
        }
        return $areCollegues;
    }

    But this looks neither efficient nor pretty. I just feel that this should already be solved for self-referencing relations to be nice to use.
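    As a point of comparison, the check itself is a single lookup against the join table in SQL (a sketch, assuming Doctrine's default table name collegue_status and user ids 1 and 2); since the relation can be stored in either direction, both orderings are checked:

    SELECT COUNT(*)
    FROM collegue_status
    WHERE (invitor = 1 AND invitee = 2)
       OR (invitor = 2 AND invitee = 1);
    -- add "AND status = 'accepted'" if pending/denied invitations should not count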

    Read the article

  • how to generate unique numbers less than 8 characters long.

    - by loudiyimo
    Hi, I want to generate unique ids every time I call the method generateCustumerId(). The generated id must be at most 8 characters long. This requirement is necessary because I need to store it in a data file, and the schema fixes this id at 8 characters. Option 1 works fine. Instead of option 1, I want to use UUID. The problem is that UUID generates an id which has too many characters. Does someone know how to generate a unique id which is less than 99999999?

    Option 1:

    import java.util.HashSet;
    import java.util.Random;
    import java.util.Set;

    public class CustomerIdGenerator {
        private static Set<String> customerIds = new HashSet<String>();
        private static Random random = new Random();

        // XXX: replace with java.util.UUID
        public static String generateCustumerId() {
            String customerId = null;
            while (customerId == null || customerIds.contains(customerId)) {
                customerId = String.valueOf(random.nextInt(89999999) + 10000000);
            }
            customerIds.add(customerId);
            return customerId;
        }
    }

    Option 2 generates a unique id which is too long:

    public static String generateCustumerId() {
        String ownerId = UUID.randomUUID().toString();
        System.out.println("ownerId " + ownerId);
        return ownerId;
    }

    Read the article

  • How to determine one to many column name from entity type

    - by snicker
    I need a way to determine the name of the column used by NHibernate to join one-to-many collections from the collected entity's type. I need to be able to determine this at runtime. Here is an example. I have some entities:

    namespace Entities
    {
        public class Stable
        {
            public virtual int Id { get; set; }
            public virtual string StableName { get; set; }
            public virtual IList<Pony> Ponies { get; set; }
        }

        public class Dude
        {
            public virtual int Id { get; set; }
            public virtual string DudesName { get; set; }
            public virtual IList<Pony> PoniesThatBelongToDude { get; set; }
        }

        public class Pony
        {
            public virtual int Id { get; set; }
            public virtual string Name { get; set; }
            public virtual string Color { get; set; }
        }
    }

    I am using NHibernate to generate the database schema, which comes out looking like this:

    create table "Stable" (Id integer, StableName TEXT, primary key (Id))
    create table "Dude" (Id integer, DudesName TEXT, primary key (Id))
    create table "Pony" (Id integer, Name TEXT, Color TEXT, Stable_id INTEGER, Dude_id INTEGER, primary key (Id))

    Given that I have a Pony entity in my code, I need to be able to find out:

    A. Does Pony even belong to a collection in the mapping?
    B. If it does, what are the column names in the database table that pertain to collections?

    In the above instance, I would like to see that Pony has two collection columns, Stable_id and Dude_id.

    Read the article

  • Problem using SQLDataReader with Sybase ASE

    - by John K.
    We're developing a reporting application that uses asp.net-mvc (.net 4). We connect through DDTEK.Sybase middleware to a Sybase ASE 12.5 database. We're having a problem pulling data into a DataReader (from a stored procedure). The stored procedure computes values (approximately 50 columns) by doing sums, counts, and calling other stored procedures. The problem we're experiencing is... certain columns (maybe 5% of them) come back with NULL or 0. If we debug and copy the SQL statement being used for the DataReader and run it inside another SQL tool, we get all valid values for all columns.

    conn = new SybaseConnection
    {
        ConnectionString = ConfigurationManager.ConnectionStrings[ConnectStringName].ToString()
    };
    conn.Open();
    cmd = new SybaseCommand
    {
        CommandTimeout = cmdTimeout,
        Connection = conn,
        CommandText = mainSql
    };
    reader = cmd.ExecuteReader();
    // AT THIS POINT, IMMEDIATELY AFTER THE EXECUTEREADER COMMAND,
    // THE READER CONTAINS THE BAD (NULL OR 0) DATA FOR THESE COLUMNS.
    DataTable schemaTable = reader.GetSchemaTable();
    // AT THIS POINT WE CAN VIEW THE DATATABLE FOR THE SCHEMA AND IT APPEARS CORRECT.
    // THE COLUMNS THAT DON'T WORK HAVE SPECIFICATIONS IDENTICAL TO THE COLUMNS THAT DO WORK.

    Has anyone had problems like this using Sybase and ADO? Thanks, John K.

    Read the article

  • How can I have sub-elements of a complex/mixed type with unrestricted order and count?

    - by mbmcavoy
    I am working with XML where some elements will contain text with additional markup. This is similar to this example at W3Schools. However, I need the markup tags to be able to appear in any order and possibly more than once. To modify their example for illustration: <letter> Dear Mr.<name>John Smith</name>. Your order <orderid>1032</orderid> will be shipped on <shipdate>2001-07-13</shipdate>. Thank you, <name>Bob Adams</name> </letter> None of the options presented by W3Schools (on the page following the linked example) allow this XML due to the second <name> element. Their explanation of the "indicators" and my testing are consistent. <xs:sequence> - violates the element order <xs:choice> - more than one kind of element is used. <xs:all> - maxOccurs is restricted to "1". This seems like it should be basic, after all, XHTML allows such things. How do I define my schema to allow this?

    Read the article

  • advanced winform framework

    - by alfredo dobrekk
    Hi, I'm starting a new project that would basically take input from the user and save it to the database, across about 30 screens, and I would like to find a framework that will allow the maximum number of these features out of the box:

    - .NET C#
    - Windows Forms
    - unit testing
    - continuous integration
    - screens with lists, combo boxes, text boxes, add, delete, save, cancel that are easy to update when you add a property to your classes or a field to your database
    - auto completion on controls to help the user find their way
    - use of an ORM like NHibernate
    - easy multithreading and display of wait screens for the user
    - easy undo/redo
    - tabbed child windows
    - search forms
    - ability to grant access to some functionalities according to user profiles
    - MVP/MVVM or whatever design patterns
    - either some code generation from database to C# classes, or generation of the database schema from C# classes
    - some kind of database versioning / upgrade to easily update the database when I release patches to the application once in production
    - code metrics analysis
    - some code generator I can use against my entities that would generate some rough form I can rearrange afterwards
    - code documentation generator
    - ...

    Any ideas? I know it's a lot, but I really would like to use existing code to build upon so I can focus on business rules. Do you have any suggestions to add to the list before starting? What open source tools would you use to achieve these?

    Read the article

  • Using twig variable to dynamically call an imported macro sub-function

    - by Chausser
    I am attempting to use a variable to call a specific macro by name. I have a macros file that is being imported:

    {% import 'form-elements.html.twig' as forms %}

    Now in that file there are all the form element macros: text, textarea, select, radio etc. I have an array variable that gets passed in that has the elements in it:

    $elements = array(
        array(
            'type' => 'text',
            'value' => 'some value',
            'atts' => null,
        ),
        array(
            'type' => 'text',
            'value' => 'some other value',
            'atts' => null,
        ),
    );

    {{ elements }}

    What I'm trying to do is generate those elements from the macros. They work just fine when called by name:

    {{ forms.text(element.0.name,element.0.value,element.0.atts) }}

    However, what I want to do is something like this:

    {% for element in elements %}
        {{ forms[element.type](element.name,element.value,element.atts) }}
    {% endfor %}

    I have tried the following, all resulting in the same error:

    {{ forms["'"..element.type.."'"](element.name,element.value,element.atts) }}
    {{ forms.(element.type)(element.name,element.value,element.atts) }}
    {{ forms.{element.type}(element.name,element.value,element.atts) }}

    This unfortunately throws the following error:

    Fatal error: Uncaught exception 'LogicException' with message 'Attribute "value" does not exist for Node "Twig_Node_Expression_GetAttr".' in Twig\Environment.php on line 541

    Any help or advice on a solution or a better schema to use would be very helpful.

    Read the article

  • myXmlDataDoc.DataSet.ReadXml problem

    - by alex
    Hi: I am using myXmlDataDoc.DataSet.ReadXml(xml_file_name, XmlReadMode.InferSchema); to populate the tables within the dataset created by reading in an xml schema using:

    myStreamReader = new StreamReader(xsd_file_name);
    myXmlDataDoc.DataSet.ReadXmlSchema(myStreamReader);

    The problem I am facing is when it comes to reading the parameters tag: the ReadXml function puts the parameters element in a different table, and everything in that table comes paired with 0s. Here is the printout of the data tables:

    TableName: test_data
    100 2 1

    TableName: parameters
    1 0
    2 0
    3 0
    4 0
    5 0
    6 0
    7 0
    8 0
    9 0
    10 0
    11 0
    12 0

    In my xsd file, I represent an array using this:

    <xs:element name="test_data">
      <xs:complexType>
        <xs:complexContent>
          <xs:extension base="test_base">
            <xs:sequence>
              <xs:element name="a" type="xs:unsignedShort"/>
              <xs:element name="b" type="xs:unsignedShort"/>
              <xs:element name="c" type="xs:unsignedInt"/>
              <xs:element name="parameters" minOccurs="0" maxOccurs="12" type="xs:unsignedInt"/>
            </xs:sequence>
          </xs:extension>
        </xs:complexContent>
      </xs:complexType>
    </xs:element>

    And here is my xml file:

    100 2 1 1 2 3 4 5 6 7 8 9 10 11 12

    I am expecting, after the ReadXml call, that the "parameters" values are part of the test_data table:

    TableName: test_data
    100 2 1 1 2 3 4 5 6 7 8 9 10 11 12

    Does anyone know what's wrong? Thanks

    Read the article

  • How to solve the "Digg" problem in MongoDB

    - by user193116
    A while back, a Digg developer posted this blog, http://about.digg.com/blog/looking-future-cassandra, where he described one of the issues that were not optimally solved in MySQL. This was cited as one of the reasons for their move to Cassandra. I have been playing with MongoDB and I would like to understand how to implement the MongoDB collections for this problem. From the article, the schema for this information in MySQL is:

    CREATE TABLE Diggs (
        id      INT(11),
        itemid  INT(11),
        userid  INT(11),
        digdate DATETIME,
        PRIMARY KEY (id),
        KEY user (userid),
        KEY item (itemid)
    ) ENGINE=InnoDB DEFAULT CHARSET=utf8;

    CREATE TABLE Friends (
        id           INT(10) AUTO_INCREMENT,
        userid       INT(10),
        username     VARCHAR(15),
        friendid     INT(10),
        friendname   VARCHAR(15),
        mutual       TINYINT(1),
        date_created DATETIME,
        PRIMARY KEY (id),
        UNIQUE KEY Friend_unique (userid,friendid),
        KEY Friend_friend (friendid)
    ) ENGINE=InnoDB DEFAULT CHARSET=utf8;

    This problem is ubiquitous in social networking scenarios. People befriend a lot of people, and they in turn digg a lot of things. Quickly showing a user what his/her friends are up to is very critical. I understand that several blogs have since provided a pure RDBMS solution with indexes for this issue; however, I am curious as to how this could be solved in MongoDB.

    Read the article

  • Populate an unmapped property of domain object from result of join with Nhibernate

    - by Adam Pope
    I have a situation where I have 3 tables: StockItem, Office and StockItemPrice. The price for each StockItem can be different for each Office.

    StockItem( ID, Name )
    Office( ID, Name )
    StockItemPrice( ID, StockItemID, OfficeID, Price )

    I've set up a schema with 2 many-to-one relations to link StockItem and Office. So in my StockItem domain object I have a property:

    IList<StockItemPrice> Prices;

    which gets loaded with the price of the item for each office. That's working fine. Now I'm trying to get the price of an item for a single office. I have the following Criteria query:

    NHibernateSession.CreateCriteria(persistentType)
        .Add(Restrictions.Eq("ID", id))
        .CreateAlias("Prices", "StockItemPrice")
        .Add(Restrictions.Eq("StockItemPrice.Office", office))
        .UniqueResult<StockItem>();

    This appears to work fine, as the SQL it generates is what I would expect. However, I don't know if it populates StockItem.Prices with a single object correctly, as as soon as I reference that property NHibernate performs a lazy load of all the office's prices. Also, even if it does work, it feels really crufty having to access the price by using:

    mystockitem.Prices[0].Price

    What I would really like is to have a Price field on the StockItem object and have the price of the item put into that field by NHibernate. I've tried adding .CreateCriteria("Price", "StockItemPrice.Price") and the same with CreateAlias, but I get the error:

    NHibernate.QueryException : could not resolve property: Price of: StockItem

    which makes sense I guess, as Price isn't a mapped property. How would I adjust the query to make this possible?

    Read the article

  • Database abstraction/adapters for ruby

    - by Stiivi
    What are the database abstractions/adapters you are using in Ruby? I am mainly interested in data-oriented features, not in those with object mapping (like ActiveRecord or DataMapper). I am currently using Sequel. Are there any other options? I am mostly interested in:

    - a simple, clean and non-ambiguous API
    - data selection (obviously), filtering and aggregation
    - raw value selection without field mapping: SELECT col1, col2, col3 => [val1, val2, val3], not a hash of { :col1 => val1, ... }
    - an API that takes table schemas ('some_schema.some_table') into account in a consistent (and working) way, plus reflection for this (get the schema from a table)
    - database reflection: get the list of table columns, their database storage types and perhaps the adapter's abstracted types
    - table creation and deletion
    - being able to work with other tables (insert, update) in a loop enumerating a selection from another table, without having to fetch all records from the table being enumerated

    The purpose is to manipulate data with unknown structure at the time of writing the code, which is the opposite of object mapping, where the structure (or most of it) is usually well known. I do not need the object-mapping overhead. What are the options, including back-ends for object-mapping libraries?

    Read the article
