Search Results

Search found 32789 results on 1312 pages for 'object relational mapping'.


  • Hibernate - H2 db -- Could not parse mapping document from resource

    - by user1849757
    All of the files below are in the same location.

    Error:

        SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
        SLF4J: Defaulting to no-operation (NOP) logger implementation
        SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
        org.hibernate.InvalidMappingException: Could not parse mapping document from resource ./employee.hbm.xml
            at org.hibernate.cfg.Configuration.addResource(Configuration.java:616)
            at org.hibernate.cfg.Configuration.parseMappingElement(Configuration.java:1635)
            at org.hibernate.cfg.Configuration.parseSessionFactory(Configuration.java:1603)
            at org.hibernate.cfg.Configuration.doConfigure(Configuration.java:1582)
            at org.hibernate.cfg.Configuration.doConfigure(Configuration.java:1556)
            at org.hibernate.cfg.Configuration.configure(Configuration.java:1476)
            at org.hibernate.cfg.Configuration.configure(Configuration.java:1462)
            at com.yahoo.hibernatelearning.FirstExample.main(FirstExample.java:19)
        Caused by: org.hibernate.InvalidMappingException: Could not parse mapping document from input stream
            at org.hibernate.cfg.Configuration.addInputStream(Configuration.java:555)
            at org.hibernate.cfg.Configuration.addResource(Configuration.java:613)
            ... 7 more
        Caused by: org.dom4j.DocumentException: http://hibernate.sourceforge.net/%0Ahibernate-mapping-3.0.dtd Nested exception: http://hibernate.sourceforge.net/%0Ahibernate-mapping-3.0.dtd
            at org.dom4j.io.SAXReader.read(SAXReader.java:484)
            at org.hibernate.cfg.Configuration.addInputStream(Configuration.java:546)
            ... 8 more
        Exception in thread "main" java.lang.NullPointerException
            at com.yahoo.hibernatelearning.FirstExample.main(FirstExample.java:33)

    Hibernate config: hibernate.cfg.xml

        <?xml version='1.0' encoding='utf-8'?>
        <!DOCTYPE hibernate-configuration PUBLIC
            "-//Hibernate/Hibernate Configuration DTD//EN"
            "http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd">
        <hibernate-configuration>
          <session-factory>
            <property name="hibernate.connection.driver_class">org.h2.Driver</property>
            <property name="hibernate.connection.url">jdbc:h2:./db/repository</property>
            <property name="hibernate.connection.username">sa</property>
            <property name="hibernate.connection.password"></property>
            <property name="hibernate.default_schema">PUBLIC</property>
            <property name="hibernate.dialect">org.hibernate.dialect.H2Dialect</property>
            <property name="hibernate.show_sql">true</property>
            <property name="hibernate.hbm2ddl.auto">update</property>
            <!-- Mapping files -->
            <mapping resource="./employee.hbm.xml"/>
          </session-factory>
        </hibernate-configuration>

    Mapping config: employee.hbm.xml

        <?xml version="1.0"?>
        <!DOCTYPE hibernate-mapping PUBLIC
            "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
            "http://hibernate.sourceforge.net/ hibernate-mapping-3.0.dtd">
        <hibernate-mapping>
          <class name="com.yahoo.hibernatelearning.Employee" table="employee">
            <id name="empId" type="int" column="emp_id">
              <generator class="native"/>
            </id>
            <property name="empName">
              <column name="emp_name"/>
            </property>
            <property name="empSal">
              <column name="emp_sal"/>
            </property>
          </class>
        </hibernate-mapping>
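    The %0A in the DocumentException is a URL-encoded line break, which suggests the DOCTYPE system ID in employee.hbm.xml has been split across two lines. Below is a minimal sketch of the bootstrap code; the FirstExample class itself is not shown in the question, so its shape here is an assumption, and the comments flag the two things worth checking.

        // Hypothetical reconstruction of FirstExample, assuming a plain Hibernate 3.x
        // bootstrap. Two things to verify in the configuration:
        //  1. The DOCTYPE system ID in employee.hbm.xml should be one unbroken URL,
        //     "http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd"
        //     (the %0A in the stack trace is the newline that crept into it).
        //  2. Classpath resources are normally referenced without a leading "./",
        //     i.e. <mapping resource="employee.hbm.xml"/>.
        import org.hibernate.Session;
        import org.hibernate.SessionFactory;
        import org.hibernate.cfg.Configuration;

        public class FirstExample {
            public static void main(String[] args) {
                // configure() reads hibernate.cfg.xml from the classpath and pulls in
                // every <mapping resource="..."/> it lists.
                SessionFactory sessionFactory = new Configuration()
                        .configure()
                        .buildSessionFactory();
                Session session = sessionFactory.openSession();
                // ... save or load Employee objects here ...
                session.close();
                sessionFactory.close();
            }
        }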

    Read the article

  • Avoiding Object Oriented Pitfalls, Migrating from C, What Worked for You?

    - by Stephen
    I've been programming in procedural languages for quite some time now, and my first reaction to a problem is to start breaking it down into tasks to perform rather than to consider the different entities (objects) that exist and their relationships. I have had a university course in OOP, and understand the fundamentals of encapsulation, data abstraction, polymorphism, modularity and inheritance. I read Learning to think in the Object Oriented Way and Learning object oriented thinking, and will be looking at some of the books pointed to in those answers.

    I think that several of my medium to large sized projects will benefit from effective use of OOP, but as a novice I would like to avoid time-consuming, common errors. Based on your experiences, what are these pitfalls and what are reasonable ways around them? If you could explain why they are pitfalls, and how your suggestion is effective in addressing the issue, it'd be appreciated.

    I'm thinking along the lines of something like "Is it common to have a fair number of observer and modifier methods and use private variables, or are there techniques for consolidating/reducing them?" I'm not worried about using C++ as a pure OO language, if there are good reasons to mix methods. (Reminiscent of the reasons to use GOTOs, albeit sparingly.) Thank you!

    Read the article

  • DB Object passing between classes singleton, static or other?

    - by Stephen
    So I'm designing a reporting system at work. It's my first project written in OOP and I'm stuck on the design choice for the DB class. Obviously I only want to create one instance of the DB class per session/user and then pass it to each of the classes that need it. What I don't know is what's best practice for implementing this. Currently I have code like the following:

        class db {
            private $user = 'USER';
            private $pass = 'PASS';
            private $tables = array('user', 'report', 'etc...');

            function __construct() {
                // SET UP CONNECTION AND TABLES
            }
        };

        class report {
            function __construct($params = array(), $db, $user) {
                // Error checking/handling trimmed
                // $db is the database object we created
                $this->db = $db;
                // $this->user is the user object for the logged in user
                $this->user = $user;
                $this->reportCreate();
            }

            public function setPermission($permissionId = 1) {
                // Note the $this->db -- is this the best practise solution?
                $this->db->permission->find($permissionId);
                // Note the $this->user -- is this the best practise solution?
                $this->user->checkPermission(1);
                $data = array();
                $this->db->reportpermission->insert($data);
            }
        }; // end report

    I've been reading about using static classes and have just come across singletons (though these appear to be passé already?), so what's current best practice for doing this?

    Read the article

  • redoing object model construction to fit with asynchronous data fetching

    - by Andrew Patterson
    I have modeled a set of objects that correspond to some real-world concepts: TradeDrug, GenericDrug, TradePackage, DrugForm. Underlying the simple object model I am trying to provide is a complex medical terminology that uses numeric codes to represent relationships and concepts, all accessible via a REST service - I am trying to hide away some of that complexity with an object wrapper.

    To give a concrete example, I can call

        TradeDrug d = Searcher.FindTradeDrug("Zoloft")

    or

        TradeDrug d = new TradeDrug(34)

    where 34 might be the code for Zoloft. This will consult a remote server to find out some details about Zoloft. I might then call

        GenericDrug generic = d.EquivalentGeneric()
        System.Out.WriteLine(generic.ActiveIngredient().Name)

    in order to get back the generic drug sertraline as an object (again via a background REST call to the remote server that has all these drug details), and then perhaps find its ingredient. This model works fine and is being used in some applications that involve data processing.

    Recently, however, I wanted to do a Silverlight application that used and displayed these objects. The Silverlight environment only allows asynchronous REST/web service calls. I have no problems with how to make the asynchronous calls, but I am having trouble with what the design should be for my object construction. Currently the constructors for my objects do some REST calls synchronously:

        public TradeDrug(int code)
        {
            form = restclient.FetchForm(code)
            name = restclient.FetchName(code)
            etc..
        }

    If I have to use async 'events' or 'actions' in order to use the Silverlight web client (I know Silverlight can be forced to be a synchronous client, but I am interested in asynchronous approaches), does anyone have any guidance or best practice for how to structure my objects? I can pass an action callback to the constructor:

        public TradeDrug(int code, Action<TradeDrug> constructCompleted)
        {
        }

    but this then allows the user to have a TradeDrug object instance before what I want to construct is actually finished. It also doesn't support an 'event' async pattern, because the object doesn't exist to add the event to until it is constructed. Extending that approach might be a factory object that itself has an asynchronous interface to objects:

        factory.GetTradeDrugAsync(code, completedaction)

    or with a GetTradeDrugCompleted event? Does anyone have any recommendations?
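    One option hinted at above is to keep the constructor trivial and move the asynchrony into a factory. The sketch below is hypothetical and language-neutral in spirit (written in Java, with CompletableFuture standing in for Silverlight's callback/event patterns); the RestClient methods and TradeDrug fields are made-up stand-ins for the real REST wrapper.

        import java.util.concurrent.CompletableFuture;

        // Hypothetical async REST wrapper; the method names are illustrative only.
        interface RestClient {
            CompletableFuture<String> fetchFormAsync(int code);
            CompletableFuture<String> fetchNameAsync(int code);
        }

        // The domain object stays a plain holder: no remote calls in the constructor,
        // so a caller can never observe a half-built instance.
        final class TradeDrug {
            final int code;
            final String form;
            final String name;

            TradeDrug(int code, String form, String name) {
                this.code = code;
                this.form = form;
                this.name = name;
            }
        }

        // The factory owns the asynchrony and only completes once the data is in.
        final class TradeDrugFactory {
            private final RestClient restClient;

            TradeDrugFactory(RestClient restClient) {
                this.restClient = restClient;
            }

            CompletableFuture<TradeDrug> getTradeDrugAsync(int code) {
                return restClient.fetchFormAsync(code)
                        .thenCombine(restClient.fetchNameAsync(code),
                                     (form, name) -> new TradeDrug(code, form, name));
            }
        }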

    Read the article

  • Blob object not working properly even though the class is serialized

    - by GustlyWind
    I have a class which is serialized and converts a very large data object to a blob to save it to the database. In the same class there is a decode method to convert the blob back to the actual object. Following is the code for encoding and decoding the object:

        private byte[] encode(ScheduledReport schedSTDReport) {
            byte[] bytes = null;
            try {
                ByteArrayOutputStream bos = new ByteArrayOutputStream();
                ObjectOutputStream oos = new ObjectOutputStream(bos);
                oos.writeObject(schedSTDReport);
                oos.flush();
                oos.close();
                bos.close();
                //byte [] data = bos.toByteArray();
                //ByteArrayOutputStream baos = new ByteArrayOutputStream();
                //GZIPOutputStream out = new GZIPOutputStream(baos);
                //XMLEncoder encoder = new XMLEncoder(out);
                //encoder.writeObject(schedSTDReport);
                //encoder.close();
                bytes = bos.toByteArray();
                //GZIPOutputStream out = new GZIPOutputStream(bos);
                //out.write(bytes);
                //bytes = bos.toByteArray();
            } catch (Exception e) {
                _log.error("Exception caught while encoding/zipping Scheduled STDReport", e);
            }
            decode(bytes);
            return bytes;
        }

        /*
         * Decode the report definition blob back to the
         * ScheduledReport object.
         */
        private ScheduledReport decode(byte[] bytes) {
            ByteArrayInputStream bais = new ByteArrayInputStream(bytes);
            ScheduledReport sSTDR = null;
            try {
                ObjectInputStream ois = new ObjectInputStream(bais);
                //GZIPInputStream in = new GZIPInputStream(bais);
                //XMLDecoder decoder = new XMLDecoder(in);
                sSTDR = (ScheduledReport) ois.readObject(); //decoder.readObject();
                //decoder.close();
            } catch (Exception e) {
                _log.error("IOException caught while decoding/unzipping Scheduled STDReport", e);
            }
            return sSTDR;
        }

    The problem here is that whenever I change something else in this class (any other method), a new class version is created, and so the new version of the class is unable to decode the originally encoded blob object. The object which I am passing for encode is also a serialized object, but this problem still exists. Any ideas? Thanks.
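    This is standard Java serialization behaviour: when a Serializable class does not declare a serialVersionUID, one is computed from the class structure, so almost any edit to the class produces a stream-incompatible version. A minimal sketch of the usual fix follows; ScheduledReport's real fields are not shown in the question, so the ones here are placeholders.

        import java.io.Serializable;

        // Hypothetical skeleton of ScheduledReport. The key line is the explicit
        // serialVersionUID: with it pinned, compatible edits (adding a method,
        // adding a field) no longer prevent previously stored blobs from being
        // deserialized by the new class version.
        public class ScheduledReport implements Serializable {
            private static final long serialVersionUID = 1L;

            private String reportName;   // placeholder field
            private String schedule;     // placeholder field

            // getters, setters and report logic omitted
        }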

    Read the article

  • Encapsulate update method inside of object or have method which accepts an object to update

    - by Tom
    Hi, I actually have two questions related to each other:

    I have an object (class) called, say, MyClass, which holds data from my database. Currently I have a list of these objects (List<MyClass>) that resides in a singleton in a "communal area". I feel it's easier to manage the data this way, and I fail to see how passing a class around from object to object is beneficial over a singleton (I would be happy if someone can tell me why). Anyway, the data may change in the database from outside my program, and so I have to update the data every so often. To update the list of MyClass I have a method called, say, Update, written in another class, which accepts a list of MyClass. This updates all the instances of MyClass in the list. However, would it be better instead to encapsulate the Update() method inside the MyClass object, so that I would say

        foreach (MyClass obj in MyClassList)
        {
            obj.update();
        }

    What is the better implementation and why?

    The update method requires an XML reader. I have written an XML reader class, which is basically a wrapper over the standard XML reader the language natively provides and performs application-specific data collection. Should the XML reader class be in any way in the "inheritance path" of the MyClass object, i.e. MyClass inheriting from the XML reader because it uses a few methods? I can't see why it should. I don't like the idea of declaring an instance of the XML Reader class inside of MyClass; a MyClass object is meant to be a simple "record" from the database, and I feel giving it loads of methods and other object instances is a bit messy. Perhaps my XML reader class should be static, but C#'s native XmlReader isn't static? Any comments would be greatly appreciated. Thanks, Thomas
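    On the second question, composition usually reads more naturally than inheritance here: the record class does not need to be an XML reader, it just needs to borrow one while updating. A hypothetical sketch of that shape (written in Java rather than C#, and with made-up names, since the real reader wrapper isn't shown):

        import java.util.List;

        // Stand-in for the application-specific wrapper over the standard XML reader.
        interface ConfigXmlReader {
            String readField(String recordId, String fieldName);
        }

        // MyClass stays a simple "record": it knows how to refresh itself, but the
        // reader is handed in as a collaborator rather than inherited from.
        class MyClass {
            private final String id;
            private String value;   // placeholder for the record's data

            MyClass(String id) {
                this.id = id;
            }

            void update(ConfigXmlReader reader) {
                this.value = reader.readField(id, "value");
            }
        }

        class Updater {
            // The communal list is then refreshed in one loop, mirroring the
            // foreach shown above.
            static void updateAll(List<MyClass> items, ConfigXmlReader reader) {
                for (MyClass obj : items) {
                    obj.update(reader);
                }
            }
        }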

    Read the article

  • Adding a JPanel to another JPanel having TableLayout

    - by user253530
    I am trying to develop a map editor in Java. My map window receives a Map object as a constructor argument. From that map object I am able to retrieve the Grid and every item in the grid, along with other getters and setters. The problem is that even though Mapping extends JComponent, when I place it in a panel it is not painted. I have overridden the paint method to satisfy my needs. Here is the code; maybe you could help me.

        public class MapTest extends JFrame implements ActionListener {

            private JPanel mainPanel;
            private JPanel mapPanel;
            private JPanel minimapPanel;
            private JPanel relationPanel;
            private TableLayout tableLayout;
            private JPanel tile;

            MapTest(Map map) {
                mainPanel = (JPanel) getContentPane();
                mapPanel = new JPanel();
                populateMapPanel(map);
                mainPanel.add(mapPanel);
                this.setPreferredSize(new Dimension(800, 600));
                this.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                this.setVisible(true);
            }

            private double[][] generateTableLayoutSize(int x, int y, int size) {
                double panelSize[][] = new double[x][y];
                for (int i = 0; i < x; i++) {
                    for (int j = 0; j < y; j++) {
                        panelSize[i][j] = size;
                    }
                }
                return panelSize;
            }

            private void populateMapPanel(Map map) {
                double[][] layoutSize = generateTableLayoutSize(map.getMapGrid().getRows(), map.getMapGrid().getColumns(), 50);
                tableLayout = new TableLayout(layoutSize);
                for (int i = 0; i < map.getMapGrid().getRows(); i++) {
                    for (int j = 0; j < map.getMapGrid().getColumns(); j++) {
                        tile = new JPanel();
                        tile.setName(String.valueOf(((Mapping) map.getMapGrid().getItem(i, j)).getCharacter()));
                        tile.add(map.getMapItem(i, j));
                        String constraint = i + "," + j;
                        mapPanel.add(tile, constraint);
                    }
                }
                mapPanel.validate();
                mapPanel.repaint();
            }

            public void actionPerformed(ActionEvent e) {
                throw new UnsupportedOperationException("Not supported yet.");
            }
        }

    My Mapping class:

        public class Mapping extends JComponent implements Serializable {

            private BufferedImage image;
            private Character character;

            // default
            public Mapping() {
                super();
                this.image = null;
                this.character = '\u0000';
            }

            // Mapping from image and char
            public Mapping(BufferedImage image, char character) {
                super();
                this.image = image;
                this.character = character;
            }

            // Mapping from file and char
            public Mapping(File file, char character) {
                try {
                    this.image = ImageIO.read(file);
                    this.character = character;
                } catch (IOException ex) {
                    System.out.println(ex);
                }
            }

            public char getCharacter() {
                return character;
            }

            public void setCharacter(char character) {
                this.character = character;
            }

            public BufferedImage getImage() {
                return image;
            }

            public void setImage(BufferedImage image) {
                this.image = image;
                repaint();
            }

            @Override
            /* Two mappings are considered the same if
               - they have the same image OR
               - they have the same character OR
               - both of the above */
            public boolean equals(Object mapping) {
                if (this == mapping) {
                    return true;
                }
                if (mapping instanceof Mapping) {
                    return true;
                }
                // WARNING! equals might not work for images
                return (this.getImage()).equals(((Mapping) mapping).getImage())
                        || (this.getCharacter()) == (((Mapping) mapping).getCharacter());
            }

            @Override
            public void paintComponent(Graphics g) {
                super.paintComponent(g);
                //g.drawImage(image, 0, 0, null);
                g.drawImage(image, 0, 0, this.getWidth(), this.getHeight(), null);
            }

            // @Override
            // public Dimension getPreferredSize() {
            //     if (image == null) {
            //         return new Dimension(10, 10); // instead of 100,100 set any preferred dimensions
            //     } else {
            //         return new Dimension(100, 100); // (image.getWidth(null), image.getHeight(null));
            //     }
            // }

            private void readObject(java.io.ObjectInputStream in) throws IOException, ClassNotFoundException {
                character = (Character) in.readObject();
                image = ImageIO.read(ImageIO.createImageInputStream(in));
            }

            private void writeObject(java.io.ObjectOutputStream out) throws IOException {
                out.writeObject(character);
                ImageWriter writer = (ImageWriter) ImageIO.getImageWritersBySuffix("jpg").next();
                writer.setOutput(ImageIO.createImageOutputStream(out));
                ImageWriteParam param = writer.getDefaultWriteParam();
                param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
                param.setCompressionQuality(0.85f);
                writer.write(null, new IIOImage(image, null, null), param);
            }
        }
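    A hedged guess based on the code above: each Mapping is added to a tile JPanel whose default FlowLayout sizes children by their preferred size, and since the commented-out getPreferredSize override is the only place a size would come from, the component most likely ends up 0 x 0, so paintComponent has nothing to draw into (it also looks like tableLayout is never actually set on mapPanel, so the "i,j" constraints go to its default layout). A minimal sketch of the size fix; the numbers are illustrative, it goes inside Mapping, and it needs java.awt.Dimension imported:

        // Report a sensible preferred size so layout managers give the component
        // room to paint. 50 matches the 50-pixel cells used for the TableLayout.
        @Override
        public Dimension getPreferredSize() {
            if (image == null) {
                return new Dimension(50, 50);
            }
            return new Dimension(image.getWidth(), image.getHeight());
        }

        // Alternatively, give each tile a layout that stretches its child:
        //   tile = new JPanel(new BorderLayout());
        //   tile.add(map.getMapItem(i, j), BorderLayout.CENTER);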

    Read the article

  • How to implement multi-source XSLT mapping in 11g BPEL

    - by [email protected]
    In SOA 11g, you can create an XSLT mapper that uses multiple sources as the input. To implement a multi-source mapper, just follow the instructions below:

    1. Drag and drop a Transform Activity onto a BPEL process.
    2. Double-click the Transform Activity; the Transform dialog window appears.
    3. Add source variables by clicking the Add icon and selecting the variable and part of the variable as needed. You can select multiple input variables. The first variable represents the main XML input to the XSL mapping, while additional variables added here are defined in the XSL mapping as input parameters.
    4. Select the target variable and its part if available.
    5. Specify the mapper file name; the default file name is xsl/Transformation_%SEQ%.xsl, where %SEQ% represents the sequence number of the mapper.
    6. Click OK; the xsl file will be opened in graphical mode. You can map the sources to the target as usual.

    Open the mapper source code and you will notice that the variable representing the additional source payload is defined as an input parameter in the map sources section and in the body:

        <mapSources>
          <source type="XSD">
            <schema location="../xsd/po.xsd"/>
            <rootElement name="PurchaseOrder" namespace="http://www.oracle.com/pcbpel/po"/>
          </source>
          <source type="XSD">
            <schema location="../xsd/customer.xsd"/>
            <rootElement name="Customer" namespace="http://www.oracle.com/pcbpel/Customer"/>
            <param name="v_customer" />
          </source>
        </mapSources>
        ...
        <xsl:param name="v_customer"/>

    Let's take a look at the BPEL source code used to execute the XSLT mapper:

        <assign name="Transform_1">
          <bpelx:annotation>
            <bpelx:pattern>transformation</bpelx:pattern>
          </bpelx:annotation>
          <copy>
            <from expression="ora:doXSLTransformForDoc('xsl/Transformation_1.xsl', bpws:getVariableData('v_po'), 'v_customer', bpws:getVariableData('v_customer'))"/>
            <to variable="v_invoice"/>
          </copy>
        </assign>

    You will see that BPEL uses the ora:doXSLTransformForDoc XPath function to execute the XSLT mapper. This function returns the result of the XSLT transformation applied to the document. The signature of this function is:

        ora:doXSLTransformForDoc(template, input, [paramQName, paramValue]*)

    where

    - template is the XSLT mapper name
    - input is the string representation of the XML input
    - paramQName is the parameter defined in the XSLT mapper as the additional source
    - paramValue is the additional source payload

    You can add more sources to the mapper at a later stage, but you have to modify the ora:doXSLTransformForDoc call in the BPEL source code and make sure it passes the correct parameter and value pairs to reflect the changes in the XSLT mapper. So the best practices are:

    - Create the variables before creating the mapping file, so you can add multiple sources when you define the transformation in the first place, which is more straightforward than adding them later on.
    - Review the ora:doXSLTransformForDoc code in the BPEL source and make sure it passes the correct parameters to the mapper.

    Read the article

  • BizTalk – Mapping repeating EDI segments using a Table Looping functoid

    - by Bill Osuch
    BizTalk’s HIPAA X12 schemas have several repeating date/time segments in them, where the XML winds up looking something like this:

        <DTM_StatementDate>
          <DTM01_DateTimeQualifier>232</DTM01_DateTimeQualifier>
          <DTM02_ClaimDate>20120301</DTM02_ClaimDate>
        </DTM_StatementDate>
        <DTM_StatementDate>
          <DTM01_DateTimeQualifier>233</DTM01_DateTimeQualifier>
          <DTM02_ClaimDate>20120302</DTM02_ClaimDate>
        </DTM_StatementDate>

    The corresponding EDI segments would look like this:

        DTM*232*20120301~
        DTM*233*20120302~

    The DateTimeQualifier element indicates whether it’s the start date or end date: 232 for start, 233 for end. So in this example (an X12 835) we’re saying the statement starts on 3/1/2012 and ends on 3/2/2012. When you’re mapping from some other data format, many times your start and end dates will be within the same node, like this:

        <StatementDates>
          <Begin>20120301</Begin>
          <End>20120302</End>
        </StatementDates>

    So how do you map from that and create two repeating segments in your destination map? You could connect both the <Begin> and <End> nodes to a looping functoid, and connect its output to <DTM_StatementDate>, then connect both <Begin> and <End> to <DTM_StatementDate> … this would give you two repeating segments, each with the correct date, but how to add the correct qualifier? The answer is the Table Looping Functoid!

    To test this, let’s create a simplified schema that just contains the date fields we’re mapping. First, create your input schema and your output schema. Now create a map that uses these two schemas, and drag a Table Looping functoid onto it. The first input parameter configures the scope (or how many times the records will loop), so drag a link from the StatementDates node over to the functoid. Yes, StatementDates only appears once, so this would make it seem like it would only loop once, but you’ll see in just a minute. The second parameter in the functoid is the number of columns in the output table. We want to fill two fields, so just set this to 2. Now drag the Begin and End nodes over to the functoid. Finally, we want to add the constant values for DateTimeQualifier, so add a value of 232 and another of 233.

    When all your inputs are configured, we’ll configure the output table: click on the Table Looping Grid and set it up. Microsoft’s description of this functoid says “The Table Looping functoid repeats with the looping record it is connected to. Within each iteration, it loops once per row in the table looping grid, producing multiple output loops.” So here we will loop (# of <StatementDates> nodes) * (rows in the table), or 2 times.

    Drag two Table Extractor functoids onto the map; these are what are going to pull the data we want out of the table. The first input to each of these will be the output of the Table Looping functoid, and the second input will be the row number to pull from. Connect these two functoids to the two nodes we want to populate, and connect another output from the Table Looping functoid to the <DTM_StatementDate> record. Create some sample XML, use it as the TestMap Input Instance, and you should get a result like the XML at the top of this post.

    Technorati Tags: BizTalk, EDI, Mapping

    Read the article

  • Using Zope object unique id ( _p_oid ) to access object itself

    - by Alex M
    Every Zope object has its own unique id (_p_oid). To convert it into an integer value:

        from Shared.DC.xml.ppml import u64 as decodeObjectId
        oid = decodeObjectId(getattr(<Object instance>, '_p_oid'))

    Is it possible to get the object itself, given its _p_oid? I tried this:

        from ZODB.utils import p64
        object = <RootObject instance>._p_jar[p64(oid)]

    But it seems it's the wrong way.

    Read the article

  • [Hibernate Mapping] relationship set between table and mapping table to use joins.

    - by Matthew De'Loughry
    Hi guys, I have two tables, a "Module" table and a "StaffModule" table. I want to display a list of modules for which staff are present on the StaffModule mapping table. I've tried

        from Module join Staffmodule sm with ID = sm.MID

    with no luck; I get the following error: "Path Expected for join!" I thought I had the correct join to allow this, but obviously not. Can anyone help?

    StaffModule HBM:

        <?xml version="1.0" encoding="UTF-8"?>
        <!DOCTYPE hibernate-mapping PUBLIC "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
            "http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd">
        <!-- Generated Apr 26, 2010 9:50:23 AM by Hibernate Tools 3.2.1.GA -->
        <hibernate-mapping>
          <class name="Hibernate.Staffmodule" schema="WALK" table="STAFFMODULE">
            <composite-id class="Hibernate.StaffmoduleId" name="id">
              <key-many-to-one name="mid" class="Hibernate.Module">
                <column name="MID"/>
              </key-many-to-one>
              <key-property name="staffid" type="int">
                <column name="STAFFID"/>
              </key-property>
            </composite-id>
          </class>
        </hibernate-mapping>

    and Module.hbm:

        <?xml version="1.0" encoding="UTF-8"?>
        <!DOCTYPE hibernate-mapping PUBLIC "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
            "http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd">
        <!-- Generated Apr 26, 2010 9:50:23 AM by Hibernate Tools 3.2.1.GA -->
        <hibernate-mapping>
          <class name="Hibernate.Module" schema="WALK" table="MODULE">
            <id name="id" type="int">
              <column name="ID"/>
              <generator class="assigned"/>
            </id>
            <property name="modulename" type="string">
              <column length="50" name="MODULENAME"/>
            </property>
            <property name="teacherid" type="int">
              <column name="TEACHERID" not-null="true"/>
            </property>
          </class>
        </hibernate-mapping>

    Hope that's enough information! Thanks in advance.
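    A hedged suggestion based on the mappings above: in HQL a join has to follow a mapped association path rather than compare arbitrary columns, which is what "Path expected for join" is complaining about. Since Staffmodule reaches Module through the key-many-to-one named mid inside its composite id, the query can traverse that path instead. A sketch (the exact path syntax for a composite id may need adjusting):

        import java.util.List;
        import org.hibernate.Session;

        public class ModuleQueries {
            // "session" is an open org.hibernate.Session; staffId identifies the
            // staff member whose modules we want. The join walks the mapped
            // association sm.id.mid rather than pairing up raw ID/MID columns.
            @SuppressWarnings("unchecked")
            public static List<Object> modulesForStaff(Session session, int staffId) {
                return session.createQuery(
                        "select distinct m from Staffmodule sm "
                      + "join sm.id.mid m "
                      + "where sm.id.staffid = :staffId")
                        .setParameter("staffId", staffId)
                        .list();
            }
        }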

    Read the article

  • Are document-oriented databases any more suitable than relational ones for persisting objects?

    - by Owen Fraser-Green
    In terms of database usage, the last decade was the age of the ORM, with hundreds competing to persist our object graphs in plain old-fashioned RDBMSs. Now we seem to be witnessing the coming of age of document-oriented databases. These databases are highly optimized for schema-free documents but are also very attractive for their ability to scale out and query a cluster in parallel.

    Document-oriented databases also hold a couple of advantages over RDBMSs for persisting data models in object-oriented designs. As the tables are schema-free, one can store objects belonging to different classes in an inheritance hierarchy side by side. Also, as the domain model changes, so long as the code can cope with getting back objects from an old version of the domain classes, one can avoid having to migrate the whole database at every change.

    On the other hand, the performance benefits of document-oriented databases mainly appear when storing deeper documents. In object-oriented terms, these are classes which are composed of other classes, for example a blog post and its comments. In most of the examples of this I can come up with, though, such as the blog one, the gain in read access would appear to be offset by the penalty of having to write the whole blog post "document" every time a new comment is added.

    It looks to me as though document-oriented databases can bring significant benefits to object-oriented systems if one takes extreme care to organize the objects in deep graphs optimized for the way the data will be read and written, but this means knowing the use cases up front. In the real world, we often don't know until we actually have a live implementation we can profile. So is the case of relational vs. document-oriented databases one of swings and roundabouts? I'm interested in people's opinions and advice, in particular if anyone has built any significant applications on a document-oriented database.

    Read the article

  • The Boss Answer: What is a relational database?

    - by kce
    I'm mostly a system administrator and I don't directly work with databases other than installing them, setting up accounts, granting privileges, and so on. I realized that if The Boss walked up to me and asked, "What is a relational database?" I probably couldn't give a satisfactory answer... I'd maybe mumble something about data being stored and organized by categories which you can query with a special programming language (i.e., SQL). So could someone give a good "Boss Answer" for what a relational database is? And maybe how it's different from just storing data on a file server? Bonus points for clever but accessible analogies and for explaining tables, columns, records and fields. I'd define a "Boss Answer" as a quick one (maybe two) paragraph explanation for non-technical folks... mostly your Boss, on those rare occasions they actually ask you what it is you do all day.

    Read the article

  • Mapping a Vertex Buffer in DirectX11

    - by judeclarke
    I have a VertexBuffer that I am remapping on a per-frame basis for a bunch of quads that are constantly updated, sharing the same material/index buffer but having different widths/heights. However, currently there is a really bad flicker on this geometry. Although it is flickering, the flicker looks correct. I know it is the vertex buffer mapping, because if I recreate the entire VB then it will render fine. However, as an optimization I figured I would just remap it. Does anyone know what the problem is? The length (width, size) of the vertex buffer is always the same. One might think it is double buffering; however, it would not be double buffering, because it only happens when I map/unmap the buffer, so that leads me to believe that I am setting some parameters wrong on the creation or mapping. I am using DirectX11; my initialization and remap code are below.

    Initialization code:

        D3D11_BUFFER_DESC bd;
        ZeroMemory( &bd, sizeof(bd) );
        bd.Usage = D3D11_USAGE_DYNAMIC;
        bd.ByteWidth = vertCount * vertexTypeWidth;
        bd.BindFlags = D3D11_BIND_VERTEX_BUFFER;
        //bd.CPUAccessFlags = 0;
        bd.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;

        D3D11_SUBRESOURCE_DATA InitData;
        ZeroMemory( &InitData, sizeof(InitData) );
        InitData.pSysMem = vertices;

        mVertexType = vertexType;

        HRESULT hResult = device->CreateBuffer( &bd, &InitData, &m_pVertexBuffer ); // This will be S_OK
        if(hResult != S_OK)
            return false;

    Remap code:

        D3D11_MAPPED_SUBRESOURCE resource;
        HRESULT hResult = deviceContext->Map(m_pVertexBuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &resource); // This will be S_OK
        if(hResult != S_OK)
            return false;
        resource.pData = vertices;
        deviceContext->Unmap(m_pVertexBuffer, 0);

    Read the article

  • BizTalk: mapping with Xslt

    - by Leonid Ganeline
    BizTalk Map Editor (the Mapper) is a good editor, especially in the latest 2010 version of BizTalk. But sometimes it still cannot do a task easily. That is when it is time for Xslt code, and time to remember that maps are executed by the Xslt engine. Right-click the Mapper grid (the field between the source and target schemas) and choose Properties / Custom XSLT Path. Enter the name of the file with the Xslt code. Only this code will be executed; forget the picture in the Mapper, all those links and functoids.

    Let's see a real-life example. There are two source Addresses. One is on the top level, and the second is inside the Member_Address record with MaxOccurs=*. The target address is placed inside the Locator record with MaxOccurs=*. The requirement is to map all source addresses to the one target address structure. Try to do this mapping with the Mapper and you will spend a good amount of time, and the resulting map will be tricky. If we use Xslt code, the mapping is simple and unambiguous. Simple, elegant.

    Read the article

  • Windows Physical Direct Memory Mapping

    - by chrisjleaf
    I'm a bit disappointed there is almost no discussion of this no matter where I look, so I guess I'll have to ask. I'm writing a cross-platform memory benchmarking application which requires direct physical address mapping rather than virtual addressing.

    EDIT: The solution would look something like the Linux/Unix system calls:

        int fd = open("/dev/mem", O_RDONLY);
        mmap(NULL, len, PROT_READ, MAP_SHARED, fd, PHYSICAL_ADDRESS_OFFSET);

    which require the kernel to either give you a virtual page mapping to the desired physical address or return that it failed. This does require supervisor privileges, but that is OK.

    I have seen a lot of information about shared memory and memory-mapped files, but all of these reside on disk and are thus not really useful when I'm trying to make a system-dependent read. It is very similar to writing an I/O driver, although I do not need write permissions to the physical address. This site gives an example of how to do it at the driver level using the Windows Driver Kit: NT Insider: Sharing Memory between drivers and applications. This solution would probably require Visual Studio, which currently I do not have access to. (I have downloaded the WDK API but it complained about my use of GCC for Windows.) I'm traditionally a Linux programmer, so I'm hoping there might be something really simple I'm missing. Thanks in advance if you know something I don't!

    Read the article

  • LLBLGen Pro v3.1 released!

    - by FransBouma
    Yesterday we released LLBLGen Pro v3.1! Version 3.1 comes with new features and enhancements, which I'll describe briefly below. v3.1 is a free upgrade for v3.x licensees.

    What's new / changed?

    Designer

    - Extensible import system. An extensible import system has been added to the designer to import project data from external sources. Importers are plug-ins which import project meta-data (like entity definitions, mappings and relational model data) from an external source into the loaded project. In v3.1, an importer plug-in for importing project elements from existing LLBLGen Pro v3.x project files has been included. You can use this importer to create source projects from which you import parts of models to build your actual project with.
    - Model-only relationships. In v3.1, relationships of the type 1:1, m:1 and 1:n can be marked as model-only. A model-only relationship isn't required to have a backing foreign key constraint in the relational model data. They're ideal for projects which have to work with relational databases where changes can't always be made or some relationships can't be added to the relational model (e.g. the ones which are important for the entity model, but are not allowed to be added to the relational model for some reason).
    - Custom field ordering. Although fields in an entity definition don't really have an ordering, it can be important for some situations to have the entity fields in a given order, e.g. when you use compound primary keys. Field ordering can be defined using a pop-up dialog which can be opened in various ways, e.g. inside the project explorer, model view and entity editor. It can also be set automatically during refreshes based on new settings.
    - Command line relational model data refresher tool, CliRefresher.exe. The command line refresh tool shipped with v2.6 is now available for v3.1 as well.
    - Navigation enhancements in various designer elements. It's now easier to find elements like entities, typed views etc. in the project explorer from editors, to navigate to related entities in the project explorer by right-clicking a relationship, to navigate to the super-type in the project explorer when right-clicking an entity, and to navigate to the sub-type in the project explorer when right-clicking a sub-type node in the project explorer.
    - Minor visual enhancements / tweaks.

    LLBLGen Pro Runtime Framework

    - Entity creation is now up to 30% faster and takes 5% less memory. Creating an entity object has been optimized further by tweaks inside the framework to make instantiating an entity object up to 30% faster. It now also takes up to 5% less memory than in v3.0.
    - Prefetch Path node merging is now up to 20-25% faster. Setting entity references required the creation of a new relationship object. As this relationship object is always used internally it could be cached (as it's used for syncing only). This increases performance by 20-25% in the merging functionality.
    - Entity fetches are now up to 20% faster. A large number of tweaks have been applied to make entity fetches up to 20% faster than in v3.0.
    - Full WCF RIA support. It's now possible to use your LLBLGen Pro runtime framework powered domain layer in a WCF RIA application using the VS.NET tools for WCF RIA services. WCF RIA Services is a Microsoft technology for .NET 4 and typically used within Silverlight applications.
    - SQL Server DQE compatibility level is now per instance (usable in Adapter). It's now possible to set the compatibility level of the SQL Server Dynamic Query Engine (DQE) per instance of the DQE instead of the global setting it was before. The global setting is still available and is used as the default value for the compatibility level per instance. You can use this to switch between CE Desktop and normal SQL Server compatibility per DataAccessAdapter instance.
    - Support for the COUNT_BIG aggregate function (SQL Server specific). The aggregate function COUNT_BIG has been added to the list of available aggregate functions to be used in the framework.
    - Minor changes / tweaks.

    I'm especially pleased with the import system, as that makes working with entity models a lot easier. The import system lets you import from another LLBLGen Pro v3 project any entity definition, mapping and / or meta-data like table definitions. This way you can build repository projects where you store model fragments, e.g. the building blocks for a customer-order system, a user credential model etc., any model you can think of. In most projects, you'll recognize that some parts of your new model look familiar. In these cases it would have been easier if you would have been able to import these parts from projects you had pre-created. With LLBLGen Pro v3.1 you can.

    For example, say you have an Oracle schema called CRM which contains the bread 'n' butter customer-order-product kind of model. You create an entity model from that schema and save it in a project file. Now you start working on another project for another customer and you have to use SQL Server. You also start using model-first development, so you develop the entity model from scratch as there's no existing database. As this customer also requires some CRM-like entity model, you import the entities from your saved Oracle project into this new SQL Server targeting project. Because you don't work with Oracle this time, you don't import the relational meta-data, just the entities, their relationships and possibly their inheritance hierarchies, if any. As they're now entities in your project you can change them a bit to match the new customer's requirements. This can save you a lot of time, because you can re-use pre-fab model fragments for new projects.

    In the example above there are no tables yet (as you work model first), so using the forward mapping capabilities of LLBLGen Pro v3 creates the tables, PK constraints, unique constraints and FK constraints for you. This way you can build a nice repository of model fragments which you can re-use in new projects.

    Read the article

  • Open Source .Net Object Database or Document Database for use in Hosted environment

    - by runxc1 Bret Ferrier
    I am looking at creating a web site and I want to try to learn either an Object Database or a Document Database. I am going to be using a hosting provider, so I won't be able to install any software. I am also unable to purchase any licensing, so I need to be able to use either a free or open source Object/Document Database. Are there any free Object/Document Databases that don't require an installation of some sort?

    Read the article

  • Passing object from PHP to Mysql Stored procedure

    - by user268982
    Hi All,

    Scenario: I have to call a MySQL stored procedure from PHP and do some operations (around 15 commands) on the database.

    Problem: I have to call the stored procedure with 36 parameters. That is a lot of parameters. I don't think it is a good idea to pass this many individual parameters, and I have even heard that passing individual parameters increases network traffic.

    Looking for: I created a data object on the PHP side. Is there any way I can create a similar kind of object in MySQL, pass this object as a parameter, and extract the data from the object in the MySQL stored procedure?

    Thanks for your help. Regards, Kiran

    Read the article

  • PHP, jQuery and Ajax Object Orientation

    - by pastylegs
    I'm a fairly experienced programmer getting my head around PHP and Ajax for the first time, and I'm having a bit of trouble figuring out how to incorporate object-oriented PHP into my Ajax web app. I have an admin page (admin.php) that will load and write information (info.xml) from an XML file depending on the user's selection in a form on the admin page. I have decided to use an object (ContentManager.php) to manage the loading and writing of the XML file to disk, i.e.:

        class ContentManager {
            var $xml_attribute_1;
            ...

            function __construct() {
                // load the xml file from disk and save its contents into variables
                $xml_attribute = simplexml_load_file(/path/to/xml);
            }

            function get_xml_contents() {
                return xml_attribute;
            }

            function write_xml($contents_) {
            }

            function print_xml() {
            }
        }

    I create the ContentManager object in admin.php like so:

        <?php
        include '../includes/CompetitionManager.php';
        $cm = new CompetitionManager();
        ?>
        <script>
        ...all my jquery
        </script>
        <html>
        ... all my form elements
        </html>

    So now I want to use Ajax to allow the user to retrieve information from the XML file via the ContentManager object, using an interface (ajax_handler.php) like so:

        <?php
        if (_POST[] == "get_a") {
        } else if () {
        }
        ...
        ?>

    I understand how this would work if I wasn't using objects, i.e. the handler php file would do a certain action depending on a variable in the .post request, but with my setup I can't see how I can get a reference to the ContentManager object I have created in admin.php from the ajax_handler.php file. Maybe my understanding of PHP object scope is flawed. Anyway, if anyone can make sense of what I'm trying to do, I would appreciate some help!

    Read the article

  • Explore a COM Object in PHP

    - by shaiss
    What would be the proper way to explode a COM object for debugging? I have a 3rd-party function that returns a multilevel object. The documentation is non-existent, so I'd like to be able to echo everything out of the object or debug it in Komodo IDE. Komodo just says "Object" and nothing else. Maybe convert it to an array? I know some of the existing options, such as $com->Status, but there are more variables returned that I'd like to know about.

    Read the article
