Search Results

Search found 21098 results on 844 pages for 'model import'.


  • Rails Model multiple column uniqueness

    - by Jty.tan
    I am making a Viewer model with belongs_to :user and belongs_to :order, which joins the User and Order models in a has_many :through => :viewers association. The Viewer model has user_id and order_id attributes. How would I set it up so that new viewers are only accepted if the combination of user_id and order_id is unique (i.e. no other row has the same pair)? I remember being able to do this in MySQL with a flag (although I can't for the life of me remember what it was), but I'm not sure how to do it with Rails. Can I do something like (in viewer.rb) validates_uniqueness_of :user_id, :scope => :order_id?
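    For what it's worth, that validation is the usual shape for this; a minimal sketch, with the composite-index migration line added as an assumption (not from the post) to guard against race conditions at the database level:

      class Viewer < ActiveRecord::Base
        belongs_to :user
        belongs_to :order

        # rejects a second Viewer row with the same user_id/order_id pair
        validates_uniqueness_of :user_id, :scope => :order_id
      end

      # in a migration, as a race-condition guard:
      # add_index :viewers, [:user_id, :order_id], :unique => true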

    Read the article

  • MVC Entity Model not showing my table

    - by Jessica
    I have a database with multiple tables and some basic relationships. Here is an example of the problem I am having. My database:

      Org         (ID, Name, etc.)
      Detail1     (ID, D1Name)
      Org_Detail1 (Org_ID, Detail1_ID)
      Detail2     (ID, D2Name)
      Org_Detail2 (Org_ID, Detail2_ID, BooleanField)

    My problem is that the Org_Detail1 table is not showing up in the entity model, but the Org_Detail2 table does. I thought it may be because Org_Detail1 contains only two ID fields that are both primary keys, while Org_Detail2 contains two primary-key ID fields as well as a boolean field. If I add a dummy field to Org_Detail1 and update the model, it still won't show up and won't allow me to add a new entity relating to the Org_Detail1 table. The table won't even show up in the list, but it is listed under the tables. Is there any solution to get this table to appear in my model?

    Read the article

  • Rails 2.3.2: Accessing Model Specific Data in Another Model

    - by Gimli
    I'm using Rails 2.3.2 and using Paperclip to upload photos. I'm also using a slightly customized subdomain_accounts.rb to set some account-specific variables. My question is this: How can I set the bucket used in Paperclip to be dependent on the current account? Since this looks to be a model attribute set up early on, how can I override it later? Thanks.
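    One possibility, offered as an assumption rather than something from the post: newer Paperclip releases accept a Proc for the :bucket option and evaluate it per attachment, so the bucket can be derived from the record's account. A sketch, with the account lookup method purely hypothetical; if your Paperclip version does not accept a Proc here, this won't work as written:

      class Photo < ActiveRecord::Base
        has_attached_file :image,
          :storage => :s3,
          :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
          # hypothetical: derive the bucket name from the record's account
          :bucket => lambda { |attachment| attachment.instance.account.s3_bucket_name }
      end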

    Read the article

  • Trying to expand the contrib.auth User model and add a "relationships" manager

    - by dotty
    I have the following model setup:

      from django.db import models
      from django.contrib.auth.models import User

      class SomeManager(models.Manager):
          def friends(self):
              # return friends bla bla bla

      class Relationship(models.Model):
          """(Relationship description)"""
          from_user = models.ForeignKey(User, related_name='from_user')
          to_user = models.ForeignKey(User, related_name='to_user')
          has_requested_friendship = models.BooleanField(default=True)
          is_friend = models.BooleanField(default=False)
          objects = SomeManager()

      relationships = models.ManyToManyField(User, through=Relationship, symmetrical=False)
      relationships.contribute_to_class(User, 'relationships')

    Here I take the User object and use contribute_to_class to add 'relationships' to it. The relationships show up, but when I call User.relationships.friends I expect it to run the friends() method, and instead it fails. Any ideas how I would do this? Thanks
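    One workaround (not from the post) is to sidestep the manager question and query through the Relationship model directly; a minimal sketch, assuming the fields defined above and a hypothetical helper name:

      # hypothetical helper: users that `user` has a confirmed friendship with
      def friends_of(user):
          friend_ids = Relationship.objects.filter(
              from_user=user, is_friend=True
          ).values_list('to_user', flat=True)
          return User.objects.filter(id__in=friend_ids)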

    Read the article

  • VBA Access Import Specification strange error

    - by captonssj
    I am trying to import a text file into Access using a saved Import Specification in my Access database, via VBA. The import generates the import errors table, showing errors in the last two fields. But if I use the same import specification manually to import the same text file, the import works perfectly. Strange... why would the two behave differently? Here is the database and the text file: http://www.box.net/shared/ro7n3b7a77

    Read the article

  • CakePhp: model recursive associations and find

    - by Petecocoon
    Hello everybody! I'm having some trouble with a find() on a model in CakePHP. I have three models related in this way:

      Project (some_fields, item_id)  belongsTo  Item (some_fields, user_id)  belongsTo  User (some_fields, nickname)

    I need to do a find() and retrieve all fields from Project, a field from Item, and the nickname field from User. This is my code:

      $this->set('projects', $this->Project->find('all', array('recursive' => 2)));

    but my output doesn't contain the User object. I've tried with the Containable behaviour but the output is the same. What am I doing wrong? Many many thanks, Peter
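    For what it's worth, the 'contain' key only takes effect when the Containable behavior is attached to every model in the chain (attaching it in AppModel is the usual shortcut). A hedged sketch of that route, with field names assumed from the question:

      // in app_model.php, so every model in the chain has the behavior
      class AppModel extends Model {
          var $actsAs = array('Containable');
      }

      // in the controller
      $projects = $this->Project->find('all', array(
          'contain' => array(
              'Item' => array(
                  'fields' => array('Item.some_field'),
                  'User'   => array('fields' => array('User.nickname'))
              )
          )
      ));
      $this->set('projects', $projects);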

    Read the article

  • How best can I extract a logical model from a physical DB model

    - by Dean
    We have made substantial changes to our physical DB. Now, as it is the end of the project, I would like to abstract a logical model from it, to allow me to generate schemas for both Oracle and SQL Server. Can anyone guide me as to the best way to achieve this? I was hoping TOAD Data Modeler would help, but I can't seem to see any options to do what I require.

    Read the article

  • Define Rails Model Persistent Attributes in Model File

    - by Kevin Sylvestre
    I recently played with MongoDB in Rails using Mongoid. I like the ability to define attributes for models within the model file (as opposed to in migrations):

      class Person
        include Mongoid::Document
        field :name, :type => String
        field :birthday, :type => Date
      end

    For projects that cannot use a schema-less database, does a similar feature exist? Any gems or plugins that generate schemas from a similar syntax would be greatly appreciated. Thanks.

    Read the article

  • Can't call method in model table class using Doctrine with Zend Framework

    - by Jeremy Hicks
    I'm using Doctrine with Zend Framework. For my model, I'm using a base class, the regular class (which extends the base class), and a table class. In my table class, I've created a method which queries for records with a specific value in one of my model's fields. When I try to call this method from my controller, I get an error message saying, "Message: Unknown method Doctrine_Table::getCreditPurchases". Is there something else I need to do to call functions in my table class? Here is my code:

      class Model_CreditTable extends Doctrine_Table
      {
          /**
           * Returns an instance of this class.
           *
           * @return object Model_CreditTable
           */
          public static function getInstance()
          {
              return Doctrine_Core::getTable('Model_Credit');
          }

          public function getCreditPurchases($id)
          {
              $q = $this->createQuery('c')
                  ->where('c.buyer_id = ?', $id);
              return $q->fetchArray();
          }
      }

      // And then in my controller method I have...
      $this->view->credits = Doctrine_Core::getTable('Model_Credit')->getCreditPurchases($ns->id);

    Read the article

  • Rails Model has_many with multiple foreign_keys

    - by Kenzie
    Relatively new to Rails and trying to model a very simple family "tree" with a single Person model that has a name, gender, father_id and mother_id (two parents). Below is basically what I want to do, but obviously I can't repeat :children in a has_many (the first one gets overwritten):

      class Person < ActiveRecord::Base
        belongs_to :father, :class_name => 'Person'
        belongs_to :mother, :class_name => 'Person'
        has_many :children, :class_name => 'Person', :foreign_key => 'mother_id'
        has_many :children, :class_name => 'Person', :foreign_key => 'father_id'
      end

    Is there a simple way to use has_many with two foreign keys, or maybe change the foreign key based on the object's gender? Or is there another/better way altogether? Thanks!
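    One common workaround, sketched here rather than taken from the post: give the two associations distinct names and add a small method that picks the right one by gender (the 'male'/'female' values in the gender column are an assumption):

      class Person < ActiveRecord::Base
        belongs_to :father, :class_name => 'Person'
        belongs_to :mother, :class_name => 'Person'
        has_many :children_as_father, :class_name => 'Person', :foreign_key => 'father_id'
        has_many :children_as_mother, :class_name => 'Person', :foreign_key => 'mother_id'

        # picks the association that matches this person's role as a parent
        def children
          gender == 'male' ? children_as_father : children_as_mother
        end
      end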

    Read the article

  • Getting the value from an array in the model in rails

    - by slythic
    Hi all, I have a relatively simple problem. I have a model named Item to which I've added a status field. The status field will only have two options (Lost or Found). So I created the following array in my Item model:

      STATUS = [[1, "Lost"], [2, "Found"]]

    In my form view I added the following code, which works great:

      <%= collection_select :item, :status, Item::STATUS, :first, :last, {:include_blank => 'Select status'} %>

    This stores the numeric id (1 or 2) of the status in the database. However, in my show view I can't figure out how to convert from the numeric id (again, 1 or 2) to the text equivalent of Lost or Found. Any ideas on how to get this to work? Is there a better way to go about this? Many thanks, Tony
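    A small helper on the model is one way to map the stored id back to its label; a sketch using the STATUS constant from the question (Array#assoc looks up a pair by its first element):

      class Item < ActiveRecord::Base
        STATUS = [[1, "Lost"], [2, "Found"]]

        def status_name
          pair = STATUS.assoc(status.to_i)
          pair ? pair.last : nil
        end
      end

      # in the show view:
      # <%= @item.status_name %>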

    Read the article

  • RoR model field without validators, has*, delegates, etc

    - by jackr
    How can I declare a field in the Rails model when it doesn't have any "has_" relations, validations, or delegations? I just need to ensure its existence and column width in the schema. Currently, I have no mention of the field in the "schema section" of the model file, but it's referenced in various methods that use it, and this seems to work. However, depending on my exact creation workflow, the underlying database table may be created as

      t.binary "field_name", :limit => 32

    or

      t.binary "field_name", :limit => 255

    This is not a restriction on the value (any binary value is valid, even NULL), only on the table column declaration. As it happens, 32 is enough; it never receives any larger value, and it's only ever written to like this:

      self.field_name = SecureRandom.random_bytes(32)
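    Since the column width lives in the schema rather than the model, the usual place to pin it down is a migration; a minimal sketch, with the widgets table name purely hypothetical:

      class AddFieldNameToWidgets < ActiveRecord::Migration
        def self.up
          # 32 bytes matches SecureRandom.random_bytes(32)
          add_column :widgets, :field_name, :binary, :limit => 32
        end

        def self.down
          remove_column :widgets, :field_name
        end
      end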

    Read the article

  • OpenERP model tracking external resource, hooking into parent's copy() or copy_data()

    - by CB.
    I have added a new model (call it crm_lead_external) that is linked via a new one2many on crm_lead. Thus, my module has two models defined: an updated crm_lead (with _name=crm_lead) and a new crm_lead_external. This external model tracks a file and as such has a 'filename' field. I also created a unique SQL index on this filename field. This is part of my module:

      def copy(self, cr, uid, id, default=None, context=None):
          if not default:
              default = {}
          default.update({
              'state': 'new',
              'filename': '',
          })
          ret = super(crm_lead_external, self).copy(cr, uid, id, default, context=context)
          # do file copy
          return ret

    The intent here is to allow an external entity to be duplicated, but to retarget the file path. Now, if I click Duplicate on the lead, I get an IntegrityError on my unique constraint. Is there a particular reason why copy() isn't being called? Should I add this logic to copy_data()? Must I really override copy() for the lead? Thanks in advance.

    Read the article

  • Rails: Creating subfolders in model?

    - by keruilin
    I'm going to have a ton of subclasses, so I want to organize them under a subfolder called stream. I added the following line to environment.rb so that all classes in the subfolder would be loaded:

      Rails::Initializer.run do |config|
        ...
        config.load_paths += Dir["#{RAILS_ROOT}/app/models/*"].find_all { |f| File.stat(f).directory? }
        ...
      end

    I thought this would solve the issue in which, by convention, the model class is namespaced into a matching module. However, when I try to call one of the classes (Stream) in the stream folder, I get the following error:

      NoMethodError: undefined method `new' for Stream:Module
        from (irb):28
        from /usr/local/bin/irb:12:in `<main>'

    Here's the model for the parent and one child:

      class Stream
      end

      class EventStream < Stream
      end

    Any idea what the issue is?

    Read the article

  • Ruby on Rails Increment Counter in Model

    - by febs
    I'm attempting to increment a counter in my User table from another model:

      class Count < ActiveRecord::Base
        belongs_to :user
        after_create :update_count

        def update_count
          user = User.find(self.user_id)
          user.increment(:count)
        end
      end

    So when a Count is created, the goal is to increment a counter column for that user. Currently it refuses to get the user after creation and I get a nil error. I'm using Devise for my users. Is this the right (best practice) place to do it? I had it working in the controllers, but wanted to clean it up. I'm very inexperienced with model callbacks.
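    Two details worth noting, shown in the sketch below (same callback, assuming the belongs_to is set up as above): increment only changes the attribute in memory, so it needs to be saved (increment! does both), and the belongs_to association can stand in for the manual User.find:

      class Count < ActiveRecord::Base
        belongs_to :user
        after_create :update_count

        private

        def update_count
          # increment! updates the column and saves; plain increment does not persist
          user.increment!(:count) if user
          # alternative that skips instantiating the user at all:
          # User.increment_counter(:count, user_id)
        end
      end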

    Read the article

  • ASSIMP in my program is much slower to import than ASSIMP view program

    - by Marco
    The problem is really simple: if I try to load a big model (around 200,000 vertices) in my software with the function aiImportFileExWithProperties, it takes more than one minute. If I try to load the very same model with ASSIMP view, it takes 2 seconds. For this comparison, both my software and Assimp view are using the 64-bit DLL version of the library, compiled by myself (Assimp64.dll).

    This is the relevant piece of code in my software:

      // default pp steps
      unsigned int ppsteps =
          aiProcess_CalcTangentSpace         | // calculate tangents and bitangents if possible
          aiProcess_JoinIdenticalVertices    | // join identical vertices / optimize indexing
          aiProcess_ValidateDataStructure    | // perform a full validation of the loader's output
          aiProcess_ImproveCacheLocality     | // improve the cache locality of the output vertices
          aiProcess_RemoveRedundantMaterials | // remove redundant materials
          aiProcess_FindDegenerates          | // remove degenerated polygons from the import
          aiProcess_FindInvalidData          | // detect invalid model data, such as invalid normal vectors
          aiProcess_GenUVCoords              | // convert spherical, cylindrical, box and planar mapping to proper UVs
          aiProcess_TransformUVCoords        | // preprocess UV transformations (scaling, translation ...)
          aiProcess_FindInstances            | // search for instanced meshes and remove them by references to one master
          aiProcess_LimitBoneWeights         | // limit bone weights to 4 per vertex
          aiProcess_OptimizeMeshes           | // join small meshes, if possible
          aiProcess_SplitByBoneCount         | // split meshes with too many bones; necessary for our (limited) hardware skinning shader
          0;

      cout << "Loading " << pFile << "... ";

      aiPropertyStore* props = aiCreatePropertyStore();
      aiSetImportPropertyInteger(props, AI_CONFIG_IMPORT_TER_MAKE_UVS, 1);
      aiSetImportPropertyFloat(props, AI_CONFIG_PP_GSN_MAX_SMOOTHING_ANGLE, 80.f);
      aiSetImportPropertyInteger(props, AI_CONFIG_PP_SBP_REMOVE, aiPrimitiveType_LINE | aiPrimitiveType_POINT);
      aiSetImportPropertyInteger(props, AI_CONFIG_GLOB_MEASURE_TIME, 1);
      //aiSetImportPropertyInteger(props, AI_CONFIG_PP_PTV_KEEP_HIERARCHY, 1);

      // Call ASSIMP's C API to load the file
      scene = (aiScene*)aiImportFileExWithProperties(pFile.c_str(),
          ppsteps |                         /* default pp steps */
          aiProcess_GenSmoothNormals |      // generate smooth normal vectors if not existing
          aiProcess_SplitLargeMeshes |      // split large, unrenderable meshes into submeshes
          aiProcess_Triangulate      |      // triangulate polygons with more than 3 edges
          //aiProcess_ConvertToLeftHanded | // convert everything to D3D left-handed space
          aiProcess_SortByPType      |      // make 'clean' meshes which consist of a single type of primitives
          0,
          NULL, props);
      aiReleasePropertyStore(props);

      if(!scene){
          cout << aiGetErrorString() << endl;
          return 0;
      }

    And this is the relevant piece of code in the Assimp view code:

      // default pp steps
      unsigned int ppsteps =
          aiProcess_CalcTangentSpace         | // calculate tangents and bitangents if possible
          aiProcess_JoinIdenticalVertices    | // join identical vertices / optimize indexing
          aiProcess_ValidateDataStructure    | // perform a full validation of the loader's output
          aiProcess_ImproveCacheLocality     | // improve the cache locality of the output vertices
          aiProcess_RemoveRedundantMaterials | // remove redundant materials
          aiProcess_FindDegenerates          | // remove degenerated polygons from the import
          aiProcess_FindInvalidData          | // detect invalid model data, such as invalid normal vectors
          aiProcess_GenUVCoords              | // convert spherical, cylindrical, box and planar mapping to proper UVs
          aiProcess_TransformUVCoords        | // preprocess UV transformations (scaling, translation ...)
          aiProcess_FindInstances            | // search for instanced meshes and remove them by references to one master
          aiProcess_LimitBoneWeights         | // limit bone weights to 4 per vertex
          aiProcess_OptimizeMeshes           | // join small meshes, if possible
          aiProcess_SplitByBoneCount         | // split meshes with too many bones; necessary for our (limited) hardware skinning shader
          0;

      aiPropertyStore* props = aiCreatePropertyStore();
      aiSetImportPropertyInteger(props, AI_CONFIG_IMPORT_TER_MAKE_UVS, 1);
      aiSetImportPropertyFloat(props, AI_CONFIG_PP_GSN_MAX_SMOOTHING_ANGLE, g_smoothAngle);
      aiSetImportPropertyInteger(props, AI_CONFIG_PP_SBP_REMOVE, nopointslines ? aiPrimitiveType_LINE | aiPrimitiveType_POINT : 0);
      aiSetImportPropertyInteger(props, AI_CONFIG_GLOB_MEASURE_TIME, 1);
      //aiSetImportPropertyInteger(props, AI_CONFIG_PP_PTV_KEEP_HIERARCHY, 1);

      // Call ASSIMP's C API to load the file
      g_pcAsset->pcScene = (aiScene*)aiImportFileExWithProperties(g_szFileName,
          ppsteps |                       /* configurable pp steps */
          aiProcess_GenSmoothNormals    | // generate smooth normal vectors if not existing
          aiProcess_SplitLargeMeshes    | // split large, unrenderable meshes into submeshes
          aiProcess_Triangulate         | // triangulate polygons with more than 3 edges
          aiProcess_ConvertToLeftHanded | // convert everything to D3D left-handed space
          aiProcess_SortByPType         | // make 'clean' meshes which consist of a single type of primitives
          0,
          NULL, props);
      aiReleasePropertyStore(props);

    As you can see, the code is nearly identical because I copied it from Assimp view. What could be the reason for such a difference in performance? The two programs are using the same DLL, Assimp64.dll (compiled on my computer with VC++ 2010 Express), and the same function aiImportFileExWithProperties to load the model, so I assume that the actual code employed is the same. How is it possible that aiImportFileExWithProperties is 100 times slower when called by my software than when called by Assimp view? What am I missing? I am not good with DLLs and dynamic and static libraries, so I might be missing something obvious.

    UPDATE: I found out the reason why the code is going slower. Basically I was running my software with "Start debugging" in VC++ 2010 Express. If I run the code outside VC++ 2010, I get the same performance as Assimp view. However, now I have a new question. Why does the DLL perform slower under the VC++ debugger? I compiled it in release mode without debugging information. Is there any way to have the DLL go fast in debug mode, i.e. while not debugging the DLL itself? I am interested in debugging only my own code, not the DLL, which I assume is already working fine. I do not want to wait 2 minutes every time I want to load my software to debug. Does this request make sense?
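    A note on the closing question (this is general Windows/Visual Studio behaviour, not something from the post): when a process is launched under the Visual Studio debugger, Windows enables the debug heap for it, even for release builds and for release-mode DLLs such as Assimp64.dll, and allocation-heavy code like Assimp's importers can easily run 10-100x slower as a result. Disabling it usually restores release-like speed while still letting you debug your own code, for example by adding the environment variable below under Project Properties > Debugging > Environment:

      _NO_DEBUG_HEAP=1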

    Read the article

  • Testing subpackage modules in Python 3

    - by Mitchell Model
    I have been experimenting with various uses of hierarchies like this and the differences between absolute and relative imports, and can't figure out how to do routine things with the package, subpackages, and modules without simply putting everything on sys.path. I have a two-level package hierarchy:

      MyApp
          __init__.py
          Application
              __init__.py
              Module1
              Module2
              ...
          Domain
              __init__.py
              Module1
              Module2
              ...
          UI
              __init__.py
              Module1
              Module2
              ...

    I want to be able to do the following:

    1. Run test code in a module's "if main" block when the module imports from other modules in the same directory.
    2. Have one or more test modules in each subpackage that run unit tests on the modules in that subpackage.
    3. Have a set of unit tests that reside someplace reasonable but outside the subpackages, either in a sibling package, at the top-level package, or outside the top-level package (though all these might end up doing is running the tests in each subpackage).
    4. "Enter" the structure from any of the three subpackage levels, e.g. run code that just uses Domain modules, run code that uses Application modules (where Application uses code from both Application and Domain), and run code from the UI that uses code from both UI and Application; for instance, Application test code would import Application modules but not Domain modules.
    5. After developing the bulk of the code without subpackages, continue developing and testing after organizing the modules into this hierarchy.

    I know how to use relative imports so that external code that puts MyApp on its sys.path can import MyApp, import any subpackages it wants, and import things from their modules, while the modules in each subpackage can import other modules from the same subpackage or from sibling packages. However, the development needs listed above seem incompatible with subpackage structuring -- in other words, I can't have it both ways: a well-structured multi-level package hierarchy used from the outside and also used from within, in particular for testing, but also because modules from one design level (in particular the UI) should not import modules from a design level below the next one down. Sorry for the long essay, but I think it fairly represents the struggles a lot of people have been having adapting to the new relative import mechanisms.
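    Not from the post, but one concrete arrangement that covers several of these points: use absolute imports inside the test modules and always start Python from the directory that contains MyApp (or put only that directory on sys.path), so the same test file works whether it is run directly with -m or collected from outside. A minimal sketch, with the module and test names purely hypothetical:

      # MyApp/Application/test_module1.py  (hypothetical file)
      import unittest
      from MyApp.Application import Module1   # absolute import; works when the dir containing MyApp is on sys.path

      class TestModule1(unittest.TestCase):
          def test_module_loads(self):
              self.assertTrue(hasattr(Module1, '__name__'))

      if __name__ == '__main__':
          unittest.main()

      # run it from the directory containing MyApp, e.g.:
      #   python -m MyApp.Application.test_module1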

    Read the article

  • How Do You Actually Model Data?

    Since the 1970s, developers, analysts and DBAs have been able to represent concepts and relations in the form of data through the use of generic symbols. But what is data modeling? The first time I actually heard this term I could not understand why anyone would want to display a computer on a fashion show runway. Hey, what do you expect? At that time I was a freshman in community college, and obviously this was a long time ago. I have since had the chance to learn what data modeling truly is through using it.

    Data modeling is a process of breaking down information and/or requirements into common categories called objects. Once objects start being defined, relationships start to form based on dependencies found among other existing objects. Currently, there are several tools on the market that help data designers actually map out objects and their relationships through the use of symbols and lines. These diagrams allow designs to be reviewed from several perspectives, so that designers can ensure that they have the optimal data design for their project and that the design is flexible enough to allow for potential changes and/or extensions in the future. Additionally, these basic models can always be further refined to show different levels of detail depending on the target audience, through the use of three different types of models.

    Conceptual Data Model (CDM): Conceptual data models include all key entities and relationships, giving a viewer a high-level understanding of attributes. Conceptual data models are created by gathering and analyzing information from various sources pertaining to a project during the typical planning phase of a project.

    Logical Data Model (LDM): Logical data models are conceptual data models that have been expanded to include implementation details pertaining to the data that will be stored. Additionally, this model typically represents an organization's business requirements and business rules by defining the data types of attributes and the relationships for each entity. This additional information can be directly translated to the physical data model, which reduces the actual time needed to implement it.

    Physical Data Model (PDM): Physical data models are transformed logical data models that include the tables, columns, relationships and database properties necessary for the creation of a database. This model also allows for considerations regarding performance, indexing and denormalization, which are applied through database rules and data integrity constraints.

    Further reasons why we use models in modern application/database development can be seen in the benefits that data modeling provides for data modelers and for projects themselves. Benefits of data modeling, according to Applied Information Science:

    Abstraction allows data designers to separate concepts and ideas from hard facts in the form of data. This gives data designers the ability to express general concepts and/or ideas in a generic form, using symbols to represent data items and the relationships between them.

    Transparency through the use of data models allows complex ideas to be translated into simple symbols, so that the concept can be understood from all viewpoints, limiting confusion and misunderstanding.

    Effectiveness refers to tuning a model for acceptable performance while maintaining affordable operational costs. In addition, it allows systems to be built on a solid foundation in terms of data.

    I shudder at the thought of a world without data modeling. Think about it: data is everywhere in our lives. Data modeling allows a design to be optimized for performance and for the reduction of duplication. If one were to design a database without data modeling, I would think the first thing to suffer would be database performance, due to the poorly designed database, and there would be a greater chance of unnecessary data duplication, which would also feed into excessive query times because unneeded records would need to be processed. You could say that designing a database without a data model is like a box of chocolates: you will never know what kind of database you will get until after it is built.

    Read the article

  • How do I import a Targa sequence as a composition in After Effects CS4?

    - by George Profenza
    I am a complete n00b with After Effects, but I want to achieve something basic. I have a sequence of Targa (.tga) files named alphanumerically (name_0000, where 0000 is the frame). I want to import those as a composition. What I'm after is having each .tga file sitting in the timeline in sequence. I tried File > Import > File, selected the first file, then selected the last while holding Shift, and ticked the sequence checkbox, but I cannot select Composition from that dialog, I can only select Footage, and I don't know why. Hints?

    Read the article

  • What is a good automated data import method for SQL Server?

    - by Joel Potter
    I'm in the process of porting some SQL Server 2005 databases to SQL Server 2008. One of these databases has an associated import application (a Windows task) which uses SSIS with a DTS package to import a large dataset from an MS Access database nightly. In upgrading to SQL Server 2008, I discovered that I can't run the same console application which has been performing the imports, due to the missing manageddts DLL in SQL Server 2008. It's several years old and in need of a rewrite for various reasons; plus, I've been fairly unhappy with DTS in general. The original reason DTS was chosen was speed (5 min import time compared to 30+ for ADO.NET). The format of the data to import is out of my control (the client likes Access). I would also like to be able to run the import from a machine completely separate from the server hosting SQL Server, and preferably with minimal SQL features installed. Options I've considered:

    - Creating an Access application to connect to both databases (SQL Server and Access) and perform the import (ugh!)
    - Revisiting ADO.NET to see if the original implementation was poorly written.
    - Updated SSIS packages.

    What other technologies should I be considering for this job?
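    If the ADO.NET route gets revisited, one option worth noting (an editorial aside, not from the post) is streaming the Access data into SQL Server with SqlBulkCopy, which avoids the row-by-row inserts that usually make plain ADO.NET slow. A rough sketch, with the connection strings and table names as placeholder assumptions:

      // Sketch: bulk-copy an Access table into SQL Server (names are hypothetical)
      using System.Data.OleDb;
      using System.Data.SqlClient;

      class ImportJob
      {
          static void Main()
          {
              var accessConn = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\source.mdb";
              var sqlConn = @"Data Source=.;Initial Catalog=TargetDb;Integrated Security=True";

              using (var src = new OleDbConnection(accessConn))
              using (var dest = new SqlConnection(sqlConn))
              {
                  src.Open();
                  dest.Open();
                  using (var reader = new OleDbCommand("SELECT * FROM SourceTable", src).ExecuteReader())
                  using (var bulk = new SqlBulkCopy(dest) { DestinationTableName = "dbo.TargetTable", BatchSize = 5000 })
                  {
                      // add bulk.ColumnMappings entries here if the two schemas differ
                      bulk.WriteToServer(reader);
                  }
              }
          }
      }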

    Read the article

  • Should I *always* import my file references into the database in drupal?

    - by sprugman
    I have a CCK type with an image field and a unique_id text field. The file name of the image is based on the unique_id. All of the content, including the image itself, is being generated automatically via another process, and I'm parsing what that generates into nodes. Rather than creating separate fields for the id and the image, and doing an official import of the image into the files table, I'm tempted to only create the id field and create the file reference in the theme layer. I can think of pros and cons:

    1) Theme layer approach
       Pros:
       - makes the import process much less complex
       - don't have to worry about syncing the db with the file system as things change
       - more flexible: I can move my images around more easily if I want
       Cons:
       - maybe not as much The Drupal Way™
       - not as pure: I'll wind up with more logic on the theme side

    2) Import approach
       Pros:
       - the import method is required if we ever wanted to make the files private (we won't)
       - safer? Maybe I'll know if there's a problem with the image at import time, rather than view time. Since I'll be bulk importing, that might make a difference.
       - if I delete a node through the admin interface, Drupal might be able to delete the file for me as well
       Con:
       - more complex import and maintenance

    All else being equal, simpler is always better, so I'm leaning toward #1. Are there any other issues I'm missing? (Since this is an open ended question, I guess I'll make it a community wiki, whatever that means.)

    Read the article

  • How To Export/Import a Website in IIS 7.x

    - by Tray Harrison
    IIS 6 had a great feature called 'Save Configuration to a File' which would allow you to easily export a website's configuration, to be imported later either on the same server or on another box. This came in handy anytime you wanted to duplicate a site in order to do some testing without impacting the existing application. So naturally, Microsoft decided to do away with this feature in IIS 7. The process to export/import a site is still fairly simple, though not as obvious as it was in previous versions. Here are the steps:

    1. Open a command prompt, navigate to C:\Windows\System32\inetsrv, and run the following command:

         appcmd list site /name:<sitename> /config /xml > C:\output.xml

       So if you wanted to export a website named EAC, you would run:

         appcmd list site /name:EAC /config /xml > C:\output.xml

    2. If you'll be setting up another copy of the site on the same server, you'll now need to edit the output.xml file before importing it. This is necessary in order to avoid conflicts such as bindings, Site ID, etc. To do this, edit the XML and change those values. Go ahead and make a copy of the home directory, and rename it to whatever folder name you specified in the output (/EAC2 in this example). If you decide to change the app pool, make sure you create the new app pool as well.

    3. Once these edits have been made, we are ready to import the site. To do that, run:

         appcmd add sites /in < c:\output.xml

    That's it. You should now see your site listed when opening up Inet Manager. If for some reason the site fails to start, that's probably because you forgot to create the new app pool or there is a problem with one of the other parameters you changed. Look at the System log to identify issues like this.

    Read the article
