Search Results

Search found 16022 results on 641 pages for 'model export'.


  • Exporting spritesheet for Cocos2d

    - by Terko
    I would like to know how people usually save their animations so that they can be loaded easily in Cocos2d with as little hard-coding as possible. For example, the solution I thought of is to have one plist file containing information about each frame, and a second plist containing information about each animation (the animation's name, which frames to play, and probably the delay). If this is the right approach, how can I generate such plist files for a spritesheet automatically?
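
    As a rough illustration of what loading from two such plists might look like, here is a minimal cocos2d-iphone sketch. The file names, the "walk" key, and the plist layout are assumptions, and the CCAnimation factory method is named differently between cocos2d 1.x and 2.x:

        // Register all frames from the spritesheet plist (e.g. produced by TexturePacker or Zwoptex).
        [[CCSpriteFrameCache sharedSpriteFrameCache] addSpriteFramesWithFile:@"frames.plist"];

        // Read the (hypothetical) animation plist: a dictionary of { name : { frames : [...], delay : 0.1 } }.
        NSString *path = [[NSBundle mainBundle] pathForResource:@"animations" ofType:@"plist"];
        NSDictionary *animations = [NSDictionary dictionaryWithContentsOfFile:path];
        NSDictionary *walk = [animations objectForKey:@"walk"];

        NSMutableArray *frames = [NSMutableArray array];
        for (NSString *frameName in [walk objectForKey:@"frames"]) {
            [frames addObject:[[CCSpriteFrameCache sharedSpriteFrameCache] spriteFrameByName:frameName]];
        }

        // cocos2d 1.x; in 2.x this becomes animationWithSpriteFrames:delay:.
        CCAnimation *animation = [CCAnimation animationWithFrames:frames
                                                            delay:[[walk objectForKey:@"delay"] floatValue]];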

    Read the article

  • Going Beyond the Relational Model with Data

    SQL is a powerful tool for querying data, and for aggregating it. However, you can't easily use it to draw inferences, to make predictions, or to tease out subtle correlations. To provide ever more sophisticated inferences to businesses, the race is on to combine the power of the relational model with advanced statistical packages. Both IBM and PostgreSQL are ready with solutions. And SQL Server? Hmm...

    Read the article

  • How can I share an entity framework model across website users

    - by richardmoss
    Hello, currently my website is based around MVC and the Entity Framework running against a SQL Server 2005 database. So far it has all been running very smoothly, and I really enjoy MVC and its slimmer, more concise code (and no huge viewstates or soul-destroying postbacks ;)).

    Recently I was working on upgrading the site to use a simple forum system, and this is where I started running into problems. When I was testing the site using two different browsers, if I created or replied to a post in one browser, the other browser couldn't see the post. At the moment each visitor to the site gets their own copy of the entity model, which I store in their session data. Obviously this is the problem, as updates to one model aren't getting carried to the other.

    As a test, I tried storing a single copy of the model which all visitors would access by assigning the model to a static variable. This worked, and both browsers could see each other's modifications. However, it had its side effects. For example, if I fired up both browsers at the same time while the model was being initialized, one browser would crash and the other would work fine, despite me using a locking object, so in theory one of them should have been delayed until the model was ready (of course I could have implemented this wrong ;)). Also, originally this site did use one model for all visitors, and when it was live it frequently shut down - killing the IIS application pool while it did. Now I'm not sure if this was related, but I don't really want to reintroduce whatever bug caused that shutdown.

    So, my question is a simple one really - what is the best way of either using the same model for all website users so they all see updates, or, if they do have separate copies (which I imagine will have a performance impact in time), how can the models detect changes in the database and update themselves accordingly?

    Thanks in advance for any advice! Regards; Richard Moss
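
    For reference, the usual way round this is not to share one long-lived model at all, but to create a short-lived ObjectContext per HTTP request so that every request reads fresh data. A minimal sketch of that pattern follows, assuming a generated context type called MyEntities (the class name and dictionary key are placeholders):

        using System.Web;

        // One ObjectContext per HTTP request, stored in HttpContext.Items and
        // disposed from Application_EndRequest in Global.asax.
        public static class ContextPerRequest
        {
            private const string Key = "__entityContext";

            public static MyEntities Current
            {
                get
                {
                    var items = HttpContext.Current.Items;
                    if (items[Key] == null)
                        items[Key] = new MyEntities();
                    return (MyEntities)items[Key];
                }
            }

            // Call this from Application_EndRequest.
            public static void DisposeCurrent()
            {
                var context = HttpContext.Current.Items[Key] as MyEntities;
                if (context != null)
                    context.Dispose();
            }
        }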

    Read the article

  • How to export SQL Server data from corrupted database (with disk write error)

    - by damitamit
    IT realised there was a disk write error on our production SQL Server 2005 box, which was causing the backups to fail. By the time they realised this, the nightly backup was already old, so we were not able to simply restore the backup on another server. The database is still running and being used constantly. However, DBCC CHECKDB fails, and the SQL Server backup task, Copy Database, and the Export Data wizard all fail as well. It does seem that all the data can still be read from the tables (i.e. using bcp etc.). Another observation I have made is that the transaction log is nearly double the size of the database. (Does that mean the changes aren't being written to the MDF?)

    What would be the best plan of attack to get the database to a state where backups are working and the data is safe?

    - Take the database offline and use the MDF/LDF to somehow create the database on another SQL Server?
    - Export the data from the database using bcp, create the database on another SQL Server (using the Generate Scripts function on the corrupt db to create the schema on the new db), and use bcp again to import the data?
    - Some other option that is the right course of action in this situation?

    The IT manager says the data is safe because, if the server fails, the data can be restored from the MDF/LDF. I'm not sure, so I insisted that we start exporting the data each night as a failsafe (using bcp, for example). IT are also having issues on the hardware side of things, as supposedly the disk error is on a virtualized disk and can't be rebuilt like a normal RAID array (or something like that). Please excuse my use of incorrect terminology and incorrect assumptions about how SQL Server operates - I'm the application developer and have been called in to help (as it seems IT know less about SQL Server than I do). Many thanks, Amit

    Results of DBCC CHECKDB:

        Msg 1823, Level 16, State 2, Line 1
        A database snapshot cannot be created because it failed to start.
        Msg 7928, Level 16, State 1, Line 1
        The database snapshot for online checks could not be created. Either the reason is given in a previous error or one of the underlying volumes does not support sparse files or alternate streams. Attempting to get exclusive access to run checks offline.
        Msg 5030, Level 16, State 12, Line 1
        The database could not be exclusively locked to perform the operation.
        Msg 7926, Level 16, State 1, Line 1
        Check statement aborted. The database could not be checked as a database snapshot could not be created and the database or table could not be locked. See Books Online for details of when this behavior is expected and what workarounds exist. Also see previous errors for more details.
        Msg 823, Level 24, State 3, Line 1
        The operating system returned error 1 (error not found) to SQL Server during a write at offset 0x00000674706000 in file 'G:\AX40_Dynamics_Live.mdf'. Additional messages in the SQL Server error log and system event log may provide more detail. This is a severe system-level error condition that threatens database integrity and must be corrected immediately. Complete a full database consistency check (DBCC CHECKDB). This error can be caused by many factors; for more information, see SQL Server Books Online.
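
    For what it's worth, the bcp round-trip mentioned above is straightforward from the command line. A rough sketch, assuming Windows authentication; the server names and table name are placeholders, and the database name is taken from the MDF path in the error message:

        rem Export one table in native format from the damaged database
        bcp AX40_Dynamics_Live.dbo.SomeTable out C:\export\SomeTable.dat -n -T -S OLDSERVER

        rem After creating the empty schema on the new server (Generate Scripts),
        rem load the data back in
        bcp NewDatabase.dbo.SomeTable in C:\export\SomeTable.dat -n -T -S NEWSERVER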

    Read the article

  • Backbone.js Model change events in nested collections not firing as expected

    - by Pallavi Kaushik
    I'm trying to use backbone.js in my first "real" application and I need some help debugging why certain model change events are not firing as I would expect.

    I have a web service at /employees/{username}/tasks which returns a JSON array of task objects, with each task object nesting a JSON array of subtask objects. For example:

        [{
            "id": 45002,
            "name": "Open Dining Room",
            "subtasks": [
                {"id": 1, "status": "YELLOW", "name": "Clean all tables"},
                {"id": 2, "status": "RED", "name": "Clean main floor"},
                {"id": 3, "status": "RED", "name": "Stock condiments"},
                {"id": 4, "status": "YELLOW", "name": "Check / replenish trays"}
            ]
        }, {
            "id": 47003,
            "name": "Open Registers",
            "subtasks": [
                {"id": 1, "status": "YELLOW", "name": "Turn on all terminals"},
                {"id": 2, "status": "YELLOW", "name": "Balance out cash trays"},
                {"id": 3, "status": "YELLOW", "name": "Check in promo codes"},
                {"id": 4, "status": "YELLOW", "name": "Check register promo placards"}
            ]
        }]

    Another web service allows me to change the status of a specific subtask in a specific task, and looks like this: /tasks/45002/subtasks/1/status/red. [Aside - I intend to change this to an HTTP POST-based service, but the current implementation is easier for debugging.]

    I have the following classes in my JS app. Subtask model and Subtask collection:

        var Subtask = Backbone.Model.extend({});

        var SubtaskCollection = Backbone.Collection.extend({
            model: Subtask
        });

    Task model with a nested instance of a Subtask collection:

        var Task = Backbone.Model.extend({
            initialize: function() {
                // each Task has a reference to a collection of Subtasks
                this.subtasks = new SubtaskCollection(this.get("subtasks"));
                // status of each Task is based on the status of its Subtasks
                this.update_status();
            },
            ...
        });

        var TaskCollection = Backbone.Collection.extend({
            model: Task
        });

    Task view that renders the item and listens for change events on the model:

        var TaskView = Backbone.View.extend({
            tagName: "li",
            template: $("#TaskTemplate").template(),
            initialize: function() {
                _.bindAll(this, "on_change", "render");
                this.model.bind("change", this.on_change);
            },
            ...
            on_change: function(e) {
                alert("task model changed!");
            }
        });

    When the app launches, I instantiate a TaskCollection (using the data from the first web service listed above), bind a listener for change events to the TaskCollection, and set up a recurring interval to fetch() the TaskCollection instance:

        TASKS = new TaskCollection();
        TASKS.url = ".../employees/" + username + "/tasks";
        TASKS.fetch({
            success: function() {
                APP.renderViews();
            }
        });

        TASKS.bind("change", function() {
            alert("collection changed!");
            APP.renderViews();
        });

        // Poll every 5 seconds to keep the models up-to-date.
        setInterval(function() {
            TASKS.fetch();
        }, 5000);

    Everything renders as expected the first time. But at this point, I would expect either (or both) a collection change event or a model change event to get fired if I change a subtask's status using my second web service, but this does not happen. Funnily, I did get change events to fire if I added one additional level of nesting, with the web service returning a single object that has the Tasks collection embedded, for example: "employee":"pkaushik", "tasks":[{"id":45002,"subtasks":[{"id":1..... But this seems kludgey... and I'm afraid I haven't architected my app right. I'll include more code if it helps, but this question is already rather verbose. Thoughts?
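
    One thing worth checking (a guess, not a confirmed diagnosis): in Backbone of this vintage, a plain Collection.fetch() resets the collection rather than updating the existing models, so it raises "reset" rather than "change", and the nested this.subtasks collection built in initialize is never refreshed. Because the reset also replaces the Task models themselves, a merge/update-style fetch (the option name depends on the Backbone version) may be needed before per-model "change" events fire at all. A sketch of the idea, with update_status assumed to exist:

        var Task = Backbone.Model.extend({
            initialize: function() {
                var self = this;
                this.subtasks = new SubtaskCollection(this.get("subtasks"));

                // When a later fetch replaces the "subtasks" attribute, push the
                // new data into the nested collection and recompute the status.
                this.bind("change:subtasks", function() {
                    self.subtasks.reset(self.get("subtasks"));
                    self.update_status();
                });

                this.update_status();
            }
        });

        // A plain collection fetch triggers "reset", not "change".
        TASKS.bind("reset", function() {
            APP.renderViews();
        });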

    Read the article

  • Unit Testing (xUnit) an ASP.NET Mvc Controller with a custom input model?

    - by Danny Douglass
    I'm having a hard time finding information on what I expect to be a pretty straightforward scenario. I'm trying to unit test an action on my ASP.NET MVC 2 controller that uses a custom input model with DataAnnotations. My testing framework is xUnit, as mentioned in the title.

    Here is my custom input model:

        public class EnterPasswordInputModel
        {
            [Required(ErrorMessage = "")]
            public string Username { get; set; }

            [Required(ErrorMessage = "Password is a required field.")]
            public string Password { get; set; }
        }

    And here is my controller (I took out some logic to simplify for this example):

        [HttpPost]
        public ActionResult EnterPassword(EnterPasswordInputModel enterPasswordInput)
        {
            if (!ModelState.IsValid)
                return View();

            // do some logic to validate input

            // if valid - next view on successful validation
            return View("NextViewName");

            // else - add and display error on current view
            return View();
        }

    And here is my xUnit fact (also simplified):

        [Fact]
        public void EnterPassword_WithValidInput_ReturnsNextView()
        {
            // Arrange
            var controller = CreateLoginController(userService.Object);

            // Act
            var result = controller.EnterPassword(
                new EnterPasswordInputModel
                {
                    Username = username,
                    Password = password
                }) as ViewResult;

            // Assert
            Assert.Equal("NextViewName", result.ViewName);
        }

    When I run my test I get the following error on my test fact when trying to retrieve the controller result (Act section): System.NullReferenceException: Object reference not set to an instance of an object. Thanks in advance for any help you can offer!
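
    Two notes that may help when testing actions like this (offered as suggestions, not a confirmed diagnosis of the NullReferenceException): calling the action directly bypasses model binding, so the DataAnnotations are never evaluated and ModelState stays valid; and if the action returns anything other than a ViewResult, the "as ViewResult" cast silently yields null and the assert then dereferences it. A hedged xUnit sketch of testing the invalid-input branch by seeding ModelState by hand - CreateLoginController and the mocked userService are assumed from the original test, and the username value is made up:

        [Fact]
        public void EnterPassword_WithMissingPassword_ReturnsCurrentView()
        {
            // Arrange
            var controller = CreateLoginController(userService.Object);

            // Model binding does not run in a direct call, so simulate the
            // validation failure that the [Required] attribute would produce.
            controller.ModelState.AddModelError("Password", "Password is a required field.");

            // Act
            var result = controller.EnterPassword(new EnterPasswordInputModel { Username = "bob" });

            // Assert - fails with a clear message instead of a NullReferenceException
            var viewResult = Assert.IsType<ViewResult>(result);
            Assert.Equal(string.Empty, viewResult.ViewName);
        }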

    Read the article

  • Hierarchical object model with property inheritance and event bubbling?

    - by Winston Fassett
    I'm writing a document-based client application and I need a DOM- or WPF-like, but non-visual, model that:

    - Is a tree composed of elements
    - Can accept an unlimited number of custom properties that get/set any CLR type, including collections
    - Can inherit property values from the parent
    - Can inherit default values from an ancestor
    - Can have properties derived/calculated from other properties, ancestors, or descendants
    - Supports event bubbling / tunneling
    - Supports full inspection by the owning document in order to persist the tree and attributes in an XML format

    There will be a core set of properties, but other plugins may add their own or even create custom documents.

    I realize that's a tall order, but I was really hoping there would be something out there to help me get started. Unfortunately WPF DependencyObjects are too closed, proprietary, and coupled to WPF to be of any use as a document model. My needs also have a strong resemblance to the HTML DOM, but I haven't been able to find any clean DOM implementations that could be decoupled from HTML or ported to .NET. My current platform is .NET/C#, but if anyone knows of anything that might be useful for inspiration or embedding, regardless of the platform, I'd love to know.
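
    Nothing off the shelf is named in the question, but the parent-inherited lookup and the bubbling half of the requirement are compact to hand-roll. A minimal, purely illustrative C# sketch (all type and member names here are invented for the example):

        using System;
        using System.Collections.Generic;

        // Illustrative element with parent-inherited property values and event bubbling.
        public class Element
        {
            private readonly Dictionary<string, object> localValues = new Dictionary<string, object>();
            private readonly List<Element> children = new List<Element>();

            public Element Parent { get; private set; }
            public IList<Element> Children { get { return children; } }

            // Fired on the element that changed and then on every ancestor (bubbling).
            public event Action<Element, string> Changed;

            public void Add(Element child)
            {
                child.Parent = this;
                children.Add(child);
            }

            public void SetValue(string property, object value)
            {
                localValues[property] = value;
                Bubble(this, property);
            }

            // Property inheritance: walk up the tree until an ancestor supplies a value.
            public object GetValue(string property)
            {
                for (Element node = this; node != null; node = node.Parent)
                {
                    object value;
                    if (node.localValues.TryGetValue(property, out value))
                        return value;
                }
                return null; // could fall back to a registered default here
            }

            private void Bubble(Element source, string property)
            {
                for (Element node = this; node != null; node = node.Parent)
                {
                    Action<Element, string> handler = node.Changed;
                    if (handler != null)
                        handler(source, property);
                }
            }
        }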

    Read the article

  • Zend Framework - where do calls to my methods go? Controller or Model?

    - by Joel
    Hi guys, I'm confused about exactly what I should have in my controller and what in my model. Specifically, I have this in the action method:

        public function upcomingshowsAction()
        {
            $gcal = $this->_validateCalendarConnection();
            $uncleanedFeedArray = $this->_getCalendarFeed($gcal);
            $finishedFeedArray = $this->_cleanFeed($uncleanedFeedArray);
            $this->view->googleArray = $finishedFeedArray;
        }

    And then (incorrectly, I know) I still have my helper methods at the bottom of my controller. So what I'm wondering is: for the methods used in upcomingshowsAction, should all the actual methods just be in one model, so that I'd have something like this:

        public function upcomingshowsAction()
        {
            $finishedFeedArray = new Application_Model_calendarModelPage();
            $this->view->googleArray = $finishedFeedArray;
        }

    And then something like this in the model:

        class Application_Model_CalendarModelPage
        {
            $gcal = $this->_validateCalendarConnection();
            $uncleanedFeedArray = $this->_getCalendarFeed($gcal);
            $finishedFeedArray = $this->_cleanFeed($uncleanedFeedArray);

            public functions {
                ...
                ...
                ...
            }
        }

    Am I on the right track here? Thanks!
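
    For comparison, a common way to slice this in Zend Framework is to put the fetching/cleaning logic in the model as ordinary methods and have the action only orchestrate and hand the result to the view. A rough PHP sketch, with method bodies omitted and names carried over from the question:

        class Application_Model_CalendarModelPage
        {
            // Returns the cleaned feed, ready for the view.
            public function getUpcomingShows()
            {
                $gcal = $this->_validateCalendarConnection();
                $uncleanedFeedArray = $this->_getCalendarFeed($gcal);
                return $this->_cleanFeed($uncleanedFeedArray);
            }

            protected function _validateCalendarConnection() { /* ... */ }
            protected function _getCalendarFeed($gcal) { /* ... */ }
            protected function _cleanFeed($uncleanedFeedArray) { /* ... */ }
        }

    The action in the controller then reduces to:

        public function upcomingshowsAction()
        {
            $calendarPage = new Application_Model_CalendarModelPage();
            $this->view->googleArray = $calendarPage->getUpcomingShows();
        }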

    Read the article

  • Export environment variable on Mac

    - by mujer esponja
    Hello, I am starting with Spring Roo, so I downloaded it and now I'm trying to export the variable. To set it, I tried:

        PATH=$PATH:/Users/myUsr/spring-roo/bin
        export PATH

    Checking the PATH variable:

        my-Name-3:~ myUsr$ echo $PATH
        /sw/bin:/sw/sbin:/usr/local/mysql/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin:/usr/X11/bin:/usr/X11R6/bin:/Users/myUsr/spring-roo/bin

    In this path (/Users/myUsr/spring-roo/bin) there is roo.sh, but then I cannot run the command roo as it is supposed to work. Any idea? Thanks in advance.
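
    Two things commonly trip this up (suggestions only): the script on the PATH is called roo.sh, not roo, and an export made in one terminal session is not persisted for new sessions. A small shell sketch covering both, assuming a bash login shell on the Mac:

        # Make the script executable
        chmod +x /Users/myUsr/spring-roo/bin/roo.sh

        # Persist the PATH change for future terminal sessions
        echo 'export PATH="$PATH:/Users/myUsr/spring-roo/bin"' >> ~/.bash_profile
        source ~/.bash_profile

        # Then invoke it by its actual file name
        roo.sh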

    Read the article

  • Import/Export HTML5 localStorage data

    - by hamen
    Hi all, I'm working on a simple TODO list app based on the HTML5 localStorage feature: http://hamen.github.com/webnotes/ I'm wondering if it's possible to import/export the data in some way. How could I provide an "Export notes / Import notes" feature so that users can save their notes to their hard drive and import them into another browser profile? Thanks
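
    Since localStorage is just string keys and values, one workable approach is to serialize the whole store to JSON for export and write the pairs back on import. A minimal JavaScript sketch (how the resulting string is offered for download, or pasted back in, is left out):

        // Export: turn the whole store into a JSON string the user can save.
        function exportNotes() {
            var dump = {};
            for (var i = 0; i < localStorage.length; i++) {
                var key = localStorage.key(i);
                dump[key] = localStorage.getItem(key);
            }
            return JSON.stringify(dump);
        }

        // Import: write the pairs back into the other browser profile.
        function importNotes(json) {
            var dump = JSON.parse(json);
            for (var key in dump) {
                if (dump.hasOwnProperty(key)) {
                    localStorage.setItem(key, dump[key]);
                }
            }
        }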

    Read the article

  • Joomla - Force File Download / CSV Export

    - by lautaro.dragan
    I'm in need of help... this is my first time asking a question on SO, so please be kind :) I'm trying to force-download a file from PHP, so that when the user hits a certain button, he gets a file download. The file is a CSV (email, username) of all registered users. I decided to add this button to the admin users screen, as you can see in this screenshot. So I added the following code to the addToolbar function in administrator/components/com_users/views/users/view.html.php:

        JToolBarHelper::custom('users.export', 'export.png', 'export_f2.png', 'Exportar', false);

    This button is mapped to the following function in the com_users\controller\users.php controller:

        public function exportAllUsers()
        {
            ob_end_clean();
            $app = JFactory::getApplication();

            header("Content-type: text/csv");
            header("Content-Disposition: attachment; filename=ideary_users.csv");
            header("Pragma: no-cache");
            header("Expires: 0");

            echo "email,name\n";

            $model = $this->getModel("Users");
            $users = $model->getAllUsers();
            foreach ($users as $user) {
                echo $user->email . ", " . ucwords(trim($user->name)) . "\r\n";
            }

            $app->close();
        }

    Now, this is actually working perfectly fine. The issue is that after I download a file, if I hit any button in the admin that causes a POST, instead of performing the action it should, it just downloads the file over again! For example: I hit the "Export" button and users.csv downloads. Then I hit the "Search" button and users.csv downloads again... what the hell?

    I'm guessing that when I hit the export button, some JS gets called and sets the form's action attribute to a URL... and expects a response or something, and then other buttons are prevented from re-setting the form's action attribute. I can't think of any real solution for this, but I'd rather avoid hacks if possible. So, what would be the standard, elegant solution that Joomla offers in this case?

    Read the article

  • Export Flash Animation to ActionScript

    - by user107715
    Hi, I would like to make an animation using some tool (e.g. Flash CS) and then export it for ActionScript. I thought this could be done in Flash CS, but when I tried it by making an animation, converting it to a symbol and then selecting "Export for ActionScript", there were no ActionScript classes generated in the project folder. Am I doing it wrong, or do I have to use a different tool to make the animation? Many, many thanks for your help, Gloria
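
    For what it's worth, "Export for ActionScript" does not write .as files; it registers the symbol under the class name typed into the dialog so that code can instantiate it at runtime. A tiny ActionScript 3 sketch of how that exported symbol is then used (MyAnimation is an assumed class name standing in for whatever was typed in the dialog):

        // Instantiate the library symbol that was exported for ActionScript.
        // No .as file exists unless you create one yourself with the same name.
        var anim:MovieClip = new MyAnimation();
        addChild(anim);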

    Read the article

  • SVN export just the changed files from tags

    - by Eric
    Does anyone know how to export only the files that changed between two tags using svn? Let's say I have tag 1.0 and then later fix bugs in the trunk. Next, I am ready for a new patch release, so I tag it 1.1. Now I want to export only the files that changed between tag 1.0 and 1.1. Is this possible?
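
    There is no single built-in "export the diff" command, but one approach often suggested is to list the changed paths with svn diff --summarize and then export each one. A rough shell sketch, with the repository URL as a placeholder:

        REPO=http://svn.example.com/repo

        # 1. List the files that changed between the two tags
        svn diff --summarize "$REPO/tags/1.0" "$REPO/tags/1.1" > changed.txt

        # 2. Export each changed (non-deleted) item from the 1.1 tag.
        #    Summary lines look like "M       <path-or-url>"; adjust the prefix
        #    stripping below to match what your svn client actually prints.
        grep -v '^D' changed.txt | awk '{print $2}' | while read item; do
            rel=${item#"$REPO/tags/1.0/"}
            rel=${rel#"$REPO/tags/1.1/"}
            mkdir -p "export/$(dirname "$rel")"
            svn export -q "$REPO/tags/1.1/$rel" "export/$rel"
        done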

    Read the article

  • A basic T4 template for generating Model Metadata in ASP.NET MVC2

    - by rajbk
    I have been learning about T4 templates recently by looking at the awesome ADO.NET POCO entity generator. Using the POCO entity generator template as a base, I created a T4 template which generates metadata classes for a given Entity Data Model. This speeds coding by reducing the amount of typing required when creating a view-specific model and its metadata.

    To use this template, download the template provided at the bottom and set two values in the template file. The first should point to the EDM you wish to generate metadata for; the second is used to suffix the namespace and classes that get generated.

        string inputFile = @"Northwind.edmx";
        string suffix = "AutoMetadata";

    Add the template to your MVC 2 Visual Studio 2010 project. Once you add it, a number of classes will get added to your project based on the number of entities you have. One of these classes is shown below. Note that the DisplayName, Required and StringLength attributes have been added by the T4 template.

        //------------------------------------------------------------------------------
        // <auto-generated>
        //     This code was generated from a template.
        //
        //     Changes to this file may cause incorrect behavior and will be lost if
        //     the code is regenerated.
        // </auto-generated>
        //------------------------------------------------------------------------------

        using System;
        using System.ComponentModel;
        using System.ComponentModel.DataAnnotations;

        namespace NorthwindSales.ModelsAutoMetadata
        {
            public partial class CustomerAutoMetadata
            {
                [DisplayName("Customer ID")]
                [Required]
                [StringLength(5)]
                public string CustomerID { get; set; }

                [DisplayName("Company Name")]
                [Required]
                [StringLength(40)]
                public string CompanyName { get; set; }

                [DisplayName("Contact Name")]
                [StringLength(30)]
                public string ContactName { get; set; }

                [DisplayName("Contact Title")]
                [StringLength(30)]
                public string ContactTitle { get; set; }

                [DisplayName("Address")]
                [StringLength(60)]
                public string Address { get; set; }

                [DisplayName("City")]
                [StringLength(15)]
                public string City { get; set; }

                [DisplayName("Region")]
                [StringLength(15)]
                public string Region { get; set; }

                [DisplayName("Postal Code")]
                [StringLength(10)]
                public string PostalCode { get; set; }

                [DisplayName("Country")]
                [StringLength(15)]
                public string Country { get; set; }

                [DisplayName("Phone")]
                [StringLength(24)]
                public string Phone { get; set; }

                [DisplayName("Fax")]
                [StringLength(24)]
                public string Fax { get; set; }
            }
        }

    The generated class can be used from your project by creating a partial class with the entity name and setting the MetadataType attribute:

        namespace MyProject.Models
        {
            [MetadataType(typeof(CustomerAutoMetadata))]
            public partial class Customer
            {
            }
        }

    You can also copy the code in the generated metadata class and create your own ViewModel class. Note that the template is super basic and does not take complex properties into account. I have tested it with the Northwind database. This is a work in progress; feel free to modify the template to suit your requirements. Standard disclaimer follows: Use At Your Own Risk. Works on my machine running VS 2010 RTM / ASP.NET MVC 2.

    AutoMetaData.zip

    Mr. Incredible: Of course I have a secret identity. I don't know a single superhero who doesn't. Who wants the pressure of being super all the time?

    Read the article

  • ARTS Reference Model for Retail

    - by Sanjeev Sharma
    Consider a hypothetical scenario where you have been tasked with setting up retail operations for an electronic goods, daily consumables, or luxury brand retailer. It is very likely you will be faced with the following questions:

    1. What are the essential business capabilities that you must have in place?
    2. What are the essential business activities under-pinning each of the business capabilities identified in step 1?
    3. What is the set of steps that you need to perform to execute each of the business activities identified in step 2?

    Answers to the above will drive your investments in software and hardware to enable the core retail operations. More importantly, the choices you make in responding to these questions will have several implications in the short run and in the long run. In the short term, you will incur the time and cost of defining your technology requirements, procuring the software/hardware components and getting them up and running. In the long term, as you grow your operations organically or through M&A, partnerships and franchise business models, you will invariably need to make more technology investments to manage the greater complexity (scale and scope) of business operations.

    "As new software applications, such as time & attendance, labor scheduling, and POS transactions, just to mention a few, are introduced into the store environment, it takes a disproportionate amount of time and effort to integrate them with existing store applications. These integration projects can add up to 50 percent to the time needed to implement a new software application and contribute significantly to the cost of the overall project, particularly if a systems integrator is called in. This has been the reality that all retailers have had to live with over the last two decades. The effect of the environment has not only been to increase costs, but also to limit retailers' ability to implement change and the speed with which they can do so." (excerpt taken from here)

    Now, one would think a lot of retailers would have already gone through the pain of finding answers to these questions, so why re-invent the wheel? Precisely so: a major effort began almost 17 years ago in the retail industry to make it less expensive and less difficult to deploy new technology in stores and at the retail enterprise level. This effort is called the Association for Retail Technology Standards (ARTS). Without standards such as those defined by ARTS, you would very likely end up experiencing the following:

    - Increased time and cost, due to resource wastage arising from re-inventing the wheel, i.e. re-creating vanilla processes from scratch, and incurring otherwise avoidable mistakes and errors by ignoring the experience of others
    - Sub-optimal process efficiency, due to a narrow, isolated view of processes that ignores process inter-dependencies, i.e. optimizing the parts but not the whole, resulting in a lack of transparency and inter-departmental finger-pointing

    Embracing ARTS standards as a blueprint for establishing, managing or streamlining your retail operations can benefit you in the following ways:

    - Improved time-to-market, from parity with industry best-practice processes such as ARTS, thus avoiding "reinventing the wheel" for common retail processes and focusing more on customizing processes for differentiation, and from lowering integration complexity and risk with a standardized vocabulary for exchange between internal and external (i.e. partner) systems
    - Lower operating costs, by embracing the ARTS enterprise-wide process reference model for developing and streamlining retail operations holistically instead of taking a narrow, silo-ed view, and by procuring IT systems in compliance with ARTS, thus avoiding IT budget marginalization

    While parity with an industry standard such as the ARTS business process model does not by itself create differentiation, it does provide a higher starting point for bridging the strategy-execution gap in setting up and improving retail operations.

    Read the article

  • Could not start ZK at requested port of 2181, despite export HBASE_MANAGES_ZK=false

    - by utrecht
    Problem

    The first aim was to run HBase standalone. Navigating to ip:60010/master-status is successful once HBase has been started. The second aim is to run a distinct ZooKeeper quorum. ZooKeeper has been downloaded and started:

        netstat -nato | grep 2181
        tcp        0      0 :::2181        :::*        LISTEN      off (0.00/0/0)

    conf/hbase-env.sh was changed as follows, so that HBase does not start ZooKeeper itself when it starts up:

        # Tell HBase whether it should manage it's own instance of Zookeeper or not.
        export HBASE_MANAGES_ZK=false

    However, the following error occurs once HBase has been started:

        Could not start ZK at requested port of 2181.  ZK was started at port: 2182.
        Aborting as clients (e.g. shell) will not be able to find this ZK quorum.

    Question

    How do I disable the startup of ZooKeeper by HBase and run ZooKeeper separately?
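
    One likely explanation (offered as a suggestion): HBASE_MANAGES_ZK only applies in distributed mode; a fully standalone HBase spins up its own in-process ZooKeeper regardless, which then collides with the instance already listening on 2181. A hedged hbase-site.xml sketch for pointing HBase at the external quorum instead, with the hostname as a placeholder:

        <configuration>
          <!-- Run in (pseudo-)distributed mode so that HBASE_MANAGES_ZK=false is honoured -->
          <property>
            <name>hbase.cluster.distributed</name>
            <value>true</value>
          </property>
          <!-- Point HBase at the externally managed ZooKeeper -->
          <property>
            <name>hbase.zookeeper.quorum</name>
            <value>localhost</value>
          </property>
          <property>
            <name>hbase.zookeeper.property.clientPort</name>
            <value>2181</value>
          </property>
        </configuration>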

    Read the article

  • mongoexport csv output array values

    - by 9point6
    I'm using mongoexport to export some collections into CSV files; however, when I try to target fields which are members of an array, I cannot get it to export correctly. The command I'm using:

        mongoexport -d db -c collection -fieldFile fields.txt --csv > out.csv

    and the contents of fields.txt are similar to:

        id
        name
        address[0].line1
        address[0].line2
        address[0].city
        address[0].country
        address[0].postcode

    where the BSON data would be:

        {
            "id": 1,
            "name": "example",
            "address": [
                {
                    "line1": "flat 123",
                    "line2": "123 Fake St.",
                    "city": "London",
                    "country": "England",
                    "postcode": "N1 1AA"
                }
            ]
        }

    What is the correct syntax for exporting the contents of an array?
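
    For reference, MongoDB's own dot notation addresses array elements by numeric index rather than with square brackets, so a fields file written that way is worth trying with the same mongoexport command (a suggestion; behaviour may differ between mongoexport versions):

        id
        name
        address.0.line1
        address.0.line2
        address.0.city
        address.0.country
        address.0.postcode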

    Read the article
