Search Results

Search found 21098 results on 844 pages for 'model import'.


  • php import large table to phpmyadmin database

    - by safaali
    Hi, I'm in trouble: I accidentally dropped one of the tables from the database. Fortunately, I have a backup (made with "Auto backup for MySQL"), stored as a 56 MB .txt file on my PC. I tried to import it through phpMyAdmin, but the import failed because the file is too large. I then uploaded the file to the /home/tablebk directory. I have some experience in PHP and know I could import it with code like the following, but I don't know the SQL statement for this import. What do I have to put in the $line variable? Please help!

        <?php
        $dbhost = 'localhost';
        $dbuser = 'mysite';
        $dbpw   = 'password';
        $dbname = 'databasename';

        // connect first -- the script defines these values but must also use them
        $link = mysql_connect($dbhost, $dbuser, $dbpw) or die(mysql_error());
        mysql_select_db($dbname, $link) or die(mysql_error());

        $file = @fopen('country.txt', 'r');
        if ($file) {
            while (!feof($file)) {
                $line = trim(fgets($file));
                if ($line === '') {
                    continue; // skip blank lines
                }
                $flag = mysql_query($line);
                if ($flag) { // isset($flag) is always true; test the query result instead
                    echo 'Inserted successfully<br />';
                } else {
                    echo mysql_error() . '<br />';
                }
                flush();
            }
            fclose($file);
        }
        echo '<br />End of File';
        ?>
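
    The same streaming approach can be sketched outside PHP. A minimal Python version, assuming the backup holds one complete SQL statement per line and that the mysql-connector-python package is available (connection values copied from the script above; the file path is a placeholder):

        import mysql.connector  # assumes mysql-connector-python is installed

        conn = mysql.connector.connect(host='localhost', user='mysite',
                                       password='password', database='databasename')
        cur = conn.cursor()
        with open('country.txt') as dump:        # the uploaded backup file
            for line in dump:
                stmt = line.strip()
                if not stmt or stmt.startswith('--'):
                    continue                     # skip blank lines and SQL comments
                cur.execute(stmt)                # each line is itself the SQL to run
        conn.commit()
        cur.close()
        conn.close()

    For a 56 MB dump it is often simpler still to feed the file straight to the command-line client: mysql -u mysite -p databasename < country.txt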

    Read the article

  • Win32 C++ Import path based on OS?

    - by Zenox
    I'm working with some legacy code that has an import like so:

        #import "C:\Program Files\Common Files\System\ado\msado15.dll" rename("EOF", "EndOfFile")

    The problem is that on an x64 machine the path for this import is under the 'Program Files (x86)' directory. Is there a preprocessor macro I can wrap around this to make it work on either? Edit: I think I found it: _M_X64, but I'm not 100% sure that this is correct.

    Read the article

  • How do I import Amazon MP3s with Banshee and the new Amazon Cloud Player?

    - by adempewolff
    Banshee's Amazon MP3 Import extension until recently allowed seamless importing of songs purchased from Amazon MP3. It did this by a) opening .amz files and using them to connect to Amazon's servers and download the purchased files, and b) using hooks in Banshee's built-in browser to automatically recognize and open .amz files when they were clicked in the browser. Recently, however, this functionality stopped working: Banshee displays "Contacting Server" in the lower left-hand corner for a little while and then stops. Furthermore, opening the Amazon Cloud Player in the Banshee browser, or any other browser on a Linux system, to manually download the .amz file now results in the message: "On Linux systems, Cloud Player only supports downloading songs one at a time. To download your music, deselect all checkboxes, select the checkbox for the song you want to download, then click the 'Download' button." How can I get around this and import my purchased music into Banshee as I used to?

    Read the article

  • How to import a PDF into LibreOffice? Under Ubuntu, all pages are blank

    - by Daniele
    I have some .pdf files generated by a scanner that I want to import into LibreOffice and do some small edits. Each PDF has only one object per page: a page-size image. If I open one in LibreOffice under Ubuntu 12.10, it imports "successfully" but all pages are blank. I have the libreoffice-pdfimport package installed. This happens with both LibreOffice 3.6 (part of Ubuntu 12.10) and 4.0.2 from the LibreOffice PPA. The same .pdf files open perfectly fine in LibreOffice for Windows and LibreOffice for Mac (yes, I have three computers with all three OSes), but on Ubuntu 12.10 all pages are blank, so I can only conclude this is an issue with the Ubuntu packaging, or that something really weird prevents it from working under Linux. How can I import these kinds of .pdf into LibreOffice for editing?

    Read the article

  • Passing variables from Model to Model in CodeIgniter

    - by Craig Ward
    I need to pass a variable to a model; that model sends another value back, and I then use that value to query a different model. E.g.: I have a product_ID which I send to the product model, and from that I find out the supplier_ID. I then want to pass that supplier_ID to the supplier model to get the supplier name. How do you implement this in CodeIgniter?

    Read the article

  • Set CSV import default to UTF-8 in Calc

    - by picca
    Every time I open a CSV (comma separated values) document in OpenOffice.org Calc I get a dialog with CSV preferences. The current default character set is "Eastern Europe (ISO-8859-2)". I would like "UTF-8" to be selected by default instead.

    Read the article

  • Import Firefox passwords into KeePassX or KeePass2

    - by rubo77
    I have an XML export of my Firefox passwords in this form (I replaced the real passwords with ****):

        <xml>
        <entries ext="Password Exporter" extxmlversion="1.1" type="saved" encrypt="false">
        <entry host="chrome://weave" user="****" password="****" formSubmitURL="" httpRealm="Mozilla Services Password" userFieldName="" passFieldName=""/>
        <entry host="chrome://weave" user="****" password="****" formSubmitURL="" httpRealm="Mozilla Services Encryption Passphrase" userFieldName="" passFieldName=""/>
        <entry host="http://www.example.de" user="rubo77" password="****" formSubmitURL="http://www.example.de" httpRealm="" userFieldName="benutzername" passFieldName="passwort"/>
        <entry host="http://example2.de" user="qqq" password="pppp" formSubmitURL="http://example2.de" httpRealm="" userFieldName="username" passFieldName="pass"/>
        ...

    Can I somehow convert this into a form KeePassX understands?
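
    A minimal conversion sketch in Python, assuming the export is unencrypted (encrypt="false" above) and targeting the generic CSV import offered by KeePass 2.x, whose columns can be mapped by hand (KeePassX itself may need a different intermediate format; file names are placeholders):

        # Convert a Password Exporter XML dump to Title/User Name/Password/URL CSV.
        import csv
        import xml.etree.ElementTree as ET

        tree = ET.parse('firefox-passwords.xml')
        with open('keepass-import.csv', 'w', newline='') as out:
            writer = csv.writer(out)
            writer.writerow(['Title', 'User Name', 'Password', 'URL'])
            for entry in tree.iter('entry'):
                host = entry.get('host', '')   # reuse the host as both title and URL
                writer.writerow([host, entry.get('user', ''),
                                 entry.get('password', ''), host])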

    Read the article

  • Excel CSV import treating quoted strings of numbers as numeric values, not strings

    - by MichaelOryl
    I've got a web application that is exporting its data to a CSV file. Here's one example row of the CSV file in question:

        28,"65154",02/21/2013 00:00,"false","0316295","8316012,8315844","MALE"

    Since I can't post an image, I'll have to explain the results in Excel. The "0316295" field gets turned into a number and the leading 0 goes away. The "8316012,8315844" field gets interpreted as one single number: 83,160,128,315,844. That is, most obviously, not the intended result. I've seen people recommend a leading single quote for such cases, but that doesn't really work either:

        28,"65154",02/21/2013 00:00,"false","'0316295","'8316012,8315844","MALE"

    The single quote stays visible in the cell in Excel, whereas if I type a number with a leading single quote myself, Excel shows just the intended string and not the quote. Importing is not the same as typing, it seems. Anybody have a solution here?
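
    One workaround, assuming the export format can be changed, is to emit the fragile fields as ="..." formulas; Excel evaluates these as text, so leading zeros and embedded commas survive. A minimal Python sketch (file name, sample row and column indexes are placeholders):

        import csv

        row = ['28', '65154', '02/21/2013 00:00', 'false', '0316295', '8316012,8315844', 'MALE']
        fragile = {1, 4, 5}   # columns that must stay text

        with open('export.csv', 'w', newline='') as f:
            writer = csv.writer(f)
            # wrap fragile fields as ="..." so Excel keeps them as literal text
            writer.writerow('="{0}"'.format(v) if i in fragile else v
                            for i, v in enumerate(row))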

    Read the article

  • Excel 2007: save import steps for a CSV file?

    - by Chris Marisic
    I have a csv file that constantly needs to be opened in Excel and have its data copied over to a separate workbook. I find the process of clicking through all of the dialogs, setting the text qualifier, and setting the columns to all be text extremely tedious. For comparable operations on data like this in MSSQL or Access, the program will ask you if you wish to save these steps; Excel doesn't readily ask that. Is there any way to get comparable behaviour with Excel?
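
    One way to sidestep the dialogs entirely is to do the conversion outside Excel. A minimal Python sketch using the openpyxl package (file names are placeholders): because csv.reader yields every field as a string, the cells land in the workbook as text, with no type-guessing dialogs involved.

        import csv
        from openpyxl import Workbook  # assumes openpyxl is installed

        wb = Workbook()
        ws = wb.active
        with open('export.csv', newline='') as f:
            for row in csv.reader(f):
                ws.append(row)         # strings in, text cells out
        wb.save('export.xlsx')

    The other common route is recording the whole import once with Excel's macro recorder and replaying the resulting VBA.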

    Read the article

  • Import data in Excel that doesn't have a row delimiter, but number of columns is known

    - by Alex B
    So I have this text file that looks something like this:

        Header1 Header2 Header3 Header4 A1 B1 C1 D1 A2 B2 C2 D2

    and so on. When imported, I'd want the data to arrange itself into 4 columns. I tried Get External Data from Text, and it imports successfully, but it doesn't wrap around: it just keeps making a column for every space. I'd want it to move to the next line after 4 (in this case) elements have been added. What's the simplest way to achieve this? EDIT: My answer follows, since I'm not yet allowed to answer my own questions. The Excel function I needed is called INDIRECT(). I'm not sure how it actually works, so hopefully someone can help out with that, but the function call that worked for me is

        =INDIRECT(ADDRESS((ROW(A1)-1)*4+COLUMN(A1),1))

    which I found over here: http://www.ozgrid.com/forum/showthread.php?t=101584&p=456031#post456031 Note: this required me to import the text into Excel (where I'd get one row full of columns) and then flip it so that I'd have a column full of rows.
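
    For anyone who would rather reshape the file before it ever reaches Excel, a minimal Python sketch, assuming the individual values contain no spaces (file names are placeholders):

        import csv

        COLS = 4
        with open('data.txt') as f:
            tokens = f.read().split()   # all whitespace-separated values, headers included

        with open('reshaped.csv', 'w', newline='') as out:
            writer = csv.writer(out)
            for i in range(0, len(tokens), COLS):
                writer.writerow(tokens[i:i + COLS])   # regroup four to a row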

    Read the article

  • jQuery: pass model to controller

    - by slandau
    I want to pass the MVC page model back to my controller within a JavaScript object. How would I do that?

        var urlString = "<%= System.Web.VirtualPathUtility.ToAbsolute("~/mvc/Indications.cfc/ExportToExcel")%>";
        var jsonNickname = {
            model: Model,
            viewName: "<%= VirtualPathUtility.ToAbsolute("~/Views/Indications/TermSheetViews/Swap/CashFlows.aspx")%>",
            fileName: 'Cashflows.xls'
        };
        $.ajax({
            type: "POST",
            url: urlString,
            data: jsonNickname,
            async: false,
            success: function (data) {
                $('#termSheetPrinted').append(data);
            }
        });

    So where it says model: Model, I want Model to be the actual page model that I declare at the top of the page:

        Inherits="System.Web.Mvc.ViewPage<Chatham.Web.Models.Indications.SwapModel>"

    How can I do that?

    Read the article

  • How should I architect my Model and Data Access layer objects in my website?

    - by Robin Winslow
    I've been tasked with designing the data layer for a website at work, and I am very interested in architecting the code for the best flexibility, maintainability and readability. I am generally acutely aware of the value in completely separating my actual Models from the Data Access layer, so that the Models are completely naive when it comes to data access. In this case it's particularly useful, as the Models may be built from the database or from a SOAP web service. So it seems to make sense to have Factories in my Data Access layer which create Model objects. Here's what I have so far (in my made-up pseudocode):

        class DataAccess.ProductsFromXml extends DataAccess.ProductFactory {}
        class DataAccess.ProductsFromDatabase extends DataAccess.ProductFactory {}

    These then get used in the controller in a fashion similar to the following:

        var xmlProductCreator = DataAccess.ProductsFromXml(xmlDataProvider);
        var databaseProductCreator = DataAccess.ProductsFromDatabase(databaseDataProvider);

        // Returns array of Product model objects
        var XmlProducts = xmlProductCreator.Products();

        // Returns array of Product model objects
        var DbProducts = databaseProductCreator.Products();

    So my question is: is this a good structure for my Data Access layer? Is it a good idea to use a Factory for building my Model objects from the data? Do you think I've misunderstood something? And are there any general patterns I should read up on for how to write my data access objects to create my Model objects?

    Read the article

  • How do you formulate the Domain Model in Domain Driven Design properly (Bounded Contexts, Domains)?

    - by lko
    Say you have a few applications which deal with a few different Core Domains. The examples are made up, and it's hard to put a real example with meaningful data together concisely.

    In Domain Driven Design (DDD), when you start looking at Bounded Contexts and Domains/Sub-Domains, a Bounded Context can be seen as a "phase" in a lifecycle. An example of a Context would be within an e-commerce system: although you could model it as a single system, it would also warrant splitting into separate Contexts. Each of these areas within the application has its own Ubiquitous Language, its own Model, and a way to talk to other Bounded Contexts to obtain the information it needs. The Core, Sub, and Generic Domains are the areas of expertise and can be numerous in complex applications.

    Say there is a long process dealing with an Entity, for example a Book in a core domain. Looking at the Bounded Contexts, there can be a number of phases in the book's life-cycle: say outline, creation, correction, publish, and sale phases. Now imagine a second core domain, perhaps a store domain. The publisher has its own branch of stores to sell books, and the store can have a number of Bounded Contexts (life-cycle phases), for example a "Stock" or "Inventory" context. In the first domain there is probably a Book database table with basically just an ID to track the different book Entities in the different life-cycles. Now suppose you have 10+ supporting domains, e.g. Users, Catalogs, Inventory, ... (hard to think of relevant examples). For example, there is a Domain Model for the Book Outline phase, the Creation phase, the Correction phase, the Publish phase, and the Sale phase; the Store core domain likewise has a number of life-cycle phases.

        public class BookId : Entity
        {
            public long Id { get; set; }
        }

    In the creation phase (Bounded Context) the book could be a simple class:

        public class Book : BookId
        {
            public string Title { get; set; }
            public List<string> Chapters { get; set; }
            //...
        }

    Whereas in the publish phase (Bounded Context) it would have all the text, release date, etc.:

        public class Book : BookId
        {
            public DateTime ReleaseDate { get; set; }
            //...
        }

    The immediate benefit I can see in separating by "life-cycle phase" is that it's a great way to separate business logic, so there are no mammoth all-encompassing Entities or Domain Services. A problem I have is figuring out how to concretely define the rules for the physical layout of the Domain Model.

    A. Does the Domain Model get "modeled" so there are as many Bounded Contexts (separate projects etc.) as there are life-cycle phases across the core domains in a complex application? Edit: answered — yes; according to the answer by Alexey Zimarev, there should be an entire "Domain" for each Bounded Context.

    B. Is the Domain Model typically arranged by Bounded Contexts (or Domains, or both)? Edit: answered — each Bounded Context should have its own complete "Domain" (Services/Entities/VOs/Repositories).

    C. Does it mean there can easily be tens of "segregated" Domain Models, and can multiple projects use them (the Entities/Value Objects)? Edit: answered — there is a complete "Domain" for each Bounded Context, and the Domain Model (Entity/VO layer/project) isn't "used" by the other Bounded Contexts directly, only via chosen paths (i.e. via Domain Events).

    The part I am trying to figure out is how the Domain Model is actually implemented once you start to figure out your Bounded Contexts and Core/Sub-Domains, particularly in complex applications. The goal is to establish definitions that can help to separate Entities between the Bounded Contexts and Domains.

    Read the article

  • What is a good strategy for binding view objects to model objects in C++?

    - by B.J.
    Imagine I have a rich data model that is represented by a hierarchy of objects. I also have a view hierarchy with views that can extract required data from model objects and display the data (and allow the user to manipulate it). Actually, there could be multiple view hierarchies that represent and manipulate the model (e.g. an overview-detail view and a direct-manipulation view).

    My current approach is for the controller layer to store a reference to the underlying model object in the view object. The view object can then get the current data from the model for display and can send the model object messages to update the data. View objects are effectively observers of the model objects, and the model objects broadcast notifications when properties change. This approach allows all the views to update simultaneously when any view changes the model.

    Implemented carefully, this all works. However, it requires a lot of work to ensure that no view or model object holds a stale reference to a model object. The user can delete model objects or sub-hierarchies of the model at any time, and ensuring that every view object holding a reference to a deleted model object is cleaned up is time-consuming and difficult.

    It feels like the approach I have been taking is not especially clean; while I don't want explicit code in the controller layer for mediating the communication between the views and the model, it seems like there must be a better (implicit) approach for establishing bindings between the view and the model and between related model objects. In particular, I am looking for an approach (in C++) that understands two key points:

    1. There is a many-to-one relationship between view and model objects.
    2. If the underlying model object is destroyed, all the dependent view objects must be cleaned up so that no stale references exist.

    While shared_ptr and weak_ptr can be used to manage the lifetimes of the underlying model objects and allow for weak references from the view to the model, they don't provide notification of the destruction of the underlying object (they do only in the sense that using a stale weak_ptr reveals it), but I need an approach that notifies the dependent objects that their weak reference is going away. Can anyone suggest a good strategy to manage this?

    Read the article

  • Best Practices for High Volume CPA Import Operations with ebXML in B2B 11g

    - by Shub Lahiri, A-Team
    Background: B2B 11g supports the ebXML messaging protocol, where multiple CPAs can be imported via command-line utilities. This note highlights one aspect of the best practices for CPA import when a large number of CPAs, in excess of several hundred, must be maintained within the B2B repository.

    Symptoms: The import of a CPA is usually a 2-step process: first create a soa.zip file with the b2bcpaimport utility based on a CPA properties file, then use b2bimport to import it into the B2B repository. The commands are provided below:

        ant -f ant-b2b-util.xml b2bcpaimport -Dpropfile="<Path to cpp_cpa.properties>" -Dstandard=true
        ant -f ant-b2b-util.xml b2bimport -Dlocalfile=true -Dexportfile="<Path to soa.zip>" -Doverwrite=true

    Usually the first command completes fairly quickly regardless of the number of CPAs in the repository. However, as the number of trading partners within the repository goes up, the second command can take up to ~30 secs per operation. This can add up to a significant amount of time if hundreds of CPAs need to be imported into a production system within a limited downtime or maintenance window.

    Remedy: In situations where a large number of entries must be imported, it is best to set up a staging environment and run the import of each individual CPA against an empty repository. Since this is done in an empty repository, each import completes in reasonable time. After all the partner profiles have been imported, take a full repository export to capture the metadata for all the entries in one file. If this single file with all the partner entries is then imported into a loaded repository, the total import time for all the CPAs drops dramatically.

    Results: Let us look at the numbers to see the benefit of this approach. With a pre-loaded repository of ~400 partners, each individual import takes ~30 secs, so importing another 100 partners one by one would take ~50 minutes (100 times ~30 secs). On the other hand, if we prepare a repository export file of the same 100 partners in a staging environment beforehand, the import takes about ~5 mins. The total processing time for loading the metadata, especially in a production environment, can thus be shortened by almost a factor of 10.

    Summary: A diagram summarizing the entire approach and process accompanies the original article.

    Acknowledgements: The material posted here was compiled with help from the B2B Engineering and Product Management teams.

    Read the article

  • ASP.NET MVC2 / Entity Framework: cannot pass primary key value back from view to [HttpPost]

    - by Paul Connolly
    I pass a ViewModel (which contains a "Person" object) from the "EditPerson" controller action into the view. When posted back from the view, the ActionResult receives all of the Person properties except the ID (which comes back as zero instead of its real integer value). Can anyone tell me why? The controllers look like this:

        public ActionResult EditPerson(int personID)
        {
            var personToEdit = repository.GetPerson(personID);
            FormationViewModel vm = new FormationViewModel();
            vm.Person = personToEdit;
            return View(vm);
        }

        [HttpPost]
        public ActionResult EditPerson(FormationViewModel model) // <-- passes in all properties except ID
        {
            // Persistence code
        }

    The View looks like this:

        <%@ Page Title="" Language="C#" MasterPageFile="~/Views/Shared/Site.Master"
            Inherits="System.Web.Mvc.ViewPage<Afp.Models.Formation.FormationViewModel>" %>
        <% using (Html.BeginForm()) { %>
            <%= Html.ValidationSummary(true) %>
            <fieldset>
                <legend>Fields</legend>
                <div class="editor-label"><%= Html.LabelFor(model => model.Person.Title) %></div>
                <div class="editor-field">
                    <%= Html.TextBoxFor(model => model.Person.Title) %>
                    <%= Html.ValidationMessageFor(model => model.Person.Title) %>
                </div>
                <div class="editor-label"><%= Html.LabelFor(model => model.Person.Forename) %></div>
                <div class="editor-field">
                    <%= Html.TextBoxFor(model => model.Person.Forename) %>
                    <%= Html.ValidationMessageFor(model => model.Person.Forename) %>
                </div>
                <div class="editor-label"><%= Html.LabelFor(model => model.Person.Surname) %></div>
                <div class="editor-field">
                    <%= Html.TextBoxFor(model => model.Person.Surname) %>
                    <%= Html.ValidationMessageFor(model => model.Person.Surname) %>
                </div>
                <%-- <div class="editor-label"><%= Html.LabelFor(model => model.DOB) %></div>
                <div class="editor-field">
                    <%= Html.TextBoxFor(model => model.DOB, String.Format("{0:g}", Model.DOB)) %>
                    <%= Html.ValidationMessageFor(model => model.DOB) %>
                </div> --%>
                <div class="editor-label"><%= Html.LabelFor(model => model.Person.Nationality) %></div>
                <div class="editor-field">
                    <%= Html.TextBoxFor(model => model.Person.Nationality) %>
                    <%= Html.ValidationMessageFor(model => model.Person.Nationality) %>
                </div>
                <div class="editor-label"><%= Html.LabelFor(model => model.Person.Occupation) %></div>
                <div class="editor-field">
                    <%= Html.TextBoxFor(model => model.Person.Occupation) %>
                    <%= Html.ValidationMessageFor(model => model.Person.Occupation) %>
                </div>
                <div class="editor-label"><%= Html.LabelFor(model => model.Person.CountryOfResidence) %></div>
                <div class="editor-field">
                    <%= Html.TextBoxFor(model => model.Person.CountryOfResidence) %>
                    <%= Html.ValidationMessageFor(model => model.Person.CountryOfResidence) %>
                </div>
                <div class="editor-label"><%= Html.LabelFor(model => model.Person.PreviousNameForename) %></div>
                <div class="editor-field">
                    <%= Html.TextBoxFor(model => model.Person.PreviousNameForename) %>
                    <%= Html.ValidationMessageFor(model => model.Person.PreviousNameForename) %>
                </div>
                <div class="editor-label"><%= Html.LabelFor(model => model.Person.PreviousSurname) %></div>
                <div class="editor-field">
                    <%= Html.TextBoxFor(model => model.Person.PreviousSurname) %>
                    <%= Html.ValidationMessageFor(model => model.Person.PreviousSurname) %>
                </div>
                <div class="editor-label"><%= Html.LabelFor(model => model.Person.Email) %></div>
                <div class="editor-field">
                    <%= Html.TextBoxFor(model => model.Person.Email) %>
                    <%= Html.ValidationMessageFor(model => model.Person.Email) %>
                </div>
                <p><input type="submit" value="Save" /></p>
            </fieldset>
        <% } %>

    And the Person class looks like:

        [MetadataType(typeof(Person_Validation))]
        public partial class Person
        {
            public Person() { }
        }

        [Bind(Exclude = "ID")]
        public class Person_Validation
        {
            public int ID { get; private set; }
            public string Title { get; set; }
            public string Forename { get; set; }
            public string Surname { get; set; }
            public System.DateTime DOB { get; set; }
            public string Nationality { get; set; }
            public string Occupation { get; set; }
            public string CountryOfResidence { get; set; }
            public string PreviousNameForename { get; set; }
            public string PreviousSurname { get; set; }
            public string Email { get; set; }
        }

    And the ViewModel:

        public class FormationViewModel
        {
            public Company Company { get; set; }
            public Address RegisteredAddress { get; set; }
            public Person Person { get; set; }
            public PersonType PersonType { get; set; }
            public int CurrentStep { get; set; }
        }

    Read the article

  • Script/tool to import series of snapshots, each being a new edition, into GIT, populating source tree?

    - by Rob
    I've developed code locally and taken a fairly regular snapshot whenever I reached a significant point in development, e.g. a working build. So I have a longish list of about 40 folders, each folder being a snapshot, named in ascending YYYYMMDD date order, e.g.:

        20100523  20100614  20100721  20100722  20100809  20100901  20101001
        20101003  20101104  20101119  20101203  20101218  20110102

    I'm looking for a script to import each of these snapshots into GIT, the end result being that the latest code is the same as the last snapshot, and earlier editions are accessible and numbered as above. Some other requirements:

    - The latest edition should not be cumulative over the previous snapshots, i.e. files that appeared in older snapshots but not in later ones (e.g. due to refactoring) should not appear in the latest edition of the code.
    - Meanwhile, there should be continuity between files that do persist between snapshots. I would like GIT to know that there are previous editions of these files and not treat them as brand-new files in each edition.

    Some background about my aim: I need to formally revision-control this work rather than keep local private snapshot copies, and I plan to release it as open source, so version control is highly recommended. I am evaluating some of the current popular version control systems (Subversion and GIT), but I definitely need a working solution in GIT as well as in Subversion. I'm not looking to be persuaded to use one particular tool; I need a solution for each tool I am considering. (I have posted the question separately for each tool, so that folks with expertise in GIT or Subversion can give focused answers on one or the other.) The same but separate question for Subversion: Script/tool to import series of snapshots, each being a new revision, into Subversion, populating source tree?
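
    A minimal sketch of such an import script in Python, assuming git is on the PATH, the snapshot folders sit in the current directory with their YYYYMMDD names sorting chronologically, and Python 3.8+ (for shutil.copytree's dirs_exist_ok; the repo name is a placeholder):

        import os
        import shutil
        import subprocess

        REPO = 'repo'
        snapshots = sorted(d for d in os.listdir('.')
                           if d.isdigit() and os.path.isdir(d))
        subprocess.run(['git', 'init', REPO], check=True)

        for snap in snapshots:
            # empty the work tree so files dropped between snapshots get deleted
            for name in os.listdir(REPO):
                if name == '.git':
                    continue
                path = os.path.join(REPO, name)
                if os.path.isdir(path):
                    shutil.rmtree(path)
                else:
                    os.remove(path)
            shutil.copytree(snap, REPO, dirs_exist_ok=True)
            subprocess.run(['git', 'add', '-A'], cwd=REPO, check=True)
            subprocess.run(['git', 'commit', '-m', 'Snapshot ' + snap],
                           cwd=REPO, check=True)

    Because git add -A stages deletions as well as additions, the latest commit matches the latest snapshot exactly rather than being a cumulative union; and since git tracks content rather than per-file registrations, unchanged files keep their history from commit to commit.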

    Read the article

  • Import CSV/Excel file into SQL database with ASP.NET

    - by kiev
    Hi everyone! I am starting a project with ASP.NET in Visual Studio 2008 / SQL Server 2000 (2005 in future) using C#. The tricky part for me is that the existing DB schema changes often, and the import file's columns will all have to be matched up with the existing DB schema, since they may not be a one-to-one match on column names. (There is a lookup table that provides the table's schema with the column names I will use.) I am exploring different ways to approach this and need some expert advice. Are there any existing controls or frameworks that I can leverage to do any of this? So far I have explored the FileUpload .NET control, as well as some 3rd-party upload controls such as SlickUpload, to accomplish the upload; the uploaded files should be < 500 MB. The next part is reading and parsing my CSV/Excel file for display to the user, so they can match it against our DB schema. I saw CSVReader and others, but Excel is more difficult since I will need to support different versions. Essentially, the user performing this import will insert and/or update several tables from this import file. There are other, more advanced requirements like record matching and a preview of the import records, but I wish to understand this part first. Update: I ended up using CsvReader from LumenWorks.Framework for uploading the CSV files.

    Read the article

  • Oracle Data Pump import to a SQL file fails with ORA-31655: no data or metadata objects

    - by Francisco Quiñones
    Hello, I'm using Data Pump to export/import data; one requirement is to import the data into a SQL file. The OS is Windows. I made the following export:

        expdp system/password directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY tables=user.tablename

    and it works; I can see the file TABLESDUMP.DMP in the directory path. Then, when I tried to import it to a SQL file:

        impdp system/password directory=dpump_dir dumpfile=tablesdump.dmp sqlfile=tables_export.sql

    the log shows:

        ORA-31655: no data or metadata objects selected for job

    and the SQL file is created empty in the directory path. I'm not a DBA, I'm a Java developer. Can you help me? Thanks.

    Read the article

  • MYOB Service Sales Import

    - by sjw
    I have developed an export file from our job management system that I want to import into MYOB Accounting Plus v18.5. The file is generated without issue, and I have included every single field to make the upload easy (i.e. "Match All" matches every field). The problem is that no matter what I do, I cannot get the sales to import. Every time, no matter how I create the customer card, it comes back with:

        Error -190: Customer not found. Sale invoice not imported.

    I have tried matching using Co./Last Name, Card ID and Record ID, and every time I get the same error. I have even created a single customer with a simple Co./Last Name, Card ID and Record ID, and still, when I try to import with these same fields matched exactly, I get the same error.

    Read the article

  • SQL Server Import table keeping default values

    - by Chrissi
    I am importing a table from one database to another in SQL Server 2008 by right-clicking the target database and choosing Tasks > Import Data... When I import the table I get the column names and types and all the data fine, but I lose the primary key, the identity specification and all the default values that were set on the source table. So now I have to set all the default values for each column again manually. Is there any way to keep the default values with the import, or even to restore them afterwards with a query? I am VERY new to this and flailing in the dark, so forgive me if this is a really stupid question...

    Read the article

  • Flex Import Class from a Module within a sub directory

    - by Tom
    I put some modules in a modules folder. How do I import classes with the import statement when I'm in a subfolder? This won't work, unlike classes which are in packages.

    modules/SomeModule.mxml:

        <?xml version="1.0"?>
        <mx:Module>
            <mx:Script>
                <![CDATA[
                    import Fruit.Apple;
                ]]>
            </mx:Script>
        </mx:Module>

    Directory:

        .
        |-- Fruit
        |   `-- Apple.as
        |-- Modules
        |   `-- SomeModule.mxml
        `-- application.mxml

    Read the article

  • MySQL import in phpmyadmin (CSV) chokes on quotes

    - by Andrew Swift
    I am trying to import a .csv file into a MySQL table via phpMyAdmin. The .csv file is separated by pipes and formatted like this:

        data|d'ata|d'a"ta|dat"a|
        data|"da"ta|data|da't'a|
        dat'a|data|da"ta"|da'ta|

    The data contains quotes, and I have no control over the format in which I receive it; it is generated by a third party. The problem comes when there is a | followed by a double quote: I always get an "invalid field count in CSV input on line N" error. I am uploading the file from the import page, using Latin1, CSV, terminated by |, enclosed by ". I would like to just change the "enclosed by" character, but I keep getting "Invalid parameter for CSV import: Fields enclosed by". I have tried various characters with no success. How can I tell MySQL to accept this format in phpMyAdmin? Setting up these tables is the first step in writing a program that will use uploaded gzipped .csv files to maintain the catalog of an e-commerce site.
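
    If changing phpMyAdmin's settings keeps failing, another option is to rewrite the feed into a CSV its defaults can parse. A minimal Python sketch, assuming | never appears inside a field, which the sample suggests (file names are placeholders): the reader treats the quotes as literal data, and the writer re-encloses every field with proper "" escaping.

        import csv

        with open('feed.txt', newline='') as src, \
             open('clean.csv', 'w', newline='') as dst:
            reader = csv.reader(src, delimiter='|', quoting=csv.QUOTE_NONE)
            writer = csv.writer(dst, quoting=csv.QUOTE_ALL)
            for row in reader:
                writer.writerow(row)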

    Read the article

  • How to Import a .bak file using MySQL?

    - by user682526
    I have no knowledge of MySQL, and I have received a .bak file from one of my customers, who asked me to "import" it using MySQL. I installed the free MySQL Community Server download, which includes the MySQL command-line client. Now I don't know how to import this .bak file, so I can't read the data I need. I tried installing the MySQL EE free trial as well, but it doesn't give me anything with which I can open a DB GUI to import the .bak file either. I'm frustrated and stuck. Can you please help? Thanks!
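
    A useful first step is to check what the .bak actually contains, since that extension is also commonly used for Microsoft SQL Server backups, which MySQL cannot read at all. A minimal Python sketch of that heuristic check (file and database names are placeholders):

        # A mysqldump backup is plain text starting with SQL; a SQL Server
        # backup is binary. This is only a heuristic sniff of the first bytes.
        def looks_like_sql_dump(path, probe=512):
            with open(path, 'rb') as f:
                head = f.read(probe)
            try:
                text = head.decode('utf-8')
            except UnicodeDecodeError:
                return False   # binary data: not a text SQL dump
            return any(kw in text.upper() for kw in ('CREATE', 'INSERT', 'DROP', '--'))

        if looks_like_sql_dump('backup.bak'):
            print('Text dump: try  mysql -u root -p dbname < backup.bak')
        else:
            print('Binary file: likely a SQL Server backup; restore it with SQL Server')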

    Read the article

  • Drools rules import with wildcard

    - by ZeKoU
    Hello everyone, I am working with Drools rules. Some developers have created rules which I have to put on Guvnor (the rules repository) and build into packages. In these rules they have import statements with wildcards, for example:

        import org.drools.runtime.rule.*;

    When I upload this to Guvnor and try to build, it tells me:

        Unable to introspect model for wild card imports (org.drools.runtime.rule.*). Please explicitly import each fact type you require.

    Is it possible to use wildcard imports in Drools rules?

    Read the article
