Search Results

Search found 9519 results on 381 pages for 'bulk import'.

Page 6 of 381

  • Force import module from Python standard library instead of PYTHONPATH default

    - by jrdioko
    I have a custom module in one of the directories in my PYTHONPATH with the same name as one of the standard library modules, so that when I import module_name, that module gets loaded. If I want to use the original standard library module, is there any way to force Python to import from the standard library rather than from the PYTHONPATH directory, short of renaming the custom module and changing every reference to point to the new name?
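
    A minimal sketch of one workaround, assuming the shadowing directory on PYTHONPATH is known (the path and module name below are hypothetical): temporarily drop that directory from sys.path, import the standard library copy, then restore the path:

        import sys
        import importlib

        CUSTOM_DIR = "/path/to/custom/modules"        # hypothetical PYTHONPATH entry

        saved_path = list(sys.path)
        try:
            sys.path = [p for p in sys.path if p != CUSTOM_DIR]
            sys.modules.pop("module_name", None)      # drop any cached custom copy
            stdlib_module = importlib.import_module("module_name")
        finally:
            sys.path = saved_path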

    Read the article

  • how to import a .cer file via the web automatically

    - by mo akh
    I am trying to create a web page, using any code or script, that downloads and imports a .cer file into the trusted root store of any client automatically. For example, a JavaScript that downloads the .cer file from a directory on my web server and imports it into the client's trusted root store automatically. I have already done this with a VBS script run locally on the client, but now I want to do it through a web page, automatically and without the members noticing. Thanks

    Read the article

  • Python: import the containing package

    - by guy
    In a module residing inside a package, I need to use a function defined in that package's __init__.py. How can I import the package from within a module that resides inside it, so I can use that function? Importing __init__ inside the module does not import the package; it imports a separate module named __init__, leading to two copies of things under different names... Is there a Pythonic way to do this?
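
    A minimal sketch of one common approach, assuming a hypothetical layout where mypackage/__init__.py defines a function named helper() and the importing module lives at mypackage/submodule.py; an explicit relative import binds the name from the package itself rather than from a module literally called __init__:

        # mypackage/submodule.py -- sketch, names are hypothetical
        from . import helper            # pulls helper from mypackage/__init__.py
        # or, equivalently, by absolute name:
        # from mypackage import helper

        def do_work():
            return helper()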

    Read the article

  • what does "from MODULE import _" do in python?

    - by Paul
    Hi all, in the Getting Things GNOME! code base I stumbled upon this import statement: from GTG import _ I have no idea what it means; I've never seen it in the documentation, and a quick SO/Google search didn't turn anything up. Thank you all in advance, Paul
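
    A short illustration of the usual convention (an assumption about GTG, but a very common Python pattern): the underscore is a translation function installed via gettext, so "from GTG import _" most likely imports the project's string-translation helper:

        # sketch of the standard gettext idiom that projects usually wrap
        import gettext

        _ = gettext.gettext              # identity translation until a catalog is bound
        print(_("Hello, world"))         # marks the string for translation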

    Read the article

  • Oracle 10g express edition import

    - by Jasim
    How can I import a DMP file into my Oracle 10g Express Edition database? I tried imp, but it shows an error:
        IMP-00010: not a valid export file, header failed verification
        IMP-00000: Import terminated unsuccessfully
    How can I solve this?

    Read the article

  • objective c import once

    - by joels
    I have a header file with a bunch of statics like static NSString * SOME_NAME = @"someMeaning"; What is the best way to import this? Should I define them some other way? I tried just using the #import statement, but any file that imports it gives me a warning saying SOME_NAME is defined but not used...

    Read the article

  • Import fixed-width file to Oracle

    - by Alex Blokha
    Is there a way to import a fixed-width file into Oracle? Preferably through .NET (C#), so errors during the import can be caught and shown to the user. P.S. The file has 5 types of rows. For example, row type 1 has 5 columns and row type 2 has 50 columns.
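
    The question asks for a .NET/C# solution, but the row-type dispatch itself is language-neutral; a rough Python sketch under made-up column widths (the layouts and the type-marker position are assumptions, not from the question) looks like this:

        # sketch only: real layouts would come from the file specification
        LAYOUTS = {
            "1": [(0, 1), (1, 11), (11, 31), (31, 41), (41, 51)],   # 5 columns
            "2": [(0, 1), (1, 6), (6, 16), (16, 26)],               # first few of 50
        }

        def parse_line(line):
            row_type = line[0]                        # assume the first char marks the row type
            layout = LAYOUTS.get(row_type)
            if layout is None:
                raise ValueError("unknown row type: %r" % row_type)
            return [line[start:end].strip() for start, end in layout]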

    Read the article

  • Magento MAGMI: Product attributes (custom options) not showing up in import

    - by Rodgers and Hammertime
    When importing a CSV into Magento with the MAGMI importing tool, I am unable to import Custom Options (as in Size: small/medium/large). The import manages to put in the basic products, but the Custom Options don't transfer across. By custom options I mean the fields Title, Input Type, Is Required, Sort Order, followed by repeating groups of Title, Price, Price Type, SKU, Sort Order, and so on, found in the custom options menu... Even using the example CSV from the MAGMI SourceForge wiki:
        sku,name,description,price,Size:drop_down:1
        T-Shirt1,T-Shirt,A T-Shirt,5.00,Small|Medium|Large
        T-Shirt2,T-Shirt2,Another T-Shirt,6.00,XS|S|M|L|XL
    ...it fails to import the attributes. So I'm simply using MAGMI with the supplied example data from SourceForge on a blank Magento product list, and it doesn't transfer properly. Can anyone shed any light on what might be wrong? I am using Magento ver. 1.6.1.0, if that changes anything. Thanks.

    Read the article

  • Sharepoint: Integrity of lookup fields after a list import

    - by driAn
    Hi there, I have a question about the behavior of lookup fields when importing data: how do lookup fields behave when the list they point to is replaced/imported? To explain the issue, assume we have these two SharePoint lists:
        Product Types: Type Name, Code Nr, etc.
        Products: Product Name, Product Type (lookup field to the "Product Types" list), etc.
    In my scenario, the Products list contains production data on the production SharePoint platform and is filled with data by the business users. The Product Types list, however, contains rather static data and is maintained by the developer. After a development cycle, the developer wants to deploy his new web parts and his new data (the Product Types list), so he performs the following procedure:
        On the dev machine: export the "product type" list using stsadm.
        On the production machine: delete all items in the "product type" list.
        On the production machine: import the "product type" list using stsadm.
    This means we basically replace the "product type" list on the production server while keeping the "product" list as it is. Now the questions: Is this safe? Will the lookup references break under certain circumstances? Any downside to this import/export procedure? What happens if someone accesses a "product" during the import - will the (now invalid) reference clear its own content (become a null value)? What happens if the schema of the "product type" list changes (a new column)? Will this cause any trouble? Thanks for all feedback and suggestions!

    Read the article

  • Svn import with auto-props & pre-commit hook

    - by James Tisato
    My company's svn repo has a lot of MS Word docs in it. We've implemented a policy that all .doc files must have the svn:needs-lock property set, to prevent parallel access to files that are hard to merge (we've also done this for xls, ppt, pdf, etc.). We've implemented the policy by distributing an svn config with auto-props set appropriately for all relevant document types. We've also set up a pre-commit hook that checks that all added files of these types have the needs-lock property set (i.e. if users forget, or are too lazy, to update their svn config file, they won't be able to add any docs to the repo). The problem I'm having, however, is that the pre-commit hook fails when users try to import files into the repo, e.g. some users like to add files directly through TortoiseSVN's Repo Browser, which is effectively an svn import. Through testing on other file types, I have seen that doing an import does in fact apply the auto-props listed in my config, but they don't seem to be applied at the point that the pre-commit hook runs. When importing .doc files, the hook fails, saying that the needs-lock property is missing. Is there really much difference between adding a single file to a working copy and committing it vs. importing a file directly? Do we need to tailor our pre-commit hook in some way to cater for this scenario?

    Read the article

  • Cheap Bulk Domain Registration

    - by Panoy
    I have 6-7 domain names that I have thought of, and I'm planning to buy them in bulk so that I can save. Or am I wrong on this? Since this is my first time with hosting/domain registration, I only know of GoDaddy when it comes to domain registration. Questions: Will I lose out if I choose a cheap domain registrar compared to a popular one? For a newbie like me, what companies can you recommend for registering domain names in bulk at a cheap or affordable price? I notice that some prices are higher because the registrar offers support and customer service - does that mean the cheaper ones aren't reliable at all? I've also heard that some domain registrars increase their prices at every renewal; is that just natural business practice for registrars? Before posting this, I've been reading about NameCheap.com, and I'm considering registering with them unless you have other good choices to give me. I'll appreciate every suggestion or advice you can give.

    Read the article

  • WARNING Retrying Bulk Insert for file:sqlldr due to Communication Error:256

    - by user702295
    WARNING Retrying Bulk Insert for file:sqlldr due to Communication Error:256
    I am running my engine on Linux and am receiving an intermittent message "WARNING Retrying bulk insert for file: sqlldr due to communication Error: 256". The engine seems to have completed successfully, but it is not clear whether this error caused some of the forecast to not complete, nor what caused the error.
    Generally, if you see only the WARNING, it means that subsequent retries of the same load request eventually succeeded, so the run as a whole is not affected. To learn more about what happened, look for .log/.bad files left in the engine's bin directory, or possibly a quote of them within the specific engine log that had the issue. The sqlnet.log file may also have some information, and on the database server side there may be a log/alert regarding what happened; look at the alert.log. In general it could be that the database server or network was overloaded at the time and the connection was rejected/failed/aborted, either due to specific settings on concurrent connections/sessions or inadvertently due to a glitch in the network/OS/hardware. If this repeats and becomes more frequent during the run, you should look further into it as mentioned above. You can also track this using either SQL*Trace or java.util.logging.
        - Globally enable logging by setting the oracle.jdbc.Trace system property: java -Doracle.jdbc.Trace=true
        - Client-side tracing: your SQLNET.ORA file should contain the following lines to produce a client-side trace file:
            trace_level_client = 10
            trace_unique_client = on
            trace_file_client = sqlnet.trc
            trace_directory_client = <path_to_trace_dir>
        - Server-side tracing: to enable server-side tracing, use the following parameters:
            trace_level_server = 10
            trace_file_server = server.trc
            trace_directory_server = <path_to_trace_dir>
        - Tracing levels: the following values can be used for the TRACE_LEVEL* parameters:
            16 or SUPPORT - WorldWide Customer Support trace information
            10 or ADMIN - Administration trace information
            4 or USER - User trace information
            0 or OFF - no tracing (the default)
    Additional information is readily available via the web.

    Read the article

  • How should I safely send bulk mail? [closed]

    - by Jerry Dodge
    First of all, we have a large software system we've developed, and a number of clients use it in their own environments. Each of them is responsible for their own equipment and resources; we don't provide any shared services. We have introduced an automated email system which sends emails automatically via SMTP. Usually it only sends around 10-20 emails a day, but it could very well send bulk email to thousands of people in a single day. This of course requires a lot of work, which isn't necessarily the problem. The issue arises with the SMTP server we're using. An email server is issued a number of relays a day, which is paid for. This isn't really the issue either. The risk is getting the email server blacklisted. It's inevitable, and we need to take all this into careful consideration. As far as I can see, the ideal setup would be to have at least 50 IP addresses on multiple servers, each of which hosts its own SMTP server. When sending bulk email, it would divide the messages up across these servers, and each one would process its own queue. If one of those IPs gets blacklisted, it would be decommissioned and a new IP would replace it. Is there a better way that doesn't require us to invest in a large handful of servers? Perhaps a third-party service which is meant exactly for this?
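
    A minimal sketch of the round-robin idea described above (relay hostnames are hypothetical; this is not a production mailer and ignores rate limiting, bounce handling, and authentication):

        # distribute outgoing messages across a pool of SMTP relays
        import smtplib
        from email.message import EmailMessage
        from itertools import cycle

        RELAYS = ["relay1.example.com", "relay2.example.com"]   # hypothetical pool

        def make_message(sender, recipient, subject, body):
            msg = EmailMessage()
            msg["From"], msg["To"], msg["Subject"] = sender, recipient, subject
            msg.set_content(body)
            return msg

        def send_bulk(messages, relays=RELAYS):
            pool = cycle(relays)                  # simple round-robin over the pool
            for msg in messages:
                host = next(pool)
                try:
                    with smtplib.SMTP(host, 25, timeout=30) as smtp:
                        smtp.send_message(msg)
                except (smtplib.SMTPException, OSError):
                    # in the scheme above, a consistently failing or blacklisted
                    # relay would be pulled from the pool and replaced
                    continue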

    Read the article

  • Why does Unity OBJ import flip my x coordinate?

    - by milkplus
    When I import my Wavefront OBJ model into Unity and then draw lines over it with the same coordinates as in the OBJ file, the x coordinate is negated. I don't see any option in the importer that might be doing that, and I'm using the same localToWorldMatrix and the same coordinate data as in the .obj file. Hmmm
        GL.PushMatrix();
        GL.MultMatrix(transform.localToWorldMatrix);
        CreateMaterial();
        lineMaterial.SetPass(0);
        GL.Color(new Color(0, 1, 0));
        GL.Begin(GL.LINES);
        GL.Vertex(p1);
        GL.Vertex(p2);
        GL.Vertex(p2);
        GL.Vertex(p3);
        //...
        GL.End();
        GL.PopMatrix();

    Read the article

  • Import AppleScript methods in another AppleScript?

    - by DASKAjA
    Is there a way to use AppleScript methods defined in one script from other AppleScripts that reference the original, with something similar to import (e.g. as in PHP)? I wrote a method to set the Skype status and mood text:
        on setSkypeStatus(status, mood_text)
            tell application "System Events"
                set skypeRunning to count (every process whose name is "Skype")
                if skypeRunning > 0 then --only set status if skype is running
                    tell application "Skype"
                        set myStatus to "SET USERSTATUS " & status
                        set myMood to "SET PROFILE MOOD_TEXT " & mood_text
                        send command myStatus script name "AppleScript"
                        send command myMood script name "AppleScript"
                        return skypeRunning
                    end tell
                else
                    return skypeRunning
                end if
            end tell
        end setSkypeStatus
    Now I'm searching for something like import skype_methods.scpt. Is there such functionality? I can't find anything related via Google.

    Read the article

  • "from _json import..." - python

    - by RoseOfJericho
    Hello, all. I am inspecting the JSON module of Python 3.1, and am currently in /Lib/json/scanner.py. At the top of the file is the following line:
        from _json import make_scanner as c_make_scanner
    There are five .py files in the module's directory: __init__ (with two leading and two trailing underscores), decoder, encoder, scanner and tool. There is no file called "json". My question is: when doing the import, where exactly is "make_scanner" coming from? Yes, I am very new to Python!
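
    A quick check you can run (a sketch): _json is a C extension module shipped with CPython, not a .py file in the json package directory, which is why no matching file shows up there:

        import _json

        print(_json)                   # typically reported as a built-in/extension module
        print(_json.make_scanner)      # the C implementation aliased as c_make_scanner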

    Read the article

  • Eclipse and Python: library will import in interpreter, but not in IDE

    - by John
    I'm running Windows 7, Python 2.6.4 and the latest version of Eclipse. I downloaded the boto library (http://code.google.com/p/boto/) and ran python setup.py install, which created boto-1.9b-py2.6.egg in C:\Python26\Lib\site-packages. Importing a class - say, by doing 'from boto.sqs.connection import SQSConnection' - works fine from the Python command line. But Eclipse will not find boto, despite the fact that it is using the same Python interpreter as I am using at the command line. I added the library as an external source folder, but that didn't work either. How can I properly import the boto library into Eclipse? Thanks.
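
    One quick diagnostic (a sketch, not a fix): run the snippet below both from the command line and from inside Eclipse to compare which interpreter is running and whether the .egg directory actually appears on the search path:

        import sys

        print(sys.executable)          # which python binary is actually running
        for entry in sys.path:         # look for the boto .egg / site-packages entry
            print(entry)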

    Read the article

  • Import OLE Object from Access to MySQL

    - by SecretDeveloper
    I have a table in an Access database which contains product entries; one of the columns has a JPG image stored as an OLE Object. I am trying to import this table into MySQL, but nothing seems to work. I have tried the MySQL migration tool, but that has a known issue with Access and OLE Objects (the issue being it doesn't work and leaves the fields blank). I also tried the suggestion on this site, and while the data is imported, it seems as though the image is getting corrupted in the transfer. When I try to preview the image I just get a binary view; if I save it on disk as a JPG image and try to open it, I get an error stating the image is corrupt. The images in Access are fine and can be previewed. Access stores the data as an OLE Object, and when I import it to MySQL it is saved in a MediumBlob field. Has anyone had this issue before, and how did they resolve it?
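
    A hedged sketch of one common workaround (an assumption about these particular blobs): Access OLE Objects usually wrap the raw JPEG in an OLE header, so locating the JPEG start-of-image marker and stripping everything before it often yields a viewable image:

        def strip_ole_wrapper(blob):
            # find the JPEG SOI marker (0xFF 0xD8 0xFF) inside the OLE-wrapped bytes
            soi = blob.find(b"\xff\xd8\xff")
            if soi == -1:
                raise ValueError("no JPEG data found in blob")
            return blob[soi:]

        # usage idea: read the MediumBlob column, strip the wrapper, write out a .jpg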

    Read the article

  • Import CSV from a URL and export as XML -- Rails

    - by Jeffrey
    Two questions: How can I import a file from a web address, without a form? Example:
        Organisation.import(:from => 'http://wufoo.com/report.csv')
    How can I use XML Builder without pulling from the db? More info: my company uses Wufoo for web forms. The data from Wufoo is exported as CSV files. To get the data into my company's CMS, it needs to be formatted as XML. I don't need to store any of the data, aside from the URL to the CSV file. I thought this might work well as a simple Rails app.

    Read the article

  • SQL Compact import DB from SQL Server Express with Server Management Studio

    - by Sasha
    Hi! I am trying to import a SQL script, generated with Server Management Studio, into SQL Compact 3.5 and get a lot of errors. What am I doing wrong? I generated the script with the "Task/Generate Script" context menu. Part of my script:
        CREATE TABLE [LogMagazines](
            [IdUser] [int] NOT NULL,
            [Text] [nvarchar](500) NULL,
            [TypeLog] [int] NOT NULL,
            [DateAndTime] [datetime] NOT NULL,
            [DetailMessage] [nvarchar](max) NULL,
            [Id] [int] IDENTITY(1,1) NOT NULL,
            CONSTRAINT [PK_LogMagazines] PRIMARY KEY (
                [Id] ASC
            ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
        ) ON [PRIMARY]
    Knowledge base:
        http://stackoverflow.com/questions/1525063/how-to-import-data-in-sql-compact-edition
        http://stackoverflow.com/questions/1515969/exporting-data-in-sql-server-as-insert-into/1515975

    Read the article
