Search Results

Search found 659 results on 27 pages for 'exporting'.

Page 13/27 | < Previous Page | 9 10 11 12 13 14 15 16 17 18 19 20  | Next Page >

  • How do I programmatically apply a Drupal input filter?

    - by ford
    I am currently exporting Drupal data to an external source (XML) programmatically. However, I want the data to run through the site's default Input Format (the filter that runs before user content is displayed on the website) before being written to file. How do I programmatically apply a Drupal input filter? Is there a specific function call or hook for this purpose? If so, links/advice would be most appreciated.

    Read the article

  • Sorry, Parameter is not valid.

    - by maruthikumar
    Hello All, I'm exporting some data to PDF. The data contains one or more bitmaps. Unfortunately, I'm getting the following error during the export:
    Sorry, Parameter is not valid.
    Source: at System.Drawing.Bitmap..ctor(Int32 width, Int32 height, PixelFormat format)
    at System.Drawing.Bitmap..ctor(Image original, Int32 width, Int32 height)
    Type: System.ArgumentException
    Any help will be highly appreciated. Regards, Maruthi Kumar

    Read the article

  • Do the changes to the cpumask using sched_setaffinity() take place immediately?

    - by Sukanto
    I am writing a Linux kernel module that needs to pin two threads to two different CPUs. I am planning to use sched_setaffinity() after exporting it in the kernel. Is there another exported function for this purpose? Also, if I set only 1 CPU in the cpumask, will the thread be moved to that CPU with immediate effect? If not, how do I enforce it? Will it help to call schedule() just after sched_setaffinity()?

    Read the article

  • Understanding NFS4 (Linux server)

    - by drumfire
    I've been a bit bothered by NFS4 on Linux. Some information 'out there' seems to conflict with other information, and other information appears hard to find. So here are a couple of things that caught my attention; hopefully someone out there can shed some light on them. This question focuses exclusively on NFS4 without Kerberos etc.
    1. Exports
    There is ambiguous information in the exports manpage on the structure of /etc/exports. To quote from exports(5): "Also, each line may have one or more specifications for default options after the path name, in the form of a dash ("-") followed by an option list. The option list is used for all subsequent exports on that line only." What does "subsequent exports on that line only" mean?
    1.2 fsid=0 not required anymore?
    I was searching for fsid when I found a comment on the linux-nfs list stating fsid=0 is not required anymore. Now I'm just confused: do I need it with nfs4 or not?!
    2. Non-exported directory still mountable
    Say I have the following tree:
    /exp
    /exp/users
    /exp/distr
    /exp/distr/archlinux
    /exp/distr/debian
    And I have the following entries in fstab:
    /dev/disk/by-label/users /mnt/users ext4 defaults 0 0
    /dev/disk/by-label/distr /mnt/distr ext4 defaults 0 0
    /mnt/users /exp/users none bind 0 0
    /mnt/distr /exp/distr none bind 0 0
    And my exports file is exactly this:
    /exp 192.168.1.0/24(fsid=0,rw,async,no_subtree_check,no_root_squash)
    /exp/distr 192.168.1.0/24(rw,async,no_subtree_check,no_root_squash)
    And exportfs -arv shows:
    exporting 192.168.1.0/24:/exp/distr
    exporting 192.168.1.0/24:/exp
    Then why am I able to do this on a client and get no error:
    mount -t nfs4 server:/exp/users /tmp/test
    even though /exp/users is not exported? I didn't export this directory, and while I don't see the contents of /dev/disk/by-label/users unless I specify crossmnt, I am still able to write to the directory. Everything I write there goes to the underlying directory of /exp/users, which can be seen when I umount /exp/users; ls /exp/users.
    3. The odd case of showmount -d server
    As stated by rpc.mountd(8), this command should display directories that are either currently mounted by clients or stale entries in /var/lib/nfs/rmtab, as can be read: "The rpc.mountd daemon registers every successful MNT request by adding an entry to the /var/lib/nfs/rmtab file. When receiving a UMNT request from an NFS client, rpc.mountd simply removes the matching entry from /var/lib/nfs/rmtab, as long as the access control list for that export allows that sender to access the export. (...) Note, however, that there is little to guarantee that the contents of /var/lib/nfs/rmtab are accurate. A client may continue accessing an export even after invoking UMNT. If the client reboots without sending a UMNT request, stale entries remain for that client in /var/lib/nfs/rmtab." After reading this I surely wonder:
    Isn't it terribly insecure to just expose this type of client information?
    Aren't unaware server admins bound to have an rmtab with a lot of stale clients?
    Is this the reason that clients that mount nfs4 directories with mount -v get to see output like "nothing was mounted" even though something was mounted?
    I have a lot of other questions regarding nfs4, but I'll keep it at this for the moment... :)

    Read the article

  • Win a set of Infragistics Silverlight Controls with Data Visualization!

    - by mbcrump
    Infragistics recently released their new Silverlight Data Visualization Controls. I saw a couple of samples and had to take a look. I headed over to their website and downloaded the controls. I first noticed the hospital floor-plan demo shown on their site and started thinking of ways that I could use this in my own organization. I emailed them asking if I could give away the Silverlight Data Visualization controls on my site and they said yes! They also wanted to throw in the standard Silverlight Line of Business controls (combined they are worth about $3000 US). I am very thankful they were willing to help the Silverlight community with this giveaway. So some quick rules below:
    -----------------------------------------------------------------------------------------------------------------------------------------------------------
    Win a FREE developer’s license of Infragistics Silverlight Controls with Data Visualization ($3000 value). A random winner will be announced on January 1st, 2011! To be entered into the contest, do the following things: Subscribe to my feed. Leave a comment below with a valid email account (I WILL NOT share this info with anyone). For extra entries, simply retweet a link to this page using the following URL [ http://mcrump.me/iscfree ]. It does not matter what the tweet says, just as long as the URL is the same. Unlimited tweets, but please don’t go crazy! This URL will allow me to track the users that tweet this page. Don’t forget to visit Infragistics because they made this possible.
    ----------------------------------------------------------------------------------------------------------------------------------------------------------
    Before we get started with the Silverlight Controls, here are a couple of links to bookmark: The Silverlight Line of Business Control page is here. You can also check out the live demos here. The Data Visualization page is here. You can also check out the live demos here. Don’t worry about the Samples/Help Documentation; you can install all of that to your local HDD when you are installing it. I am going to walk you through the Silverlight Controls recently released by Infragistics. Begin by downloading the trial version and running the executable. If you downloaded the Complete bundle then you will have the following options to pick from. I like having help documentation and samples on my local HDD in case I do not have access to the internet and want to code. After it is installed, you may want to take a look at your Toolbox in Visual Studio 2010. Look for NetAdvantage 10.3 Silverlight and you will see that you now have access to all of these controls. At this point, to use the controls it’s as simple as drag/drop onto your Silverlight container. It will create the proper Namespaces for you. I wanted to highlight a few of the controls that I liked the most:
    Grid – After using the Infragistics grid you will wonder how you ever survived using the grid supplied with the standard Microsoft controls. This grid was designed to get your application up and running very fast. It’s simple to bind, it handles LARGE DataSets, it's easy to filter, and it allows endless possibilities for formatting data. The screenshot below is an example of the grid. For a real-time updating demo click here.
    SpellChecker – If your users are creating emails or performing any other function that requires spell checking, then this control is great. Check out the screenshots below. In the first screen, I have a word that is not in the dictionary [DotNet]. The Spell Checker finds the word and allows the user to correct it. What is so great about Infragistics controls is that it only takes a few lines of code to have a full-featured spell checker in your application.
    TagCloud – This is a control that I haven’t seen anywhere else. It allows you to create keywords for popular search terms, very similar to the tag clouds seen all over the internet. Below is a screenshot that shows “Facebook” being a very popular item in the cloud. You could link these items to a hyperlink if you wanted.
    Importing/Exporting from Excel – I work with data the majority of the time. We all know the importance of Excel in our organizations; it's used a lot. Infragistics controls make importing and exporting data from a Grid into Excel a snap. One of the things that I liked most about this control was the option to choose the Excel format (2003 or 2007). I haven’t seen this feature in other controls.
    Creating/Saving/Extracting/Uploading Zip Files – This is another control that I haven’t seen many others making. It allows you to manipulate a zip file in basically any way you like. You can even create a password on the zip file.
    Schedule – The Schedule that Infragistics provides resembles Outlook’s calendar. I think it’s important for users to see your app for the first time and immediately be able to start using it because they are already familiar with the UI. The Schedule control accomplishes that, in my opinion.
    I have just barely scratched the surface of the Infragistics Silverlight Line of Business controls. To check out all of them, click here. A quick thing to note is that this giveaway also comes with the following Silverlight Data Visualization Controls. Below is a screenshot that lists all of them. I wanted to highlight 2 of the controls that I liked the most:
    xamBarcode – The xamBarcode supports the following symbologies. Below is an example of the barcode generated by Infragistics controls. This is a high-resolution barcode; you will not have to wonder whether your scanner can read it. As long as you have ink in your printer, your scanner will read it. I used a Symbol barcode reader to test this barcode.
    xamTreemap – I’ve never seen a way of displaying data like this before, but I like it. You can style this any way that you like, of course, and it also comes with an Office 2010 Theme.
    Thanks to Infragistics for providing the controls to one lucky reader. I hope that you enjoyed this post, and good luck to those that entered the contest. Subscribe to my feed

    Read the article

  • sybase bcp import fails

    - by chromeplatedbanana
    We're trying to export some tables from our production database to our test database using bcp. The bcp export seems to work fine, but the import always fails with a data type error (see below). We tested on our test database by exporting the table content to a file, then importing it again immediately, but that failed too. E.g.,
    bcp TABLENAME out ~/tempfile -S servername -U username
    generates a file as expected. If we use the -c option then the number of lines is as expected. However,
    bcp TABLENAME in ~/tempfile -S servername -U username
    fails with
    CTLIB Message: - L0/0D/S0/N0/0/0: blk_int(): blk_layer: CT library error: Cannot find an equivalent CS_TYPE for this TDS data type 49
    blk_init failed.
    We get this whenever we try to copy into TABLENAME, whether from the production or the test table dump file. I don't understand why export and import for the same TABLENAME generates a data type error. What am I doing wrong here? Thanks

    Read the article

  • Import IIS6 website configuration into IIS7

    - by sinni800
    Hello, I have many websites hosted on IIS6 and I want to migrate them to IIS7. It is enough if the basic configuration (folders, virtual folders inside, host headers) is migrated; a great part of the configuration is in web.config anyway. It is even okay if they're just created as "classic" mode applications. I have tried the following things: Msdeploy (this copies the whole directories though, not good...), and exporting the IIS websites to XML, but I found nothing to import them into IIS7 with... Anybody got an idea?

    Read the article

  • [Visual Studio Extension Of The Day] Test Scribe for Visual Studio Ultimate 2010 and Test Professional 2010

    - by Hosam Kamel
      Test Scribe is a documentation power tool designed to construct documents directly from TFS test plan and test run artifacts, for the purpose of discussion, reporting, etc.
    Known Issues/Limitations:
    - Customizing the generated report by changing the template, adding comments, including attachments, etc. is not supported.
    - While opening a test plan summary document in Office 2007, if you get the warning “The file Test Plan Summary cannot be opened because there are problems with the contents” (with Details: ‘The file is corrupt and cannot be opened’), click ‘OK’. Then, click ‘Yes’ to recover the contents of the document. This will then open the document in Office 2007. The same problem is not found in Office 2010.
    - Generated documents are stored by default in the “My documents” folder. The output path of the generated report cannot be modified.
    - Exporting Word documents for individual test suites or test cases in a test plan is not supported.
    Download it from the Visual Studio Extension Manager.
    Originally posted at "Hosam Kamel | Developer & Platform Evangelist" http://blogs.msdn.com/hkamel

    Read the article

  • VAMT 3.0 Proxy Activate - No ‘Apply Confirmation ID’ option at all

    - by lez
    I tried to activate my Windows box in an isolated network zone, so I followed the process in 'Scenario 2: Proxy Activation' at http://technet.microsoft.com/en-us/library/hh825202.aspx using two VAMT 3.0 hosts. Everything went fine (actually, I'm not sure which option to choose when exporting VAMT data to a .cilx file; I tried both 'Export products and product keys' and 'Export proxy activation data only', and I have no idea whether this is a cause of the problem) until I wanted to apply the CID and activate the isolated PC. Under 'Activate', there is no 'Apply Confirmation ID' option at all! Its only options are 'Acquire and save confirmation ID only' and 'Acquire confirmation ID, apply to selected machine(s) and activate'. The error message is 'cannot resolve remote name 'go.microsoft.com'' whichever one I choose; it looks like acquiring a confirmation ID always needs to reach this URL. But I just want to apply the CID... Has anyone run into this? I searched the internet and there seems to be no answer... Any suggestion would be appreciated, thank you!

    Read the article

  • How do I create statistics to make ‘small’ objects appear ‘large’ to the Optimizer?

    - by Maria Colgan
    I recently spoke with a customer who has a development environment that is a tiny fraction of the size of their production environment. His team has been tasked with identifying problem SQL statements in this development environment before new code is released into production. The problem is that the objects in the development environment are so small, the execution plans selected there rarely reflect what actually happens in production. To ensure the development environment accurately reflects production in the eyes of the Optimizer, the statistics used in the development environment must be the same as the statistics used in production. This can be achieved by exporting the statistics from production and importing them into the development environment. Even though the underlying objects are a fraction of the size of production, the Optimizer will see them as the same size and treat them the same way as it would in production. Below are the necessary steps to achieve this in their environment; I am using the SH sample schema as the application schema whose statistics we want to move from production to development (a code sketch of the key calls follows the list).
    Step 1. Create a staging table, in the production environment, where the statistics can be stored.
    Step 2. Export the statistics for the application schema, from the data dictionary in production, into the staging table.
    Step 3. Create an Oracle directory on the production system where the export of the staging table will reside, and grant the SH user the necessary privileges on it.
    Step 4. Export the staging table from production using Data Pump export.
    Step 5. Copy the dump file containing the staging table from production to development.
    Step 6. Create an Oracle directory on the development system where the export of the staging table resides, and grant the SH user the necessary privileges on it.
    Step 7. Import the staging table into the development environment using Data Pump import.
    Step 8. Import the statistics from the staging table into the dictionary in the development environment.
    You can get a copy of the script I used to generate this post here. +Maria Colgan
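    The statistics staging in steps 1, 2, and 8 maps onto the DBMS_STATS package. Below is a minimal sketch of those calls driven from Python via cx_Oracle; the connect strings and the staging-table name STATS_STAGE are placeholders, not from the original post (which used a SQL script):

    import cx_Oracle  # pip install cx_Oracle

    # On production: stage and export the schema statistics.
    # Connection strings and the STATS_STAGE table name are hypothetical.
    prod = cx_Oracle.connect("sh/password@prod-db")
    cur = prod.cursor()
    # Step 1: create the staging table that will hold the statistics
    cur.callproc("DBMS_STATS.CREATE_STAT_TABLE", ["SH", "STATS_STAGE"])
    # Step 2: copy the SH schema statistics from the dictionary into it
    cur.callproc("DBMS_STATS.EXPORT_SCHEMA_STATS", ["SH", "STATS_STAGE"])
    prod.commit()

    # On development, after the staging table has arrived via Data Pump:
    dev = cx_Oracle.connect("sh/password@dev-db")
    cur = dev.cursor()
    # Step 8: load the staged statistics into the development dictionary
    cur.callproc("DBMS_STATS.IMPORT_SCHEMA_STATS", ["SH", "STATS_STAGE"])
    dev.commit()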

    Read the article

  • Zipcodes in CSV Generation

    - by BRADINO
    When exporting to CSV format, then opening it in a spreadsheet program like Excel, zipcodes that start with one or more zeroes have the preceding zeros stripped off. This is because the spreadsheet sees that column as integers, and preceding zeros in integers are meaningless. A quick and dirty trick to force Excel (hopefully you are using OpenOffice) to display the full zipcode is to wrap it in double quotes and put an equals sign in front of it, forcing it to be treated as a string, like this:
    $zipcode = '00123';
    $data = '="' . $zipcode . '"';
    (Note the zipcode must already be a string: a bare 00123 would be parsed by PHP as an octal integer literal.) So if you are doing a straight query-to-CSV export using the fputcsv function, it would look something like this. Basically, just overwrite the value in the row and then continue along:
    while ($row = mysql_fetch_assoc($query)) {
        $row['zipcode'] = '="' . $row['zipcode'] . '"';
        fputcsv($output, $row);
    }
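    The same trick is not PHP-specific. Here is a quick sketch in Python with the standard csv module (the sample rows are made-up stand-ins for a query result):

    import csv

    # Placeholder rows standing in for a database result set.
    rows = [{"name": "example", "zipcode": "00123"}]

    with open("out.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["name", "zipcode"])
        for row in rows:
            # Wrap the zipcode as ="00123" so the spreadsheet keeps
            # the leading zeroes instead of parsing it as an integer.
            writer.writerow([row["name"], '="{}"'.format(row["zipcode"])])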

    Read the article

  • Can't restore backup from SQL Server 2008 R2 to SQL Server 2005 or 2008

    - by Erick
    Hi everyone, I'm trying to get a backup from SQL Server 2008 R2 restored to SQL Server 2008, but when we try to do the restore we get this:
    The database was backed up on a server running version 10.50.1092. That version is incompatible with this server, which is running version 10.00.2531. Either restore the database on a server that supports the backup, or use a backup that is compatible with this server.
    I can use the script wizard to generate a script, but that takes over an hour to run. I also tried just exporting the data from server to server, but it had issues with the primary keys/identity columns. I will be running into this issue with several other clients so any help you could offer about how to get around this would be great. Thanks for your help!

    Read the article

  • Free .NET Training at DevCare in Dallas...

    - by [email protected]
    Come take an early look at the debugging experience in VS 2010 this Friday (3/25/2010) at TekFocus in Dallas, at the InfoMart, at 9 AM. In this session, we’ll:
    - Dive deep into the new IntelliTrace (formerly, historical debugging) feature, which enables you to step back in time within your debugging session and inspect or re-execute code, without having to restart your application
    - See how to manage large numbers of breakpoints with labeling, searching and filtering
    - Extend “data tips” by adding comments, notes and strategically “pinning” these resources to maintain their visibility throughout your session
    - Demonstrate “collaborative debugging,” by debugging a portion of an application and then exporting breakpoints and labeled data tips, so that others can leverage your effort, without having to start over
    - Leverage these new debugging features in applications built in earlier versions of the .NET Framework through the MultiTargeting features available in VS 2010
    You’ll walk away with a clear understanding of how you can use this upcoming technology to vastly increase your productivity and build better software. Register to attend ==> http://www.dallasdevcares.com/upcoming-sessions/
    DevCares is a monthly series of FREE half-day events sponsored by TekFocus and Microsoft. Targeted specifically at developers, the content is presented by experts on a variety of .NET topics. These briefings include expert testimonials, working demos and sample code designed to help you get the most out of application development with .NET. Events are held on the last Friday of each month at the TekFocus offices in the Infomart near downtown Dallas. TekFocus is a full-service technology training provider with a core business delivering Microsoft-certified technical training and product skills enhancements to customers worldwide.

    Read the article

  • How to create an NFS proxy by using kernel server & client?

    - by Martin C. Martin
    I have a file server that exports as NFS. On an Ubuntu machine I mount that, then try to export it as an NFS volume. When I go to export it, I get the message:
    exportfs: /test/nfs-mount-point does not support NFS export
    How can I get this to work, or at least get more information as to what the problem is? Exact steps (Ubuntu 12.04):
    mount -t nfs myfileserver.com:/server-dir /test/nfs-mount-point [works fine, I can read & write files]
    /etc/exports contains:
    /test/nfs-mount-point *(rw,no_subtree_check)
    sudo /etc/init.d/nfs-kernel-server restart
    Stopping NFS kernel daemon [ OK ]
    Unexporting directories for NFS kernel daemon... [ OK ]
    Exporting directories for NFS kernel daemon... exportfs: /test/nfs-mount-point does not support NFS export [ OK ]
    Starting NFS kernel daemon [ OK ]

    Read the article

  • New Release: ImageGlue 7.0 .NET

    When it comes to manipulating images dynamically there are few toolkits that can compete with ImageGlue 6 in terms of versatility and performance. With extensive support for a huge range of graphic formats including JPEG2000, Very Large TIFF Support™, and fully multi-threaded processing, ImageGlue has proved a popular choice for use in ASP and ASP.NET server environments. Now ImageGlue 7 has arrived, introducing support for 64-bit systems, improved PostScript handling, and many other enhancements. We've also used the opportunity to revise the API, to make it more friendly and familiar to .NET coders. But don't worry about rewriting legacy code - you'll find the 'string parameter' interface is still available through the WebSupergoo.ImageGlue6 namespace. So what's new in ImageGlue 7.0?
    - Support for 64-bit systems.
    - ImageGlue now incorporates the PostScript rendering engine as used by ABCpdf, our PDF component, which has proven to be fast, robust and accurate. This greatly improves support for importing and exporting PS, EPS, and PDF files, and also enables you to make use of powerful PostScript drawing operations for drawing to canvas. Leveraging ABCpdf's powerful vector graphics import and export functionality also makes it possible to interoperate with XPS and MS Office documents.
    - An improved API with new classes, methods and properties, more in keeping with normal .NET development.
    - Plus of course the usual range of bug fixes and minor enhancements.

    Read the article

  • Additional options in MDL

    - by Jane Zhang
    The Metadata Loader (MDL) enables you to populate a new repository as well as transfer, update, or restore a backup of existing repository metadata. It consists of two utilities: metadata export and metadata import. The export utility extracts metadata objects from a repository and writes the information into a file. The import utility reads the metadata information from an exported file and inserts the metadata objects into a repository.
    While the Design Client provides an intuitive UI that helps you perform the most commonly used export and import tasks, OMBPlus scripting enables you to specify some additional options, and to manage a control file that allows you to perform more specialized export and import tasks. Is it possible to utilize these options in MDL from the Design Client? This article will tell you how.
    A property file named mdl.properties is used to configure the additional options. It stores options in name/value pairs. This file can be created and placed under the directory <owb installation path>/owb/bin/admin/. Below we introduce the options that can be specified in the mdl.properties file.
    1. DEFAULTDIRECTORY
    When we open a Metadata Export/Import dialog in the Design Client, a default directory is provided for the MDL file and log file. For MDL Export, the default directory is <owb installation path>/owb/bin/. For MDL Import, the default directory is <owb installation path>/owb/mdl/. It may not be the one you would want to use as a default. You can specify the option DEFAULTDIRECTORY in the mdl.properties file to set your own default directory for MDL Export/Import, for example:
    DEFAULTDIRECTORY=/tmp/
    In this example, the default directory is set to /tmp/. Be sure the value ends with a file separator since it represents a directory. In Windows, the file separator is "\"; in Linux, it is "/".
    2. MDLTRACEFILE
    Sometimes we would like to trace the whole process of MDL Export/Import and get detailed information about operations to help developers or support with troubleshooting. To turn on MDL trace, set the option MDLTRACEFILE in the mdl.properties file, for example:
    MDLTRACEFILE=/tmp/mdl.trc
    The right side of the equals sign specifies the name of the file to which MDL trace information is written. If no path is specified, the file will be placed under the directory <owb installation path>/owb/bin/admin/. However, the trace file may be large if the MDL file contains a large number of metadata objects, so please use this option sparingly.
    3. CONTROLFILE
    We can use a control file to specify how objects are imported or exported. We can set an option called CONTROLFILE in the mdl.properties file, so the control file can also be utilized in the Design Client, for example:
    CONTROLFILE=/tmp/mdl_control_file.ctl
    The control file stores options in name/value pairs. When using a control file, be sure the file exists, otherwise the exception java.lang.Exception: CNV0002-0031(ERROR): Cannot find specified file will be thrown during MDL Export/Import.
    Next we introduce some options specified in the control file.
    ZIPFILEFORMAT
    By default, MDL exports objects into a zip format file. This zip file has an .mdl extension and contains two files. For example, you export the repository metadata into a file called projects.mdl. When you unzip this MDL file, you obtain two files: the file projects.mdx contains the repository objects, and the file mdlcatalog.xml contains internal information about the MDL XML file. Another choice is to combine these two files into one unzipped text-format file when doing the MDL export.
    In the OMBPlus commands related to MDL, there is an option called FILE_FORMAT which is used to specify the file format for the exported file. Its acceptable values are ZIP or TEXT. When the value TEXT is selected, the exported file is in text format, for example:
    OMBEXPORT MDL_FILE '/tmp/options_file_format_test.mdl' FILE_FORMAT TEXT FROM PROJECT 'MY_PROJECT'
    How do we achieve this via the Design Client when doing an MDL export? Here we have another option called ZIPFILEFORMAT, which has the same function as FILE_FORMAT. The difference is that the acceptable values for ZIPFILEFORMAT are Y or N. When the value is set to N, the exported file is in text format, otherwise it is in zip file format.
    LOGMESSAGELEVEL
    Whenever you export or import repository metadata, MDL writes diagnostic and statistical information to a log file. There are 3 types of status messages: Informational, Warning and Error. By default, the log file includes all types of messages. Sometimes users may only care about one type of message; for example, they would like only error messages written to the log file. To achieve this, we can set an option called LOGMESSAGELEVEL in the control file. The acceptable values for LOGMESSAGELEVEL are:
    ALL: all types of messages (Informational, Warning and Error) will be written into the log file.
    WARNING: only warning messages will be written into the log file.
    ERROR: only error messages will be written into the log file.
    UPDATEPROJECTATTRIBUTES, UPDATEMODULEATTRIBUTES
    These two options decide whether to update the attributes of projects/modules. They take effect when the projects/modules being imported already exist in the repository and we use update metadata mode or replace metadata mode to do the MDL import. The acceptable values for these two options are Y or N. If the value is set to Y, the attributes of projects/modules will be updated, otherwise not.
    Next, let's give an example to see how these options take effect in MDL.
    1. First of all, create the property file mdl.properties under the directory <owb installation path>/owb/bin/admin/.
    2. Specify the options in the mdl.properties file, see the following screenshot.
    3. Create the control file mdl_control_file.ctl under the directory /tmp/, and set the following options in the control file.
    4. Log into the OWB Design Client.
    5. Create an Oracle module named ORA_MOD_1 under the project MY_PROJECT, then export the project MY_PROJECT into the file my_project.mdl.
    6. Check the trace file mdl.trc under the directory /tmp/. In this file, we can see very detailed information for the above export task.
    7. Check the exported MDL file. The file my_project.mdl is in text format; opening the file, you can see its content directly. It concatenates the files my_project.mdx and mdlcatalog.xml.
    8. Modify the project MY_PROJECT and the Oracle module ORA_MOD_1, adding descriptions for each separately. Delete the location created in step 5.
    9. Import the MDL file my_project.mdl. From the Metadata Import dialog, we can see the default directory for the MDL file and log file has been changed to /tmp/. Here we use update metadata mode, match by names, to do the import.
    10. After importing, check the description of the project MY_PROJECT; the description is still there. But the description of the Oracle module ORA_MOD_1 has gone. That's because we set the option UPDATEPROJECTATTRIBUTES to N, and the option UPDATEMODULEATTRIBUTES to Y.
    11. Check the log file; it only contains warning messages, since the log message level was set to WARNING.
    For more details about the 3 types of status messages, see the Oracle® Warehouse Builder Installation and Administration Guide 11g Release 2.

    Read the article

  • Does an alternative to regedit.exe exist?

    - by Peter Mortensen
    Is there something better than regedit.exe for searching and editing the Windows registry?
    Update 1: Registrar Registry Manager 6.50 from Resplendence Software Projects, free lite edition, meets the two requirements listed below. Direct download URL (2.7 MB). Search can also be restricted to a particular branch of the registry. However, exporting and sorting on columns in the search window are disabled in the lite version (and there may be other limitations).
    I haven't yet evaluated the other candidates. In particular, I would like to be able to batch search instead of having to step through with F3, and to have a go-to function that will open a particular registry key, e.g. HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\policies\Explorer, instead of having to navigate from the top. Platform: Windows XP.
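    Failing a ready-made tool, the batch-search part is scriptable. Below is a minimal sketch using Python's standard winreg module (Windows only; the key path and search term are just examples):

    import winreg

    def search(root, path, needle):
        """Recursively print registry values whose name or data contains needle."""
        try:
            key = winreg.OpenKey(root, path)
        except OSError:
            return  # skip keys we cannot open
        i = 0
        while True:  # walk the values of this key
            try:
                name, value, _ = winreg.EnumValue(key, i)
            except OSError:
                break
            if needle.lower() in name.lower() or needle.lower() in str(value).lower():
                print(path, "|", name, "=", value)
            i += 1
        i = 0
        while True:  # recurse into subkeys
            try:
                sub = winreg.EnumKey(key, i)
            except OSError:
                break
            search(root, path + "\\" + sub, needle)
            i += 1

    search(winreg.HKEY_LOCAL_MACHINE,
           r"SOFTWARE\Microsoft\Windows\CurrentVersion", "Explorer")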

    Read the article

  • Converting Inkscape layers directly to a slideshow presentation?

    - by darenw
    I'm creating a slideshow in Inkscape. There are one or more layers per slide, plus several background layers, some special layers to be used on several slides in a row, etc. In the past, I'd create each slide as a .png image by turning on the appropriate layers and exporting an image. This is tedious, and it is easy to make mistakes. Is it possible to automate this? That is, to enable the right layers and save all slides directly to a file format suitable for presentations, such as .ppt, which I can give to anyone for display? The solution will have to work on Linux.
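    One way to automate at least the per-slide PNG export is sketched below: toggle layer visibility in the SVG from a script, then render each slide through the Inkscape command line. The layer names here are hypothetical, and the --export-filename flag assumes Inkscape 1.x (older releases used --export-png instead):

    import subprocess
    import xml.etree.ElementTree as ET

    SVG_NS = "http://www.w3.org/2000/svg"
    INK_NS = "http://www.inkscape.org/namespaces/inkscape"

    def export_slide(svg_path, visible_layers, out_png):
        """Show only the named Inkscape layers, then render the file to PNG."""
        tree = ET.parse(svg_path)
        for g in tree.getroot().iter("{%s}g" % SVG_NS):
            if g.get("{%s}groupmode" % INK_NS) == "layer":
                label = g.get("{%s}label" % INK_NS, "")
                shown = label in visible_layers
                g.set("style", "display:inline" if shown else "display:none")
        tmp = "_slide_tmp.svg"
        tree.write(tmp)
        subprocess.run(["inkscape", tmp, "--export-filename", out_png], check=True)

    # Hypothetical layer names: one shared background plus one layer per slide.
    export_slide("slides.svg", {"background", "slide-01"}, "slide-01.png")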

    Read the article

  • Video capture tool: capture n frames per second

    - by Keikoku
    Is there a program that allows me to pass in a video and capture a frame every n frames? For example, maybe I want to export screenshots at 24 frames per second from point A to point B, so the program will be exporting 24 images per second. The number of frames I would like to capture should be configurable (maybe I only want 1 fps, maybe 10 fps, maybe 24, 30, 60, ...). Preferably, it would be part of a larger program that supports various video formats. At the least, it should support the more common formats out there.
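    If no ready-made tool turns up, this is only a few lines with OpenCV's Python bindings. A sketch that samples every n-th frame (equivalent to a fixed fps once you know the source frame rate); the file names are placeholders:

    import cv2  # pip install opencv-python

    def dump_frames(video_path, every_n, out_pattern="frame_{:06d}.png"):
        """Write every n-th frame of the video to numbered image files."""
        cap = cv2.VideoCapture(video_path)
        if not cap.isOpened():
            raise IOError("cannot open " + video_path)
        index = saved = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break  # end of stream
            if index % every_n == 0:
                cv2.imwrite(out_pattern.format(saved), frame)
                saved += 1
            index += 1
        cap.release()
        return saved

    # A 30 fps source sampled every 30 frames yields one image per second.
    dump_frames("input.mp4", 30)

    From the command line, ffmpeg's fps filter does the same job: ffmpeg -i input.mp4 -vf fps=1 out%06d.png.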

    Read the article

  • Restoring an Exchange 2010 user's calendar without rest of mailbox

    - by AlamedaDad
    I am trying to restore a user's calendar from backup; it was deleted by a sync problem on her mobile device. I've been able to restore her mailbox without a problem, but I had to link it to a new AD user: she deleted the calendar several days before she reported the problem, so the current backups of her account included all of her current email but no calendar events, and I had to restore the mailbox from the day before she deleted everything. I've tried sharing the calendar and opening it in her account, then copying or moving the contents, but I get an error that Outlook can't do the task because there are personal items. I tried bringing up the "Recovery User" I created in Outlook and exporting the calendar events to a .pst, then importing them into the user's real account, but they all get created in a sub-folder called "Recovery User." In case it matters, she's running Outlook 2010 and we're using Exchange 2010 SP1. Thanks in advance for help with this problem... Michael

    Read the article

  • How do you create large, growable, shared filesystems on Linux at AWS?

    - by Reece
    What are acceptable/reasonable/best ways to provide large, growable, shared storage at AWS, exposed as a single filesystem? We're currently making 1TB EBS volumes ~biweekly and NFS exporting them with no_subtree_check and nohide. In this setup, distinct exports appear under a single mount on the client. This arrangement does not scale well. The options we've considered:
    - LVM2 with ext4: resize2fs is too slow.
    - Btrfs on Linux: not obviously ready for prime time yet.
    - ZFS on Linux: not obviously ready for prime time yet (although LLNL uses it).
    - ZFS on Solaris: the future of this combo is uncertain (to me), and it puts a new OS in the mix.
    - glusterfs: heard mostly good things, but two scary (and maybe old?) stories.
    The ideal solution would provide sharing, a single fs view, easy expandability, snapshots, and replication. Thanks for sharing ideas and experience.

    Read the article

  • Consolidate SQL Server Reporting Services

    - by Eric C. Singer
    I've been a big fan of consolidating as many DBs as possible onto a few SQL servers for a while, and I've had great success with it. However, I've never had to deal with SQL Server Reporting Services. Has anyone migrated SSRS from a bunch of random SQL servers onto a consolidated SQL server? I don't exactly know a whole lot about SSRS, which is part of the problem. To my knowledge, it's one DB per SSRS instance, so it sounds like I'd need to find a way of exporting the data and merging it. Basically, the process used to look like:
    1. Move DB from SQL Express to shared SQL server
    2. Change the APP to point at the new SQL server
    With Reporting Services, how do I move the reporting services component of the DB as well? I realize I may need to tweak the app, but my question is on the SQL side.

    Read the article

  • Drag/drop mail folder from Outlook to Explorer export to .pst file?

    - by Alex
    While working on large-scale projects, I collect all mail conversations pertaining to the project in a separate folder within Outlook. When the project is completed, I want to archive the entire folder for reference purposes. So my question is this: does anyone know of a tool/plugin for Outlook 2007/2010 that allows you to drag/drop a mail folder from within Outlook to a Windows Explorer window, exporting the folder as a .pst file, preserving folder structure and timestamps on mails? I am doing this manually at the moment, but a tool with simple drag/drop functionality would greatly simplify my workflow for handling project-specific mail.

    Read the article

  • mongoexport csv output array values

    - by 9point6
    I'm using mongoexport to export some collections into CSV files; however, when I try to target fields which are members of an array, I cannot get it to export correctly. The command I'm using:
    mongoexport -d db -c collection -fieldFile fields.txt --csv > out.csv
    and the contents of fields.txt are similar to:
    id
    name
    address[0].line1
    address[0].line2
    address[0].city
    address[0].country
    address[0].postcode
    where the BSON data would be:
    { "id": 1, "name": "example", "address": [ { "line1": "flat 123", "line2": "123 Fake St.", "city": "London", "country": "England", "postcode": "N1 1AA" } ] }
    What is the correct syntax for exporting the contents of an array?
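    If mongoexport's field syntax will not reach into the array, one workaround (a sketch, not mongoexport itself) is to flatten the documents with a short pymongo script; the connection string is a placeholder, and the db/collection names are taken from the question:

    import csv
    from pymongo import MongoClient  # pip install pymongo

    coll = MongoClient("mongodb://localhost:27017")["db"]["collection"]

    subfields = ["line1", "line2", "city", "country", "postcode"]
    with open("out.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "name"] + ["address[0]." + k for k in subfields])
        for doc in coll.find():
            addr = (doc.get("address") or [{}])[0]  # first element of the array
            writer.writerow([doc.get("id"), doc.get("name")] +
                            [addr.get(k, "") for k in subfields])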

    Read the article

< Previous Page | 9 10 11 12 13 14 15 16 17 18 19 20  | Next Page >