Search Results

Search found 69971 results on 2799 pages for 'hosts file'.


  • Why do I need to republish a file when I update it?

    - by user13089
    The story is: I am coding a program and, at the same time, I want to share it with my group members using Ubuntu One. They don't have an Ubuntu One account, so what I do is publish the file and my friends just get the link and the file. But, strangely or not, every time I upload a new version of this file (I add more code and save it), the old link seems to be "outdated" and I need to publish the same file again and get a new link every time I modify the same file. Couldn't it be possible to save a new version of your file and have the same link simply "point" to the newer version?

    Read the article

  • .net- open excel file, format the file and save

    - by Lock
    I have an ASP web service that uses the Crystal Reports API to download an Excel report. There are a few things I do not like about the Excel report that Crystal generates:
    - The column widths are static (they are not adjusted to fit the content).
    - I can't format the header row to be bold.
    - If I suppress a data column in the report, it comes out in the Excel spreadsheet as a blank column.
    I currently use PHP to open the Excel file, autosize the columns, bold the heading and remove the blank columns, but the PHPExcel class performs poorly even when the spreadsheet is only a few hundred KB in size. I am thinking that if I move this work into the .NET web service, the performance will be much better. Does anyone know an efficient way of opening an Excel file and performing the operations listed above?

    Read the article

  • HTML client-side portable file generation - no external resources or server calls

    - by awashburn
    I have the following situation: I have set up a series of cron jobs on an internal company server to run various PHP scripts designed to check data integrity. Each PHP script queries a company database, formats the returned query data into an HTML file containing one or more <table> elements, and then mails the HTML file to several client emails as an attachment. In my experience most of the PHP scripts generate HTML files with only a few tables, but a few of them create HTML files with around 30 tables. HTML was chosen as the distribution format of these scans because it makes it easy to view many tables at once in a browser window.
    I would like to add the ability for clients to download a table in the HTML file as a CSV file. I anticipate clients using this feature when they suspect a data integrity issue based on the table data; it would be ideal for them to take the table in question, export the data to a CSV file, and study it further. Because the need for exporting the data to CSV format is at the discretion of the client, unpredictable as to which table will be under scrutiny, and intermittent, I do not want to create CSV files for every table. Normally creating a CSV file wouldn't be too difficult: use JavaScript/jQuery to perform DOM traversal and build the CSV data into a string, then use a server call or a Flash library to handle the download. But I have one limiting constraint: the HTML file needs to be "portable." I would like the clients to be able to take their HTML file and perform analysis of the data outside the company intranet. It is also likely these HTML files will be archived, so making the export functionality self-contained in the HTML files is highly desirable for both reasons.
    The "portable" constraint on CSV file generation from an HTML file means:
    - I cannot make a server call, so ALL the file generation must be done client-side.
    - The single HTML file attached to the email must contain all the resources needed to generate the CSV file, so I cannot rely on jQuery or Flash libraries to generate the file.
    I understand, for obvious security reasons, that writing files to disk from JavaScript isn't supported by any browser. I don't want to create a file without the user's knowledge; I would like to generate the file in memory using JavaScript and then prompt the user to "download" it. I have looked into generating the CSV file as a data URI; however, according to my research and testing, this approach has a few problems:
    - Data URIs for files are not supported by IE.
    - In Firefox the URI saves the file with a random file name and as a .part file.
    As much as it pains me, I can accept the fact that IE <= v9 won't create a URI for files. I would like to create a semi-cross-browser solution in which Chrome, Firefox, and Safari create a URI to download the CSV file after JavaScript DOM traversal compiles the data.
    My Example Table:

    <table>
      <thead class="resulttitle">
        <tr> <th style="text-align:center;" colspan="3">NameOfTheTable</th> </tr>
      </thead>
      <tbody>
        <tr class="resultheader"> <td>VEN_PK</td> <td>VEN_CompanyName</td> <td>VEN_Order</td> </tr>
        <tr> <td class='resultfield'>1</td> <td class='resultfield'>Brander Ranch</td> <td class='resultfield'>Beef</td> </tr>
        <tr> <td class='resultfield'>2</td> <td class='resultfield'>Super Tree Produce</td> <td class='resultfield'>Apples</td> </tr>
        <tr> <td class='resultfield'>3</td> <td class='resultfield'>John's Distilery</td> <td class='resultfield'>Beer</td> </tr>
      </tbody>
      <tfoot>
        <tr> <td colspan="3" style="text-align:right;"> <button onclick="doSomething(this);">Export to CSV File</button></td> </tr>
      </tfoot>
    </table>

    My Example JavaScript:

    <script type="text/javascript">
    function doSomething(inButton) {
        /* locate elements */
        var table = inButton.parentNode.parentNode.parentNode.parentNode;
        var name = table.rows[0].cells[0].textContent;
        var tbody = table.tBodies[0];

        /* create CSV String through DOM traversal */
        var rows = tbody.rows;
        var csvStr = "";
        for (var i=0; i < rows.length; i++) {
            for (var j=0; j < rows[i].cells.length; j++) {
                csvStr += rows[i].cells[j].textContent +",";
            }
            csvStr += "\n";
        }

        /* temporary proof DOM traversal was successful */
        alert("Table Name:\t" + name + "\nCSV String:\n" + csvStr);

        /* Create URI Here!
         * (code I am missing) */

        /* Approach 1 : Auto-download
         * downloads CSV data but:
         * In FireFox downloads as randomCharacters.part instead of name.csv
         * In Chrome downloads without prompting the user
         * In Safari opens the file in the browser (text file)
         */
        //var hrefData = "data:text/csv;charset=US-ASCII," + encodeURIComponent(csvStr);
        //document.location.href = hrefData;

        /* Approach 2 : Right-Click Save As... */
        var hrefData = "data:text/csv;charset=US-ASCII," + encodeURIComponent(csvStr);
        var fileLink = document.createElement("a");
        fileLink.href = hrefData;
        fileLink.innerHTML = "download";
        parentTD = inButton.parentNode;
        parentTD.appendChild(fileLink);
        parentTD.removeChild(inButton);
    }
    </script>

    I am looking for an example solution in which the above example table can be downloaded as a CSV file:
    - using a URI
    - the user is prompted to save the file
    - the default filename is the name of the table
    - code works as described in modern versions of FireFox, Safari, & Chrome
    I have added a <script> tag with the DOM traversal function doSomething(). The real help I need is with formatting the URI to what I want within the doSomething() function.
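    A minimal sketch of one way to fill in the missing step, assuming the target browsers honor the HTML5 download attribute on anchor elements (current Firefox and Chrome do; Safari support has historically been limited and may fall back to opening the data URI in the window). The helper name exportCsv is purely illustrative and is not part of the original code:

    function exportCsv(csvStr, tableName) {
        /* Build the data URI from the CSV string assembled by the DOM traversal. */
        var hrefData = "data:text/csv;charset=US-ASCII," + encodeURIComponent(csvStr);
        var fileLink = document.createElement("a");
        fileLink.href = hrefData;
        /* The download attribute asks the browser to save rather than navigate,
         * and supplies the default filename shown in the save dialog. */
        fileLink.download = tableName + ".csv";
        document.body.appendChild(fileLink);   /* Firefox wants the anchor attached to the DOM */
        fileLink.click();                      /* a programmatic click starts the download */
        document.body.removeChild(fileLink);
    }

    Inside doSomething() this could replace the "Approach 2" block as exportCsv(csvStr, name). Note that whether the browser prompts for a save location or saves silently to the default download folder depends on the user's download settings, not on the page.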

    Read the article

  • Managing Many External Hosts Using EC2 and Route 53

    - by futureal
    Looking for a "best practice" answer to managing externally-addressable hosts using the combination of Amazon EC2 and Amazon Route 53, without using Elastic IPs for each host. In my scenario I will have 30+ hosts that need to be accessible from outside EC2, so directly using internal DNS will not work. In the past, I have addressed hosts by assigning an elastic IP to that host (let's say, 55.55.55.55) and then creating an associated A record. For example, let's say I want to create "ec2-corp01.mydomain.com" I might do: ec2-corp01.mydomain.com. A 55.55.55.55 300 Then on that EC2 instance, I would assign the Elastic IP of 55.55.55.55, and everything works fine. Of course, to make this work, I need to have one Elastic IP per instance, which is something I'd like to avoid if possible; I'd like the infrastructure to be more dynamic. So my thought is to try something like: Create a script that queries the internal EC2 tools to determine an instance's private hostname On instance boot, call that script to determine its hostname, and then using the command-line Route 53 interface to find and update that hostname to its current internal hostname Since the host will have a relatively low TTL (let's say 300 as above, or 5 minutes) it should take effect pretty quickly Is this a good idea? Is there a better or more widely accepted way to handle it? If it IS a good idea, what type of record should I be creating? A CNAME that points to the internal host, like ec2-55-55-55-55.compute-1.amazonaws.com? Is an A record better or worse? Thanks!

    Read the article

  • Slow network file transfer (under 20KB/s) on newly built x64 Win7

    - by Mangoshake
    I am getting <20KB/s for local network file transfers. If I transfer a very small file (less than 100KB) it starts quickly and then slows down to <20KB/s; all subsequent network file transfers are slow, and a reboot is needed to reset this. If I transfer a large file it is stuck on "calculating" for a long time and then begins at <20KB/s immediately. This is a newly built desktop running Windows 7 x64 SP1 with Realtek gigabit LAN from the motherboard (ASRock Extreme3 Gen3). The problematic speed is observed on the private LAN, both through Ethernet and WiFi. The router is a D-Link DIR-655. Remote Differential Compression is off. Drivers are up to date from ASRock's website. I have tested network file transfers to and from another Windows 7 laptop and a MacBook Pro, so I am fairly certain it is the desktop's problem. The slow speed also only happens in one direction, outbound from the desktop, regardless of whether I initiate the file transfer from the origin or the destination. Inbound network file transfers and internet speeds are fine, so I don't think this is a hardware issue. I am getting 74.8MB/s internet upload speed from speedtest.net (http://www.speedtest.net/result/1852752479.png), and inbound network file transfers run at around 10-15MB/s. I am hoping this community has some insight to help me troubleshoot this. I don't see anything obviously related in the Event Viewer, and beyond that I just don't know where else to look. Any suggestions are greatly appreciated; thank you in advance.

    Read the article

  • ESX hosts lose connectivity with iSCSI SAN LUNs

    - by Themist
    I've been experiencing this issue for a couple of months now: my ESX hosts lose connectivity with my iSCSI SAN VMFS volumes. As a result the ESX hosts enter a nonresponsive mode, the associated VMs disconnect, and the only remedy is to reboot the host. The issue happens randomly. I have escalated it with VMware but I haven't had any solution yet. I see no errors on my switches and there are no hardware issues either. My SAN infrastructure is solid and there are 2 paths for every VMFS volume. Has anybody else experienced a similar issue?
    Edit: Here are some more details. The iSCSI SAN software is DataCore SANmelody 2.0.4.2 running on 2 HP ProLiant G5 servers. The storage attached to each of the servers is an HP MSA70, and all the iSCSI SAN volumes presented to my 4 ESX hosts are mirrored. I have two iSCSI switches, HP ProCurve 1800G-24, that are trunked together. My SANmelody servers use NC360T NICs; I team two NICs and have one cable connecting to each iSCSI switch. Each ESX server also uses two NICs for the iSCSI network.

    Read the article

  • Cisco ASA 5505 network route for static IP hosts

    - by TheCapn
    I've configured my internal VLAN using the most basic settings, where ports 1-7 are assigned addresses from a pool in the range 192.168.15.5 - 192.168.15.36. These hosts are given access to the internet and it works great. What I'm trying to set up now is for users who connect to the device and specify their own IP (say I connect and request 192.168.15.45) to be given internet access while still working alongside the DHCP hosts. Those with a DHCP-assigned address are blocked from the internet. Mostly the issue is that I am very new to working with the device; I feel that the solution is easy but I'm not looking in the right spots and don't have the correct terminology down to google it. Do I need to define access control lists? Group policies? A new VLAN? The rules that are set up seem to apply to the entire /24 subnet, but when I request a static IP outside of the DHCP range I get blocked from other hosts and the internet.

    Read the article

  • Keeping track of File System Utilization in Ops Center 12c

    - by S Stelting
    Enterprise Manager Ops Center 12c provides significant monitoring capabilities combined with very flexible incident management. These capabilities even extend to monitoring the file systems associated with Solaris or Linux assets. Depending on your needs you can monitor and manage incidents, or you can fine-tune alert monitoring rules for specific file systems. This article will show you how to use Ops Center 12c to:
    - Track file system utilization
    - Adjust file system monitoring rules
    - Disable file system rules
    - Create custom monitoring rules
    If you're interested in this topic, please join us for a WebEx presentation!
    - Date: Thursday, November 8, 2012
    - Time: 11:00 am, Eastern Standard Time (New York, GMT-05:00)
    - Meeting Number: 598 796 842
    - Meeting Password: oracle123
    To join the online meeting, go to https://oracleconferencing.webex.com/oracleconferencing/j.php?ED=209833597&UID=1512095432&PW=NOWQ3YjJlMmYy&RT=MiMxMQ%3D%3D, enter your name and email address if requested, enter the meeting password (oracle123) if required, and click "Join". To view in other time zones or languages, please click the link: https://oracleconferencing.webex.com/oracleconferencing/j.php?ED=209833597&UID=1512095432&PW=NOWQ3YjJlMmYy&ORT=MiMxMQ%3D%3D
    Monitoring File Systems for OS Assets. The Libraries tab provides basic, device-level information about the storage associated with an OS instance. This tab shows you the local file system associated with the instance and any shared storage libraries mounted by Ops Center. More detailed information about file system storage is available under the Analytics tab, in the sub-tab named Charts. Here you can select and display the individual mount points of an OS, and export the utilization data if desired. In this example, the OS instance has a basic root file partition and several NFS directories. Each file system mount point can be independently chosen for display in the Ops Center chart.
    File Systems and Incident Reporting. Every asset managed by Ops Center has a "monitoring policy", which determines what represents a reportable issue with the asset. The policy is made up of a set of monitoring rules, where each rule describes:
    - an attribute to monitor
    - the conditions which represent an issue
    - the level or levels of severity for the issue
    When the conditions are met, Ops Center sends a notification and creates an incident. By default, OS instances have three monitoring rules associated with file systems:
    - File System Reachability: triggers an incident if a file system is not reachable
    - NAS Library Status: triggers an incident for a value of "WARNING" or "DEGRADED" for a NAS-based file system
    - File System Used Space Percentage: triggers an incident when file system utilization grows beyond defined thresholds
    You can view these rules in the Monitoring tab for an OS. The catch with the default monitoring rules is that they apply to every file system associated with an OS instance. As a result, any issue with NAS accessibility or disk utilization will trigger an incident, and incidents for file systems can be reported multiple times if the same shared storage is used by many assets. Depending on the level of control you'd like, there are a number of ways to fine-tune incident reporting. Note that any change to an asset's monitoring policy will detach it from the default, creating a new monitoring policy for the asset.
    If you'd like, you can extract a monitoring policy from an asset, which allows you to save it and apply the customized monitoring profile to other OS assets.
    Solution #1: Modify the Reporting Thresholds. In some cases, you may want to modify the basic conditions for incident reporting on your file systems. The changes you make to a default monitoring rule will apply to all of the file systems associated with your operating system. Selecting the File System Used Space Percentage entry and clicking the "Edit Alert Monitoring Rule Parameters" button opens a pop-up dialog which allows you to modify the rule. The first screen lets you decide when to check file system usage and how long to wait before opening an incident in Ops Center. By default, Ops Center monitors continuously and reports disk utilization issues which exist for more than 15 minutes. The second screen lets you define the actual threshold values. By default, Ops Center opens a Warning level incident if utilization rises above 80%, and a Critical level incident for utilization above 95%.
    Solution #2: Disable Incident Reporting for File Systems. If you'd rather not report file system incidents, you can disable the monitoring rules altogether. In this case, you select the monitoring rules and click the "Disable Alert Monitoring Rule(s)" button to open the pop-up confirmation dialog. Like the first solution, this option affects all file system monitoring. It allows you to completely disable incident reporting for NAS library status or file system space consumption.
    Solution #3: Create New Monitoring Rules for Specific File Systems. If you'd like the greatest flexibility when monitoring file systems, you can create entirely new rules. Clicking "Add Alert Monitoring Rule" (the icon with the green plus sign) opens a wizard which allows you to define a new rule. This rule will be based on a threshold and will be used to monitor operating system assets. We'd like to add a rule to track disk utilization for a specific file system, the /nfs-guest directory. To do this, we specify the following attribute: FileSystemUsages.name=/nfs-guest.usedSpacePercentage. The value of name in the attribute lets us identify a specific NFS shared directory or file system; in the case of this OS, we could have chosen any of the values shown in the File Systems Utilization chart at the beginning of this article. usedSpacePercentage lets us define a threshold based on the percentage of total disk space used. There are a number of other values that we could use for threshold-based monitoring of FileSystemUsages, including:
    - freeSpace
    - freeSpacePercentage
    - totalSpace
    - usedSpace
    - usedSpacePercentage
    The final sections of the screen let us determine when to monitor disk usage and how long to wait after utilization reaches a threshold before creating an incident. The next screen lets us define the threshold values and severity levels for the monitoring rule. If historical data is available, Ops Center will display it in the screen. Clicking the Apply button creates the new monitoring rule and activates it in your monitoring policy. If you combine this with one of the previous solutions, you can precisely define which file systems will generate incidents and notifications. For example, this monitoring policy has the default "File System Used Space Percentage" rule disabled, but the new rule reports ONLY on utilization for the /nfs-guest directory.

    Read the article

  • Bundler isn't loading gems

    - by Garrett
    I have been having a problem with using Bundler and being able to access my gems without having to require them somewhere, as config.gem used to do that for me (as far as I know). In my Rails 3 app, I defined my Gemfile like so: clear_sources source "http://gemcutter.org" source "http://gems.github.com" bundle_path "vendor/bundler_gems" ## Bundle edge rails: git "git://github.com/rails/arel.git" git "git://github.com/rails/rack.git" gem "rails", :git => "git://github.com/rails/rails.git" ## Bundle gem "mongo_mapper", :git => "git://github.com/jnunemaker/mongomapper.git" gem "bluecloth", ">= 2.0.0" Then I run gem bundle, it bundles it all up like expected. Inside the environment.rb file that is included within boot.rb it looks like this: # DO NOT MODIFY THIS FILE module Bundler file = File.expand_path(__FILE__) dir = File.dirname(file) ENV["PATH"] = "#{dir}/../../../../bin:#{ENV["PATH"]}" ENV["RUBYOPT"] = "-r#{file} #{ENV["RUBYOPT"]}" $LOAD_PATH.unshift File.expand_path("#{dir}/gems/builder-2.1.2/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/builder-2.1.2/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/text-hyphen-1.0.0/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/text-hyphen-1.0.0/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/i18n-0.3.3/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/i18n-0.3.3/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/arel/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/arel/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/activemodel/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/activemodel/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/jnunemaker-validatable-1.8.1/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/jnunemaker-validatable-1.8.1/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/abstract-1.0.0/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/abstract-1.0.0/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/erubis-2.6.5/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/erubis-2.6.5/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/mime-types-1.16/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/mime-types-1.16/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/mail-2.1.2/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/mail-2.1.2/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/rake-0.8.7/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/rake-0.8.7/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/railties/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/railties/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/memcache-client-1.7.7/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/memcache-client-1.7.7/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rack/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rack/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/rack-test-0.5.3/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/rack-test-0.5.3/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/rack-mount-0.4.5/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/rack-mount-0.4.5/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/actionpack/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/actionpack/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/bluecloth-2.0.7/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/bluecloth-2.0.7/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/bluecloth-2.0.7/ext") 
$LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/activerecord/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/activerecord/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/text-format-1.0.0/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/text-format-1.0.0/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/actionmailer/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/actionmailer/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/tzinfo-0.3.16/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/tzinfo-0.3.16/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/activesupport/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/activesupport/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/activeresource/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/activeresource/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/rails/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/mongo-0.18.2/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/gems/mongo-0.18.2/lib") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/mongomapper/bin") $LOAD_PATH.unshift File.expand_path("#{dir}/dirs/mongomapper/lib") @gemfile = "#{dir}/../../../../Gemfile" require "rubygems" unless respond_to?(:gem) # 1.9 already has RubyGems loaded @bundled_specs = {} @bundled_specs["builder"] = eval(File.read("#{dir}/specifications/builder-2.1.2.gemspec")) @bundled_specs["builder"].loaded_from = "#{dir}/specifications/builder-2.1.2.gemspec" @bundled_specs["text-hyphen"] = eval(File.read("#{dir}/specifications/text-hyphen-1.0.0.gemspec")) @bundled_specs["text-hyphen"].loaded_from = "#{dir}/specifications/text-hyphen-1.0.0.gemspec" @bundled_specs["i18n"] = eval(File.read("#{dir}/specifications/i18n-0.3.3.gemspec")) @bundled_specs["i18n"].loaded_from = "#{dir}/specifications/i18n-0.3.3.gemspec" @bundled_specs["arel"] = eval(File.read("#{dir}/specifications/arel-0.2.pre.gemspec")) @bundled_specs["arel"].loaded_from = "#{dir}/specifications/arel-0.2.pre.gemspec" @bundled_specs["activemodel"] = eval(File.read("#{dir}/specifications/activemodel-3.0.pre.gemspec")) @bundled_specs["activemodel"].loaded_from = "#{dir}/specifications/activemodel-3.0.pre.gemspec" @bundled_specs["jnunemaker-validatable"] = eval(File.read("#{dir}/specifications/jnunemaker-validatable-1.8.1.gemspec")) @bundled_specs["jnunemaker-validatable"].loaded_from = "#{dir}/specifications/jnunemaker-validatable-1.8.1.gemspec" @bundled_specs["abstract"] = eval(File.read("#{dir}/specifications/abstract-1.0.0.gemspec")) @bundled_specs["abstract"].loaded_from = "#{dir}/specifications/abstract-1.0.0.gemspec" @bundled_specs["erubis"] = eval(File.read("#{dir}/specifications/erubis-2.6.5.gemspec")) @bundled_specs["erubis"].loaded_from = "#{dir}/specifications/erubis-2.6.5.gemspec" @bundled_specs["mime-types"] = eval(File.read("#{dir}/specifications/mime-types-1.16.gemspec")) @bundled_specs["mime-types"].loaded_from = "#{dir}/specifications/mime-types-1.16.gemspec" @bundled_specs["mail"] = eval(File.read("#{dir}/specifications/mail-2.1.2.gemspec")) @bundled_specs["mail"].loaded_from = "#{dir}/specifications/mail-2.1.2.gemspec" @bundled_specs["rake"] = eval(File.read("#{dir}/specifications/rake-0.8.7.gemspec")) @bundled_specs["rake"].loaded_from = "#{dir}/specifications/rake-0.8.7.gemspec" @bundled_specs["railties"] = eval(File.read("#{dir}/specifications/railties-3.0.pre.gemspec")) @bundled_specs["railties"].loaded_from 
= "#{dir}/specifications/railties-3.0.pre.gemspec" @bundled_specs["memcache-client"] = eval(File.read("#{dir}/specifications/memcache-client-1.7.7.gemspec")) @bundled_specs["memcache-client"].loaded_from = "#{dir}/specifications/memcache-client-1.7.7.gemspec" @bundled_specs["rack"] = eval(File.read("#{dir}/specifications/rack-1.1.0.gemspec")) @bundled_specs["rack"].loaded_from = "#{dir}/specifications/rack-1.1.0.gemspec" @bundled_specs["rack-test"] = eval(File.read("#{dir}/specifications/rack-test-0.5.3.gemspec")) @bundled_specs["rack-test"].loaded_from = "#{dir}/specifications/rack-test-0.5.3.gemspec" @bundled_specs["rack-mount"] = eval(File.read("#{dir}/specifications/rack-mount-0.4.5.gemspec")) @bundled_specs["rack-mount"].loaded_from = "#{dir}/specifications/rack-mount-0.4.5.gemspec" @bundled_specs["actionpack"] = eval(File.read("#{dir}/specifications/actionpack-3.0.pre.gemspec")) @bundled_specs["actionpack"].loaded_from = "#{dir}/specifications/actionpack-3.0.pre.gemspec" @bundled_specs["bluecloth"] = eval(File.read("#{dir}/specifications/bluecloth-2.0.7.gemspec")) @bundled_specs["bluecloth"].loaded_from = "#{dir}/specifications/bluecloth-2.0.7.gemspec" @bundled_specs["activerecord"] = eval(File.read("#{dir}/specifications/activerecord-3.0.pre.gemspec")) @bundled_specs["activerecord"].loaded_from = "#{dir}/specifications/activerecord-3.0.pre.gemspec" @bundled_specs["text-format"] = eval(File.read("#{dir}/specifications/text-format-1.0.0.gemspec")) @bundled_specs["text-format"].loaded_from = "#{dir}/specifications/text-format-1.0.0.gemspec" @bundled_specs["actionmailer"] = eval(File.read("#{dir}/specifications/actionmailer-3.0.pre.gemspec")) @bundled_specs["actionmailer"].loaded_from = "#{dir}/specifications/actionmailer-3.0.pre.gemspec" @bundled_specs["tzinfo"] = eval(File.read("#{dir}/specifications/tzinfo-0.3.16.gemspec")) @bundled_specs["tzinfo"].loaded_from = "#{dir}/specifications/tzinfo-0.3.16.gemspec" @bundled_specs["activesupport"] = eval(File.read("#{dir}/specifications/activesupport-3.0.pre.gemspec")) @bundled_specs["activesupport"].loaded_from = "#{dir}/specifications/activesupport-3.0.pre.gemspec" @bundled_specs["activeresource"] = eval(File.read("#{dir}/specifications/activeresource-3.0.pre.gemspec")) @bundled_specs["activeresource"].loaded_from = "#{dir}/specifications/activeresource-3.0.pre.gemspec" @bundled_specs["rails"] = eval(File.read("#{dir}/specifications/rails-3.0.pre.gemspec")) @bundled_specs["rails"].loaded_from = "#{dir}/specifications/rails-3.0.pre.gemspec" @bundled_specs["mongo"] = eval(File.read("#{dir}/specifications/mongo-0.18.2.gemspec")) @bundled_specs["mongo"].loaded_from = "#{dir}/specifications/mongo-0.18.2.gemspec" @bundled_specs["mongo_mapper"] = eval(File.read("#{dir}/specifications/mongo_mapper-0.6.10.gemspec")) @bundled_specs["mongo_mapper"].loaded_from = "#{dir}/specifications/mongo_mapper-0.6.10.gemspec" def self.add_specs_to_loaded_specs Gem.loaded_specs.merge! @bundled_specs end def self.add_specs_to_index @bundled_specs.each do |name, spec| Gem.source_index.add_spec spec end end add_specs_to_loaded_specs add_specs_to_index def self.require_env(env = nil) context = Class.new do def initialize(env) @env = env && env.to_s ; end def method_missing(*) ; yield if block_given? ; end def only(*env) old, @only = @only, _combine_only(env.flatten) yield @only = old end def except(*env) old, @except = @except, _combine_except(env.flatten) yield @except = old end def gem(name, *args) opt = args.last.is_a?(Hash) ? 
args.pop : {} only = _combine_only(opt[:only] || opt["only"]) except = _combine_except(opt[:except] || opt["except"]) files = opt[:require_as] || opt["require_as"] || name files = [files] unless files.respond_to?(:each) return unless !only || only.any? {|e| e == @env } return if except && except.any? {|e| e == @env } if files = opt[:require_as] || opt["require_as"] files = Array(files) files.each { |f| require f } else begin require name rescue LoadError # Do nothing end end yield if block_given? true end private def _combine_only(only) return @only unless only only = [only].flatten.compact.uniq.map { |o| o.to_s } only &= @only if @only only end def _combine_except(except) return @except unless except except = [except].flatten.compact.uniq.map { |o| o.to_s } except |= @except if @except except end end context.new(env && env.to_s).instance_eval(File.read(@gemfile), @gemfile, 1) end end module Gem @loaded_stacks = Hash.new { |h,k| h[k] = [] } def source_index.refresh! super Bundler.add_specs_to_index end end But when I try to access any of my gems, e.g. MongoMapper, Paperclip, Haml, etc. I get: NameError: uninitialized constant MongoMapper The same goes for any other gem. Does Bundler not include gems like the old Rails 2.0 did? Or is something messed up with my system? Any help would be appreciated, thank you!

    Read the article

  • Installation of SAP Web Application Server 6.20 in Windows Vista

    - by karthikeyan b
    I tried installing trial version of sap netweaver 7.1 in windows vista in my laptop but i couldnt succeed there.. then i tried installing SAP WEB AS 6.20 now.I am able to succeed till 91% completion.After that i get some errors and the installation stops... if anybody have any experiences please share.it will be really helpful.I mentioned the complete log details below. nfo: INSTGUI.EXE Protocol version is 10. Message checksum is 7613888. Info: INSTGUI MessageFile Start message loading... Info: INSTGUI MessageFile Finished message loading. Info: InstController Prepare {} {} R3SETUP Version: Apr 24 2002 Info: InstController Prepare {} {} Logfile will be set to E:\R3SETUP\BSP.log Check E:\R3SETUP\BSP.log for further messages. Info: CommandFileController SyFileVersionSave {} {} Saving original content of file E:\R3SETUP\BSP.R3S ... Warning: CommandFileController SyFileCopy {} {} Function CopyFile() failed at location SyFileCopy-681 Warning: CommandFileController SyFileCopy {} {} errno: 5: Access is denied. Warning: CommandFileController SyFileCreateWithPermissions {} {} errno: 13: Permission denied Warning: CommandFileController SyPermissionSet {} {} Function SetNamedSecurityInfo() failed for E:\R3SETUP\BSP.R3S at location SyPermissionSet-2484 Warning: CommandFileController SyPermissionSet {} {} errno: 5: Access is denied. Error: CommandFileController StoreMasterTableFromCommandFile 2 333 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 333 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 309 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 309 CommandFile could not be updated Info: CDSERVERBASE InternalColdKeyCheck 2 309 The CD KERNEL will not be copied. Info: CDSERVERBASE InternalColdKeyCheck 2 309 The CD DATA1 will not be copied. Info: CDSERVERBASE InternalColdKeyCheck 2 309 The CD DATA2 will not be copied. Error: CommandFileController StoreMasterTableFromCommandFile 2 333 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 333 CommandFile could not be updated Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 checking host name lookup for 'Karthikeyan' Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 offical host name for 'Karthikeyan' is 'Karthikeyan'. Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 host 'Karthikeyan' has ip address '115.184.71.93' Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 offical host name for '115.184.71.93' is 'Karthikeyan'. Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 checking host name lookup for 'Karthikeyan' Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 offical host name for 'Karthikeyan' is 'Karthikeyan'. Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 host 'Karthikeyan' has ip address '115.184.71.93' Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 offical host name for '115.184.71.93' is 'Karthikeyan'. Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 checking host name lookup for 'Karthikeyan' Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 offical host name for 'Karthikeyan' is 'Karthikeyan'. Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 host 'Karthikeyan' has ip address '115.184.71.93' Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 offical host name for '115.184.71.93' is 'Karthikeyan'. 
Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 checking host name lookup for 'Karthikeyan' Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 offical host name for 'Karthikeyan' is 'Karthikeyan'. Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 host 'Karthikeyan' has ip address '115.184.71.93' Info: CENTRDBINSTANCE_NT_ADA SyCheckHostnameLookup 2 333 offical host name for '115.184.71.93' is 'Karthikeyan'. Error: CommandFileController StoreMasterTableFromCommandFile 2 99 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 99 CommandFile could not be updated Warning: ADADBINSTANCE_IND_ADA GetConfirmationFor 2 58 Cleanup database instance BSP for new installation. Error: CommandFileController StoreMasterTableFromCommandFile 2 58 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 58 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 333 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 333 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1206 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1206 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 75 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 75 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 247 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 247 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1195 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1195 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1195 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1195 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 120 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 120 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 816 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 816 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 242 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 242 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 816 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 816 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 816 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 816 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 816 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 816 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 815 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 815 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 223 Command file could not be opened. 
Error: CommandFileController SetKeytableForSect 2 223 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 10 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 10 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 263 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 263 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 263 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 263 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 263 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 263 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 760 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 760 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1267 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1267 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1111 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1111 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1122 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1122 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1114 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1114 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 54 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 54 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1146 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1146 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 718 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 718 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 760 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 760 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 333 Requesting Installation Details Error: CommandFileController StoreMasterTableFromCommandFile 2 333 Command file could not be opened. 
Error: CommandFileController SetKeytableForSect 2 333 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 333 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 333 CommandFile could not be updated Info: CDSERVERBASE InternalWarmKeyCheck 2 309 The CD KERNEL will not be copied. Info: CDSERVERBASE InternalWarmKeyCheck 2 309 The CD DATA1 will not be copied. Info: CDSERVERBASE InternalWarmKeyCheck 2 309 The CD DATA2 will not be copied. Info: InstController MakeStepsDeliver 2 309 Requesting Information on CD-ROMs Error: CommandFileController StoreMasterTableFromCommandFile 2 309 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 309 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 309 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 309 CommandFile could not be updated Info: CENTRDBINSTANCE_NT_ADA InternalWarmKeyCheck 2 333 The installation phase is starting now. Please look in the log file for further information about current actions. Info: InstController MakeStepsDeliver 2 333 Requesting Installation Details Error: CommandFileController StoreMasterTableFromCommandFile 2 333 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 333 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 333 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 333 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 99 Defining Key Values Error: CommandFileController StoreMasterTableFromCommandFile 2 99 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 99 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 99 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 99 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 58 Requesting Setup Details Error: CommandFileController StoreMasterTableFromCommandFile 2 58 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 58 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 58 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 58 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 333 Requesting Installation Details Error: CommandFileController StoreMasterTableFromCommandFile 2 333 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 333 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 333 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 333 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 1206 Setting Users for Single DB landscape Error: CommandFileController StoreMasterTableFromCommandFile 2 1206 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1206 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1206 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1206 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 75 Stopping the SAP DB Instance Error: CommandFileController StoreMasterTableFromCommandFile 2 75 Command file could not be opened. 
Error: CommandFileController SetKeytableForSect 2 75 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 75 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 75 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 247 Stopping the SAP DB remote server Error: CommandFileController StoreMasterTableFromCommandFile 2 247 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 247 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 247 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 247 CommandFile could not be updated Warning: CDSERVERBASE ConfirmKey 2 1195 Can not read from Z:. Please ensure this path is accessible. Info: LvKeyRequest For further information see HTML documentation: step: SAPDBSETCDPATH_IND_IND and key: KERNEL_LOCATION Info: InstController MakeStepsDeliver 2 1195 Installing SAP DB Software Error: CommandFileController StoreMasterTableFromCommandFile 2 1195 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1195 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1195 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1195 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 1195 Installing SAP DB Software Info: SAPDBINSTALL_IND_ADA SyCoprocessCreate 2 1195 Creating coprocess E:\sapdb\NT\I386\sdbinst.exe ... Info: SAPDBINSTALL_IND_ADA ExecuteDo 2 1195 RC code form SyCoprocessWait = 0 . Error: CommandFileController StoreMasterTableFromCommandFile 2 1195 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1195 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1195 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1195 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 120 Extracting SAP DB Tools Software Info: ADAEXTRACTLCTOOLS_NT_ADA SyCoprocessCreate 2 120 Creating coprocess SAPCAR ... Info: ADAEXTRACTLCTOOLS_NT_ADA SyCoprocessCreate 2 120 Creating coprocess SAPCAR ... Error: CommandFileController StoreMasterTableFromCommandFile 2 120 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 120 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 120 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 120 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 816 Extracting the Database-Dependent SAP system Executables Info: ADAEXTRACTBSPCFG SyCoprocessCreate 2 816 Creating coprocess SAPCAR ... Info: ADAEXTRACTBSPCFG SyCoprocessCreate 2 816 Creating coprocess SAPCAR ... Error: CommandFileController StoreMasterTableFromCommandFile 2 816 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 816 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 816 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 816 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 242 Starting VSERVER Info: ADAXSERVER_NT_ADA SyCoprocessCreate 2 242 Creating coprocess C:\sapdb\programs\bin\x_server.exe ... Info: ADAXSERVER_NT_ADA ExecuteDo 2 242 RC code form SyCoprocessWait = 0 . 
Error: CommandFileController StoreMasterTableFromCommandFile 2 242 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 242 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 242 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 242 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 816 Extracting the Database-Dependent SAP system Executables Info: EXTRACTSAPEXEDBDATA1 SyCoprocessCreate 2 816 Creating coprocess SAPCAR ... Info: EXTRACTSAPEXEDBDATA1 SyCoprocessCreate 2 816 Creating coprocess SAPCAR ... Error: CommandFileController StoreMasterTableFromCommandFile 2 816 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 816 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 816 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 816 CommandFile could not be updated Warning: CDSERVERBASE ConfirmKey 2 816 Can not read from Y:. Please ensure this path is accessible. Info: LvKeyRequest For further information see HTML documentation: step: EXTRACTSAPEXEDBDATA2 and key: DATA1_LOCATION Info: InstController MakeStepsDeliver 2 816 Extracting the Database-Dependent SAP system Executables Info: EXTRACTSAPEXEDBDATA2 SyCoprocessCreate 2 816 Creating coprocess SAPCAR ... Info: EXTRACTSAPEXEDBDATA2 SyCoprocessCreate 2 816 Creating coprocess SAPCAR ... Error: CommandFileController StoreMasterTableFromCommandFile 2 816 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 816 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 816 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 816 CommandFile could not be updated Warning: CDSERVERBASE ConfirmKey 2 816 Can not read from X:. Please ensure this path is accessible. Info: LvKeyRequest For further information see HTML documentation: step: EXTRACTSAPEXEDBDATA3 and key: DATA2_LOCATION Info: InstController MakeStepsDeliver 2 816 Extracting the Database-Dependent SAP system Executables Info: EXTRACTSAPEXEDBDATA3 SyCoprocessCreate 2 816 Creating coprocess SAPCAR ... Info: EXTRACTSAPEXEDBDATA3 SyCoprocessCreate 2 816 Creating coprocess SAPCAR ... Error: CommandFileController StoreMasterTableFromCommandFile 2 816 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 816 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 816 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 816 CommandFile could not be updated Warning: CDSERVERBASE ConfirmKey 2 815 We tried to find the label SAPDB:MINI-WAS-DEMO:620:KERNEL for CD KERNEL in path E:. But the check was not successfull. Info: LvKeyRequest For further information see HTML documentation: step: EXTRACTSAPEXE and key: KERNEL_LOCATION Info: InstController MakeStepsDeliver 2 815 Extracting the SAP Executables Info: EXTRACTSAPEXE SyCoprocessCreate 2 815 Creating coprocess SAPCAR ... Info: EXTRACTSAPEXE SyCoprocessCreate 2 815 Creating coprocess SAPCAR ... Error: CommandFileController StoreMasterTableFromCommandFile 2 815 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 815 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 815 Command file could not be opened. 
Error: CommandFileController SetKeytableForSect 2 815 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 223 Setting new Rundirectory. Info: ADADBREGISTER_IND_ADA SyCoprocessCreate 2 223 Creating coprocess C:\sapdb\programs\pgm\dbmcli.exe ... Info: ADADBREGISTER_IND_ADA ExecuteDo 2 223 RC code form SyCoprocessWait = 0 . Error: CommandFileController StoreMasterTableFromCommandFile 2 223 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 223 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 223 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 223 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 1 ADASETDEVSPACES_IND_ADA Error: CommandFileController StoreMasterTableFromCommandFile 2 1 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 10 Performing Service BCHECK Error: CommandFileController StoreMasterTableFromCommandFile 2 10 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 10 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 10 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 10 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 263 Creating XUSER File for the User ADM for Dialog Instance Info: ADAXUSERSIDADM_DEFAULT_NT_ADA SyCoprocessCreate 2 263 Creating coprocess C:\sapdb\programs\pgm\dbmcli.exe ... Info: ADAXUSERSIDADM_DEFAULT_NT_ADA ExecuteDo 2 263 RC code form SyCoprocessWait = 0 . Error: CommandFileController StoreMasterTableFromCommandFile 2 263 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 263 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 263 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 263 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 263 Creating XUSER File for the User ADM for Dialog Instance Info: ADAXUSERSIDADM_COLD_NT_ADA SyCoprocessCreate 2 263 Creating coprocess C:\sapdb\programs\pgm\dbmcli.exe ... Info: ADAXUSERSIDADM_COLD_NT_ADA ExecuteDo 2 263 RC code form SyCoprocessWait = 0 . Error: CommandFileController StoreMasterTableFromCommandFile 2 263 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 263 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 263 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 263 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 263 Creating XUSER File for the User ADM for Dialog Instance Info: ADAXUSERSIDADM_WARM_NT_ADA SyCoprocessCreate 2 263 Creating coprocess C:\sapdb\programs\pgm\dbmcli.exe ... Info: ADAXUSERSIDADM_WARM_NT_ADA ExecuteDo 2 263 RC code form SyCoprocessWait = 0 . Error: CommandFileController StoreMasterTableFromCommandFile 2 263 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 263 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 263 Command file could not be opened. 
Error: CommandFileController SetKeytableForSect 2 263 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 760 Creating the Default Profile Info: DEFAULTPROFILE_IND_IND SyFileVersionSave 2 760 Saving original content of file C:\MiniWAS\DEFAULT.PFL ... Error: CommandFileController StoreMasterTableFromCommandFile 2 760 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 760 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 760 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 760 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 1267 Modifying or Creating the TPPARAM File Info: TPPARAMMODIFY_NT_ADA SyFileVersionSave 2 1267 Saving original content of file C:\MiniWAS\trans\bin\TPPARAM ... Error: CommandFileController StoreMasterTableFromCommandFile 2 1267 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1267 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1267 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1267 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 1111 Creating the Service Entry for the Dispatcher Info: R3DISPATCHERPORT_IND_IND IaServicePortAppend 2 1111 Checking service name sapdp00, protocol tcp, port number 3200 ... Info: R3DISPATCHERPORT_IND_IND IaServicePortAppend 2 1111 Port name sapdp00 is known and the port number 3200 is equal to the existing port number 3200 Error: CommandFileController StoreMasterTableFromCommandFile 2 1111 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1111 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1111 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1111 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 1122 Creating the Service Entry for the Message Server Info: R3MESSAGEPORT_IND_IND IaServicePortAppend 2 1122 Checking service name sapmsBSP, protocol tcp, port number 3600 ... Info: R3MESSAGEPORT_IND_IND IaServicePortAppend 2 1122 Port name sapmsBSP is known and the port number 3600 is equal to the existing port number 3600 Error: CommandFileController StoreMasterTableFromCommandFile 2 1122 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1122 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1122 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1122 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 1114 Creating the Service Entry for the Gateway Service Info: R3GATEWAYPORT_IND_IND IaServicePortAppend 2 1114 Checking service name sapgw00, protocol tcp, port number 3300 ... Info: R3GATEWAYPORT_IND_IND IaServicePortAppend 2 1114 Port name sapgw00 is known and the port number 3300 is equal to the existing port number 3300 Error: CommandFileController StoreMasterTableFromCommandFile 2 1114 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1114 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1114 Command file could not be opened. 
Error: CommandFileController SetKeytableForSect 2 1114 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 1 PISTARTBSP Info: PISTARTBSP SyDirCreate 2 1 Checking existence of directory C:\Users\Karthikeyan\Desktop. If it does not exist creating it with user , group and permission 0 ... Error: CommandFileController StoreMasterTableFromCommandFile 2 1 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 1 PISTARTBSPGROUP Info: PISTARTBSPGROUP SyDirCreate 2 1 Checking existence of directory C:\Users\Karthikeyan\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Mini SAP Web Application Server. If it does not exist creating it with user , group and permission 0 ... Error: CommandFileController StoreMasterTableFromCommandFile 2 1 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1 CommandFile could not be updated Error: CommandFileController StoreMasterTableFromCommandFile 2 1 Command file could not be opened. Error: CommandFileController SetKeytableForSect 2 1 CommandFile could not be updated Info: InstController MakeStepsDeliver 2 54 Starting SAP DB to Mode WARM Error: Command

    Read the article

  • Copy a file to a new directory path in DOS

    - by nodmonkey
    How can I copy a file using DOS commands into a directory structure that may not yet exist? I need to be able to force the creation of the directory path to the target file location if that location doesn't already exist. For example, there is already a file.txt in this location: C:\file.txt And I want to copy it to C:\example\new\path\to\copy\of\file\file.txt but at this time C:\example\ and all the subdirectories may or may not yet exist. Basically, I am looking for a "copy and create the target path if necessary" command. What would you recommend is the best way to achieve this?
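
    A hedged sketch of two common approaches (assuming cmd.exe with command extensions enabled, and using the example paths from the question): xcopy can create the missing directories itself once you answer its file-or-directory prompt, or you can build the path first with mkdir and then copy.

        rem Answer xcopy's "file or directory?" prompt with F so it creates the full path
        echo F| xcopy "C:\file.txt" "C:\example\new\path\to\copy\of\file\file.txt"

        rem Alternative: mkdir creates all intermediate directories, then copy the file
        mkdir "C:\example\new\path\to\copy\of\file" 2>nul
        copy "C:\file.txt" "C:\example\new\path\to\copy\of\file\file.txt"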

    Read the article

  • Windows 2008 File Share

    - by user36540
    Hi, I have 3 Windows 2008 Standard servers in my system with no domain controller. Two of the servers are running an NLB cluster, and the third server is a file server that the web servers connect to. I want to store my source code on the file server and point the IIS config to the network file share. The web sites also need access to a file share on the file server. I was able to share the network drive and access it while logged into either of the web servers, but my web apps are unable to access the file share - I assume due to permissions. Does anybody know the correct way to do this? Thanks, Chris
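
    Without a domain, one commonly used pattern is "mirrored" local accounts: create the same username and password on the file server and on both web servers, run the IIS application pool (or the site's connect-as identity) under that account, and grant that account both share and NTFS permissions on the file server. The sketch below is only an outline of that idea; the account, share name, and path are placeholders.

        rem On the file server and on each web server, create an identical local account
        net user svc_web SomeStrongPassword /add

        rem On the file server, grant that account access to the share and to the folder itself
        net share WebFiles=D:\WebFiles /grant:svc_web,CHANGE
        icacls D:\WebFiles /grant svc_web:(OI)(CI)M

    With the app pools running as svc_web on the web servers, the web apps should present matching credentials when they open \\fileserver\WebFiles.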

    Read the article

  • Delete file then run file at startup

    - by Henry Gibson
    I'm running the music player Foobar2000 through Wine at startup. For some reason, when I shut down Ubuntu the Foobar2000 process is ended abnormally in Wine, and the next time it runs I get an annoying "start in safe mode?" message. Not a huge problem, but I'd like it fixed. The safe mode message only appears if a file called "running" is present when Foobar2000 starts (the file is deleted only when Foobar2000 is closed properly). So by deleting "running" and then starting Foobar2000, the message doesn't appear. I thought it would be easy enough to enter this as a startup command; however, it doesn't work. The command I am using is rm '/home/henry/.wine/drive_c/users/henry/Application Data/foobar2000/running';'/home/henry/.wine/drive_c/Program Files (x86)/foobar2000/foobar2000.exe' which works fine if I just run it from a terminal: the file is deleted, then Foobar2000 runs. Does anyone know why this isn't working at startup? Also, will this run with a terminal visible? How can I make just the GUI appear? Thanks
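
    One plausible explanation (not verified here) is that the Startup Applications entry is not run through a shell, so the ';' separating the two commands is never interpreted. A minimal wrapper script, using the paths from the question, sidesteps that:

        #!/bin/sh
        # Remove the stale lock file if it exists, then start Foobar2000 through Wine
        rm -f "/home/henry/.wine/drive_c/users/henry/Application Data/foobar2000/running"
        wine "/home/henry/.wine/drive_c/Program Files (x86)/foobar2000/foobar2000.exe"

    Saved as, say, /home/henry/bin/start-foobar2000.sh and made executable with chmod +x, the script can be used as the startup command directly; no terminal window is opened, only Foobar2000's own GUI.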

    Read the article

  • How to associate all file types within Wine with its corresponding native application?

    - by MestreLion
    This is easily done for a single file type, as answered in How to associate a file type within Wine with a native application?, by creating a .reg for the desired filetype. But this is for AVI only. I use some Wine apps (uTorrent, Soulseek, Eudora, to name a few) that can launch a wide range of files. Email attachments, for example, can be JPG, DOC, PDF, PPS... it's impossible (and not desirable) to track down all possible file types that one may receive in an email or download in a torrent. So I need the solution to be more generic and broad. I need the file association to honor whatever native app is currently configured. And I want this to be done for all file types configured in my system. I've already figured out how to make the solution generic. Simply replacing the launched app in .reg for winebrowser, like this: [HKEY_CLASSES_ROOT\.pdf] @="PDFfile" "Content Type"="application/pdf" [HKEY_CLASSES_ROOT\PDFfile\Shell\Open\command] @="C:\\windows\\system32\\winebrowser.exe \"%1\"" I've tested this and it works correctly. Since winebrowser uses xdg-open as a backend, and converts my Windows path to a Unix one, the correct (Linux) app is launched. So I need a "batch" updater to Wine's registry, sort of a wine-update-associations script that I can run whenever a new app is installed. Maybe a tool that can: list all MIME types in my system that have a default, installed app associated; extract all the needed info (glob, MIME type, etc.); and generate the .REG file in the above format. The tricky part is: I've searched a LOT to find info about how association is done in Ubuntu 10.10 onwards, and documentation is scarce and confusing, to say the least. Freedesktop.org has no complete spec, and even Gnome docs are obsolete. So far I've gathered 4 files that contain association info, but I'm clueless on which (or why) to use, or how to use them to generate the .reg file: ~/.local/share/applications/mimeapps.list ~/.local/share/applications/mimeinfo.cache /usr/share/applications/mimeinfo.cache /etc/gnome/defaults.list Any help, script or explanation would be greatly appreciated! Thanks!
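
    As a rough starting point, here is a hypothetical sketch of such a generator. It sidesteps the per-type bookkeeping entirely: because winebrowser hands the file to xdg-open, which looks up the current default application at launch time, it is enough to map every extension listed in /etc/mime.types to winebrowser. The script, the output file name, and the choice of /etc/mime.types as the source are all assumptions, not an existing wine-update-associations tool.

        #!/usr/bin/env python
        # Hypothetical sketch: emit a Wine .reg file that maps every extension listed in
        # /etc/mime.types to winebrowser, which then defers to xdg-open for the real default app.
        HANDLER = r'C:\\windows\\system32\\winebrowser.exe \"%1\"'

        with open('/etc/mime.types') as src, open('wine-associations.reg', 'w') as out:
            out.write('REGEDIT4\n')
            for line in src:
                parts = line.split()
                if len(parts) < 2 or parts[0].startswith('#'):
                    continue  # skip comments and MIME types that have no extensions
                mime = parts[0]
                for ext in parts[1:]:
                    prog = ext.upper() + 'file'
                    out.write('\n[HKEY_CLASSES_ROOT\\.%s]\n@="%s"\n"Content Type"="%s"\n' % (ext, prog, mime))
                    out.write('\n[HKEY_CLASSES_ROOT\\%s\\Shell\\Open\\command]\n@="%s"\n' % (prog, HANDLER))

    The resulting file can then be imported with wine regedit wine-associations.reg, and the script re-run whenever the system's defaults change.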

    Read the article

  • RuntimeError: maximum recursion depth exceeded while calling a Python object

    - by Bilal Basharat
    this error arises when i try to run the following test case which is written in models.py of my django app named 'administration' : from django.test import Client, TestCase from django.core import mail class ClientTest( TestCase ): fixtures = [ 'testdata.json' ] def test_get_register( self ): response = self.client.get( '/accounts/register/', {} ) self.assertEqual( response.status_code, 200 ) the error arises at this line specifically: response = self.client.get( '/accounts/register/', {} ) my django version is 1.2.1 and python 2.6 and satchmo version is 0.9.2-pre hg-unknown. I code in windows platform(xp sp2). The command to run test case is: python manage.py test administration the complete error log is as follow: site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File 
"build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File 
"build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File 
"build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 121, in by_host site = by_host(host=host[4:], id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 124, in by_host site = by_host(host = 'www.%s' % host, id_only=id_only) File "build\bdist.win32\egg\threaded_multihost\sites.py", line 101, in by_host site = Site.objects.get(domain=host) File "C:\django\django\db\models\manager.py", line 132, in get return self.get_query_set().get(*args, **kwargs) File "C:\django\django\db\models\query.py", line 336, in get num = len(clone) File "C:\django\django\db\models\query.py", line 81, in __len__ self._result_cache = list(self.iterator()) File "C:\django\django\db\models\query.py", line 269, in iterator for row in compiler.results_iter(): File "C:\django\django\db\models\sql\compiler.py", line 672, in results_iter for rows in self.execute_sql(MULTI): File "C:\django\django\db\models\sql\compiler.py", line 717, in execute_sql sql, params = self.as_sql() File "C:\django\django\db\models\sql\compiler.py", line 65, in as_sql where, w_params = self.query.where.as_sql(qn=qn, connection=self.connection) File "C:\django\django\db\models\sql\where.py", line 91, in as_sql sql, params = child.as_sql(qn=qn, connection=connection) File "C:\django\django\db\models\sql\where.py", line 94, in as_sql sql, params = self.make_atom(child, qn, connection) File "C:\django\django\db\models\sql\where.py", line 141, in make_atom lvalue, params = lvalue.process(lookup_type, params_or_value, connection) File "C:\django\django\db\models\sql\where.py", line 312, in process connection=connection, prepared=True) File "C:\django\django\db\models\fields\subclassing.py", line 53, in inner return func(*args, **kwargs) File "C:\django\django\db\models\fields\subclassing.py", line 53, in inner return func(*args, **kwargs) File "C:\django\django\db\models\fields\__init__.py", line 323, in get_db_prep _lookup return [self.get_db_prep_value(value, connection=connection, prepared=prepar ed)] File "C:\django\django\db\models\fields\subclassing.py", line 53, in inner return func(*args, **kwargs) File "C:\django\django\db\models\fields\subclassing.py", line 53, in inner return func(*args, **kwargs) RuntimeError: maximum recursion depth exceeded while calling a Python object ---------------------------------------------------------------------- Ran 7 tests in 48.453s FAILED (errors=1) Destroying test database 'default'...

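    The two frames that alternate in the traceback (sites.py lines 121 and 124) suggest that threaded_multihost's by_host() keeps stripping and re-adding a "www." prefix because Site.objects.get(domain=host) never matches the host Django's test client sends, which is "testserver". One hedged workaround, assuming django.contrib.sites is enabled and the fixture doesn't already define that domain, is to create a matching Site before the request is made:

        # Sketch only: give threaded_multihost a Site whose domain matches the test client's
        # host name, so its www-prefix fallback never has to recurse.
        from django.contrib.sites.models import Site
        from django.test import TestCase

        class ClientTest(TestCase):
            fixtures = ['testdata.json']

            def setUp(self):
                Site.objects.get_or_create(domain='testserver', name='testserver')

            def test_get_register(self):
                response = self.client.get('/accounts/register/', {})
                self.assertEqual(response.status_code, 200)
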
    Read the article

  • Program for remove exact duplicate files while caching search results

    - by John Thomas
    We need a Windows 7 program to remove/check the duplicates, but our situation is somewhat different from the standard one for which there are enough programs. We have a fairly large static archive (collection) of photos spread over several disks. Let's call them Disk A..M. We also have some disks (let's call them Disk 1..9) which contain some duplicates of photos that are to be found on disks A..M. We want to add to our collection new disks (N, O, P, and so on) which will contain the photos from disks 1..9, but, of course, we don't want to have any photo two (or more) times. Of course, theoretically, the task can be solved with a regular file duplicate remover, but the time needed would be very long. Ideally, as far as I can see now, the real solution would be a program which scans the disks A..M, stores the file sizes/hashes of the photos in an indexed database/file(s), and then checks the new disks (1..9) against this database. However, I am having a hard time finding such a program (if one exists). Other things to note: we consider that the disks A..M (the collection) don't have any duplicates on them; the file names might have been changed; we aren't interested in the approximate (fuzzy) comparison found in some photo comparison programs, we hunt for exact duplicate files; we aren't afraid of the command line :-); we need it to work on Win7/XP; and we prefer (of course) freeware. TIA for any suggestions, John Th.
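
    In case no ready-made tool turns up, the two-phase workflow described above is small enough to script. The sketch below is only a hypothetical outline (file names and drive letters are examples): it hashes every file under the collection disks into an index file once, then reports any file on a new disk whose content hash is already in the index, regardless of file name.

        #!/usr/bin/env python3
        # Hypothetical sketch: index the collection once, then check new disks against the index.
        import hashlib, json, os, sys

        def file_hash(path, chunk=1 << 20):
            h = hashlib.sha256()
            with open(path, 'rb') as f:
                for block in iter(lambda: f.read(chunk), b''):
                    h.update(block)
            return h.hexdigest()

        def build_index(roots, index_file):
            index = {}
            for root in roots:
                for dirpath, _, names in os.walk(root):
                    for name in names:
                        path = os.path.join(dirpath, name)
                        index[file_hash(path)] = path
            with open(index_file, 'w') as f:
                json.dump(index, f)

        def check_disk(root, index_file):
            with open(index_file) as f:
                index = json.load(f)
            for dirpath, _, names in os.walk(root):
                for name in names:
                    path = os.path.join(dirpath, name)
                    digest = file_hash(path)
                    if digest in index:
                        print('duplicate:', path, '==', index[digest])

        if __name__ == '__main__':
            # dupcheck.py index D:\ E:\   -- build photo_index.json from the collection disks
            # dupcheck.py check F:\       -- report files already present in the collection
            if sys.argv[1] == 'index':
                build_index(sys.argv[2:], 'photo_index.json')
            else:
                check_disk(sys.argv[2], 'photo_index.json')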

    Read the article

  • Create and Share a File (Also a mysterious error)

    - by Kirk
    My goal is to create an XML file and then send it through the share Intent. I'm able to create an XML file using this code FileOutputStream outputStream = context.openFileOutput(fileName, Context.MODE_WORLD_READABLE); PrintStream printStream = new PrintStream(outputStream); String xml = this.writeXml(); // get XML here printStream.println(xml); printStream.close(); I'm stuck trying to retrieve a Uri to the output file in order to share it. I first tried to access the file by converting the file to a Uri File outFile = context.getFileStreamPath(fileName); return Uri.fromFile(outFile); This returns file:///data/data/com.my.package/files/myfile.xml but I can't seem to attach this to an email, upload, etc. If I manually check the file length, it's proper and shows there is a reasonable file size. Next I created a content provider and tried to reference the file, but it isn't a valid handle to the file. The ContentProvider doesn't ever seem to be called at any point. Uri uri = Uri.parse("content://" + CachedFileProvider.AUTHORITY + "/" + fileName); return uri; This returns content://com.my.package.provider/myfile.xml but I check the file and it's zero length. How do I access files properly? Do I need to create the file with the content provider? If so, how? Update Here is the code I'm using to share. If I select Gmail, it does show as an attachment, but when I send, it gives an error Couldn't show attachment and the email that arrives has no attachment. public void onClick(View view) { Log.d(TAG, "onClick " + view.getId()); switch (view.getId()) { case R.id.share_cancel: setResult(RESULT_CANCELED, getIntent()); finish(); break; case R.id.share_share: MyXml xml = new MyXml(); Uri uri; try { uri = xml.writeXmlToFile(getApplicationContext(), "myfile.xml"); //uri is "file:///data/data/com.my.package/files/myfile.xml" Log.d(TAG, "Share URI: " + uri.toString() + " path: " + uri.getPath()); File f = new File(uri.getPath()); Log.d(TAG, "File length: " + f.length()); // shows a valid file size Intent shareIntent = new Intent(); shareIntent.setAction(Intent.ACTION_SEND); shareIntent.putExtra(Intent.EXTRA_STREAM, uri); shareIntent.setType("text/plain"); startActivity(Intent.createChooser(shareIntent, "Share")); } catch (FileNotFoundException e) { e.printStackTrace(); } break; } } I noticed that there is an Exception thrown here from inside createChooser(...), but I can't figure out why it's thrown. E/ActivityThread(572): Activity com.android.internal.app.ChooserActivity has leaked IntentReceiver com.android.internal.app.ResolverActivity$1@4148d658 that was originally registered here. Are you missing a call to unregisterReceiver()? I've researched this error and can't find anything obvious. Both of these links suggest that I need to unregister a receiver. Chooser Activity Leak - Android Why does Intent.createChooser() need a BroadcastReceiver and how to implement? I have a receiver set up, but it's for an AlarmManager that is set elsewhere and doesn't require the app to register / unregister. Code for openFile(...) In case it's needed, here is the content provider I've created. public ParcelFileDescriptor openFile(Uri uri, String mode) throws FileNotFoundException { String fileLocation = getContext().getCacheDir() + "/" + uri.getLastPathSegment(); ParcelFileDescriptor pfd = ParcelFileDescriptor.open(new File(fileLocation), ParcelFileDescriptor.MODE_READ_ONLY); return pfd; }
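
    One detail worth noting (offered as a guess, not a confirmed diagnosis): the XML is written with context.openFileOutput(), which places it in the app's files directory, while the provider's openFile() resolves the same name under getCacheDir(), so the provider ends up serving a different, empty file. A hedged adjustment to the openFile() shown above would be to read from the same location the file was written to:

        // Sketch only: serve the file from the files directory that openFileOutput() used,
        // rather than from the cache directory, so the resolved path actually has content.
        @Override
        public ParcelFileDescriptor openFile(Uri uri, String mode) throws FileNotFoundException {
            File requested = new File(getContext().getFilesDir(), uri.getLastPathSegment());
            return ParcelFileDescriptor.open(requested, ParcelFileDescriptor.MODE_READ_ONLY);
        }

    The share Intent would then carry the content:// Uri (not the file:// one) in EXTRA_STREAM, and arguably a type of "text/xml" rather than "text/plain".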

    Read the article

  • SQL SERVER – Import CSV into Database – Transferring File Content into a Database Table using CSVexpress

    - by pinaldave
    One of the most common data integration tasks I run into is a desire to move data from a file into a database table.  Generally the user is familiar with his data, the structure of the file, and the database table, but is unfamiliar with data integration tools and therefore views this task as something that is difficult.  What these users really need is a point and click approach that minimizes the learning curve for the data integration tool.  This is what CSVexpress (www.CSVexpress.com) is all about!  It is based on expressor Studio, a data integration tool I’ve been reviewing over the last several months. With CSVexpress, moving data between data sources can be as simple as providing the database connection details, describing the structure of the incoming and outgoing data and then connecting two pre-programmed operators.   There’s no need to learn the intricacies of the data integration tool or to write code.  Let’s look at an example. Suppose I have a comma separated value data file with data similar to the following, which is a listing of terminated employees that includes their hiring and termination date, department, job description, and final salary. EMP_ID,STRT_DATE,END_DATE,JOB_ID,DEPT_ID,SALARY 102,13-JAN-93,24-JUL-98 17:00,Programmer,60,"$85,000" 101,21-SEP-89,27-OCT-93 17:00,Account Representative,110,"$65,000" 103,28-OCT-93,15-MAR-97 17:00,Account Manager,110,"$75,000" 304,17-FEB-96,19-DEC-99 17:00,Marketing,20,"$45,000" 333,24-MAR-98,31-DEC-99 17:00,Data Entry Clerk,50,"$35,000" 100,17-SEP-87,17-JUN-93 17:00,Administrative Assistant,90,"$40,000" 334,24-MAR-98,31-DEC-98 17:00,Sales Representative,80,"$40,000" 400,01-JAN-99,31-DEC-99 17:00,Sales Manager,80,"$55,000" Notice the concise format used for the date values, the fact that the termination date includes both date and time information, and that the salary is clearly identified as money by the dollar sign and digit grouping.  In moving this data to a database table I want to express the dates using a format that includes the century since it’s obvious that this listing could include employees who left the company in both the 20th and 21st centuries, and I want the salary to be stored as a decimal value without the currency symbol and grouping character.  Most data integration tools would require coding within a transformation operation to effect these changes, but not expressor Studio.  Directives for these modifications are included in the description of the incoming data. Besides starting the expressor Studio tool and opening a project, the first step is to create connection artifacts, which describe to expressor where data is stored.  For this example, two connection artifacts are required: a file connection, which encapsulates the file system location of my file; and a database connection, which encapsulates the database connection information.  With expressor Studio, I use wizards to create these artifacts. First click New Connection > File Connection in the Home tab of expressor Studio’s ribbon bar, which starts the File Connection wizard.  In the first window, I enter the path to the directory that contains the input file.  Note that the file connection artifact only specifies the file system location, not the name of the file. Then I click Next and enter a meaningful name for this connection artifact; clicking Finish closes the wizard and saves the artifact. 
To create the Database Connection artifact, I must know the location of, or instance name, of the target database and have the credentials of an account with sufficient privileges to write to the target table.  To use expressor Studio’s features to the fullest, this account should also have the authority to create a table. I click the New Connection > Database Connection in the Home tab of expressor Studio’s ribbon bar, which starts the Database Connection wizard.  expressor Studio includes high-performance drivers for many relational database management systems, so I can simply make a selection from the “Supplied database drivers” drop down control.  If my desired RDBMS isn’t listed, I can optionally use an existing ODBC DSN by selecting the “Existing DSN” radio button. In the following window, I enter the connection details.  With Microsoft SQL Server, I may choose to use Windows Authentication rather than rather than account credentials.  After clicking Next, I enter a meaningful name for this connection artifact and clicking Finish closes the wizard and saves the artifact. Now I create a schema artifact, which describes the structure of the file data.  When expressor reads a file, all data fields are typed as strings.  In some use cases this may be exactly what is needed and there is no need to edit the schema artifact.  But in this example, editing the schema artifact will be used to specify how the data should be transformed; that is, reformat the dates to include century designations, change the employee and job ID’s to integers, and convert the salary to a decimal value. Again a wizard is used to create the schema artifact.  I click New Schema > Delimited Schema in the Home tab of expressor Studio’s ribbon bar, which starts the Database Connection wizard.  In the first window, I click Get Data from File, which then displays a listing of the file connections in the project.  When I click on the file connection I previously created, a browse window opens to this file system location; I then select the file and click Open, which imports 10 lines from the file into the wizard. I now view the file’s content and confirm that the appropriate delimiter characters are selected in the “Field Delimiter” and “Record Delimiter” drop down controls; then I click Next. Since the input file includes a header row, I can easily indicate that fields in the file should be identified through the corresponding header value by clicking “Set All Names from Selected Row. “ Alternatively, I could enter a different identifier into the Field Details > Name text box.  I click Next and enter a meaningful name for this schema artifact; clicking Finish closes the wizard and saves the artifact. Now I open the schema artifact in the schema editor.  When I first view the schema’s content, I note that the types of all attributes in the Semantic Type (the right-hand panel) are strings and that the attribute names are the same as the field names in the data file.  To change an attribute’s name and type, I highlight the attribute and click Edit in the Attributes grouping on the Schema > Edit tab of the editor’s ribbon bar.  This opens the Edit Attribute window; I can change the attribute name and select the desired type from the “Data type” drop down control.  In this example, I change the name of each attribute to the name of the corresponding database table column (EmployeeID, StartingDate, TerminationDate, JobDescription, DepartmentID, and FinalSalary).  
Then for the EmployeeID and DepartmentID attributes, I select Integer as the data type, for the StartingDate and TerminationDate attributes, I select Datetime as the data type, and for the FinalSalary attribute, I select the Decimal type. But I can do much more in the schema editor.  For the datetime attributes, I can set a constraint that ensures that the data adheres to some predetermined specifications; a starting date must be later than January 1, 1980 (the date on which the company began operations) and a termination date must be earlier than 11:59 PM on December 31, 1999.  I simply select the appropriate constraint and enter the value (1980-01-01 00:00 as the starting date and 1999-12-31 11:59 as the termination date). As a last step in setting up these datetime conversions, I edit the mapping, describing the format of each datetime type in the source file. I highlight the mapping line for the StartingDate attribute and click Edit Mapping in the Mappings grouping on the Schema > Edit tab of the editor’s ribbon bar.  This opens the Edit Mapping window in which I either enter, or select, a format that describes how the datetime values are represented in the file.  Note the use of Y01 as the syntax for the year.  This syntax is the indicator to expressor Studio to derive the century by setting any year later than 01 to the 20th century and any year before 01 to the 21st century.  As each datetime value is read from the file, the year values are transformed into century and year values. For the TerminationDate attribute, my format also indicates that the datetime value includes hours and minutes. And now to the Salary attribute. I open its mapping and in the Edit Mapping window select the Currency tab and the “Use currency” check box.  This indicates that the file data will include the dollar sign (or in Europe the Pound or Euro sign), which should be removed. And on the Grouping tab, I select the “Use grouping” checkbox and enter 3 into the “Group size” text box, a comma into the “Grouping character” text box, and a decimal point into the “Decimal separator” character text box. These entries allow the string to be properly converted into a decimal value. By making these entries into the schema that describes my input file, I’ve specified how I want the data transformed prior to writing to the database table and completely removed the requirement for coding within the data integration application itself. Assembling the data integration application is simple.  Onto the canvas I drag the Read File and Write Table operators, connecting the output of the Read File operator to the input of the Write Table operator. Next, I select the Read File operator and its Properties panel opens on the right-hand side of expressor Studio.  For each property, I can select an appropriate entry from the corresponding drop down control.  Clicking on the button to the right of the “File name” text box opens the file system location specified in the file connection artifact, allowing me to select the appropriate input file.  I indicate also that the first row in the file, the header row, should be skipped, and that any record that fails one of the datetime constraints should be skipped. I then select the Write Table operator and in its Properties panel specify the database connection, normal for the “Mode,” and the “Truncate” and “Create Missing Table” options.  
If my target table does not yet exist, expressor will create the table using the information encapsulated in the schema artifact assigned to the operator. The last task needed to complete the application is to create the schema artifact used by the Write Table operator.  This is extremely easy as another wizard is capable of using the schema artifact assigned to the Read Table operator to create a schema artifact for the Write Table operator.  In the Write Table Properties panel, I click the drop down control to the right of the “Schema” property and select “New Table Schema from Upstream Output…” from the drop down menu. The wizard first displays the table description and in its second screen asks me to select the database connection artifact that specifies the RDBMS in which the target table will exist.  The wizard then connects to the RDBMS and retrieves a list of database schemas from which I make a selection.  The fourth screen gives me the opportunity to fine tune the table’s description.  In this example, I set the width of the JobDescription column to a maximum of 40 characters and select money as the type of the LastSalary column.  I also provide the name for the table. This completes development of the application.  The entire application was created through the use of wizards and the required data transformations specified through simple constraints and specifications rather than through coding.  To develop this application, I only needed a basic understanding of expressor Studio, a level of expertise that can be gained by working through a few introductory tutorials.  expressor Studio is as close to a point and click data integration tool as one could want and I urge you to try this product if you have a need to move data between files or from files to database tables. Check out CSVexpress in more detail.  It offers a few basic video tutorials and a preview of expressor Studio 3.5, which will support the reading and writing of data into Salesforce.com. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology
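
    For readers who want to see what those schema mappings amount to in plain code, here is a small illustrative sketch, written independently of expressor Studio (the input file name is a placeholder): it applies the same row-level transformations, resolving two-digit years around the 2001 cutoff described above and turning a value such as "$85,000" into a plain decimal.

        # Illustrative sketch only; not part of expressor Studio or CSVexpress.
        import csv
        from datetime import datetime
        from decimal import Decimal

        def fix_century(text):
            # "13-JAN-93" -> 1993-01-13; "24-JUL-98 17:00" keeps its time component.
            fmt = '%d-%b-%y %H:%M' if ':' in text else '%d-%b-%y'
            d = datetime.strptime(text, fmt)
            # Years that parse as later than 2001 belong to the 20th century per the rule above.
            return d.replace(year=d.year - 100) if d.year > 2001 else d

        def money(text):
            # "$85,000" -> Decimal('85000'): strip the currency symbol and grouping commas.
            return Decimal(text.replace('$', '').replace(',', ''))

        with open('terminated_employees.csv') as f:
            for row in csv.DictReader(f):
                print(int(row['EMP_ID']), fix_century(row['STRT_DATE']), fix_century(row['END_DATE']),
                      row['JOB_ID'], int(row['DEPT_ID']), money(row['SALARY']))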

    Read the article

  • confusion about installing/using git; how to undo

    - by dan
    I'm very new to Ubuntu so I'm sure this is a dumb question. I wanted to install some source code that was on git. I don't really know what that means; I've never used git before, but I figured it was time to learn, so I first installed git. Next I tried to clone the git repository of the software I want to install. I got a message saying "the authenticity of IP:IP:IP:IP can't be established". I went ahead and ended up with another message warning that such and such would be added to known hosts. I went ahead and it said something about hanging up on the connection. After searching the internet for a while I realized I didn't need git to install the software, but now I have it installed and have added some host to some file or another. I'm concerned I've created some security issues I need to fix. I know this is stupid, but can anyone help me undo what I've done, or better understand what I've done? Did adding a git project open up my system? Beyond that, can anyone tell me how git works? Everything I've found assumes I know stuff that I don't yet. Thanks. Dan
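
    Nothing harmful was installed by any of this. Cloning over SSH simply records the server's public host key in ~/.ssh/known_hosts, and the failed clone left at most an incomplete directory. A hedged sketch of how to undo both steps (the host name is a placeholder for whatever address you connected to):

        # Remove the recorded host key for the server you connected to
        ssh-keygen -R example.com

        # Remove git itself, if you decide you don't want it after all
        sudo apt-get remove --purge git

    If the clone created a directory before it failed, deleting that directory removes the last trace.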

    Read the article

  • How do I compare the md5sum of a file with the md5 file (that was available to download with the file)?

    - by user91583
    Images are available for a distro on http://livedistro.org/gnulinux/israel-remix-team-mint-12. I want to use the 32-bit version. I have downloaded the ISO file for the 32-bit version (customdist.iso). I have downloaded the md5 file for the ISO file (customdist.iso.md5). I want to calculate the md5sum of the ISO file and compare it to the md5 file. I can use the md5sum command to display within the terminal the calculated md5 for the ISO file. I have searched the web and can't find a way to compare the calculated md5 for the ISO file with the downloaded md5 file. So far, the closest I have come is the command md5sum -c customdist.iso.md5 from within the folder containing both the files, but this command gives the result: md5sum: customdist.iso.md5: no properly formatted MD5 checksum lines found Any ideas?
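
    A hedged sketch of two common ways to handle this, assuming both files are in the current directory: if customdist.iso.md5 contains only the bare hash, build a line in the "hash, two spaces, filename" layout that md5sum -c expects and feed it in on stdin; otherwise just print both values and compare them.

        # Reformat a bare-hash .md5 file into the layout md5sum -c understands
        echo "$(cat customdist.iso.md5)  customdist.iso" | md5sum -c -

        # Or simply show both values side by side and compare them by eye (or with diff)
        md5sum customdist.iso
        cat customdist.iso.md5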

    Read the article

  • Utility or technique for swapping files quickly in Windows

    - by foraidt
    I frequently need to swap one file with another, without overwriting the original. Let's say there are two files, foo_new.dll and foo.dll. I usually rename them the following way: foo.dll -> foo_old.dll, foo_new.dll -> foo.dll, [do something with replaced file], foo.dll -> foo_new.dll, foo_old.dll -> foo.dll. This is OK for swapping a single file, but it becomes tedious when swapping multiple files at once. Is there a Windows (7 and preferably XP) utility or a technique that simplifies this task and works well when swapping multiple files? I'd prefer to be able to use it from within FreeCommander, but Windows Explorer would be OK, too.
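
    If a small script is acceptable, the swap itself is just three renames through a temporary name. The batch sketch below is a minimal outline (it assumes both files are in the current directory and that the temporary name is free); run it once to swap and once more to swap back.

        @echo off
        rem usage: swap.cmd foo.dll foo_new.dll  -- exchanges the two files' names
        ren "%~2" "%~2.swaptmp"
        ren "%~1" "%~nx2"
        ren "%~2.swaptmp" "%~nx1"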

    Read the article

  • elffile: ELF Specific File Identification Utility

    - by user9154181
    Solaris 11 has a new standard user level command, /usr/bin/elffile. elffile is a variant of the file utility that is focused exclusively on linker related files: ELF objects, archives, and runtime linker configuration files. All other files are simply identified as "non-ELF". The primary advantage of elffile over the existing file utility is in the area of archives — elffile examines the archive members and can produce a summary of the contents, or per-member details. The impetus to add elffile to Solaris came from the effort to extend the format of Solaris archives so that they could grow beyond their previous 32-bit file limits. That work introduced a new archive symbol table format. Now that there was more than one possible format, I thought it would be useful if the file utility could identify which format a given archive is using, leading me to extend the file utility: % cc -c ~/hello.c % ar r foo.a hello.o % file foo.a foo.a: current ar archive, 32-bit symbol table % ar r -S foo.a hello.o % file foo.a foo.a: current ar archive, 64-bit symbol table In turn, this caused me to think about all the things that I would like the file utility to be able to tell me about an archive. In particular, I'd like to be able to know what's inside without having to unpack it. The end result of that train of thought was elffile. Much of the discussion in this article is adapted from the PSARC case I filed for elffile in December 2010: PSARC 2010/432 elffile Why file Is No Good For Archives And Yet Should Not Be Fixed The standard /usr/bin/file utility is not very useful when applied to archives. When identifying an archive, a user typically wants to know 2 things: Is this an archive? Presupposing that the archive contains objects, which is by far the most common use for archives, what platform are the objects for? Are they for sparc or x86? 32 or 64-bit? Some confusing combination from varying platforms? The file utility provides a quick answer to question (1), as it identifies all archives as "current ar archive". It does nothing to answer the more interesting question (2). To answer that question, requires a multi-step process: Extract all archive members Use the file utility on the extracted files, examine the output for each file in turn, and compare the results to generate a suitable summary description. Remove the extracted files It should be easier and more efficient to answer such an obvious question. It would be reasonable to extend the file utility to examine archive contents in place and produce a description. However, there are several reasons why I decided not to do so: The correct design for this feature within the file utility would have file examine each archive member in turn, applying its full abilities to each member. This would be elegant, but also represents a rather dramatic redesign and re-implementation of file. Archives nearly always contain nothing but ELF objects for a single platform, so such generality in the file utility would be of little practical benefit. It is best to avoid adding new options to standard utilities for which other implementations of interest exist. In the case of the file utility, one concern is that we might add an option which later appears in the GNU version of file with a different and incompatible meaning. Indeed, there have been discussions about replacing the Solaris file with the GNU version in the past. This may or may not be desirable, and may or may not ever happen. Either way, I don't want to preclude it. 
Examining archive members is an O(n) operation, and can be relatively slow with large archives. The file utility is supposed to be a very fast operation. I decided that extending file in this way is overkill, and that an investment in the file utility for better archive support would not be worth the cost. A solution that is more narrowly focused on ELF and other linker related files is really all that we need. The necessary code for doing this already exists within libelf. All that is missing is a small user-level wrapper to make that functionality available at the command line. In that vein, I considered adding an option for this to the elfdump utility. I examined elfdump carefully, and even wrote a prototype implementation. The added code is small and simple, but the conceptual fit with the rest of elfdump is poor. The result complicates elfdump syntax and documentation, definite signs that this functionality does not belong there. And so, I added this functionality as a new user level command. The elffile Command The syntax for this new command is elffile [-s basic | detail | summary] filename... Please see the elffile(1) manpage for additional details. To demonstrate how output from elffile looks, I will use the following files: FileDescription configA runtime linker configuration file produced with crle dwarf.oAn ELF object /etc/passwdA text file mixed.aArchive containing a mixture of ELF and non-ELF members mixed_elf.aArchive containing ELF objects for different machines not_elf.aArchive containing no ELF objects same_elf.aArchive containing a collection of ELF objects for the same machine. This is the most common type of archive. The file utility identifies these files as follows: % file config dwarf.o /etc/passwd mixed.a mixed_elf.a not_elf.a same_elf.a config: Runtime Linking Configuration 64-bit MSB SPARCV9 dwarf.o: ELF 64-bit LSB relocatable AMD64 Version 1 /etc/passwd: ascii text mixed.a: current ar archive, 32-bit symbol table mixed_elf.a: current ar archive, 32-bit symbol table not_elf.a: current ar archive same_elf.a: current ar archive, 32-bit symbol table By default, elffile uses its "summary" output style. This output differs from the output from the file utility in 2 significant ways: Files that are not an ELF object, archive, or runtime linker configuration file are identified as "non-ELF", whereas the file utility attempts further identification for such files. When applied to an archive, the elffile output includes a description of the archive's contents, without requiring member extraction or other additional steps. Applying elffile to the above files: % elffile config dwarf.o /etc/passwd mixed.a mixed_elf.a not_elf.a same_elf.a config: Runtime Linking Configuration 64-bit MSB SPARCV9 dwarf.o: ELF 64-bit LSB relocatable AMD64 Version 1 /etc/passwd: non-ELF mixed.a: current ar archive, 32-bit symbol table, mixed ELF and non-ELF content mixed_elf.a: current ar archive, 32-bit symbol table, mixed ELF content not_elf.a: current ar archive, non-ELF content same_elf.a: current ar archive, 32-bit symbol table, ELF 64-bit LSB relocatable AMD64 Version 1 The output for same_elf.a is of particular interest: The vast majority of archives contain only ELF objects for a single platform, and in this case, the default output from elffile answers both of the questions about archives posed at the beginning of this discussion, in a single efficient step. This makes elffile considerably more useful than file, within the realm of linker-related files. 
elffile can produce output in two other styles, "basic", and "detail". The basic style produces output that is the same as that from 'file', for linker-related files. The detail style produces per-member identification of archive contents. This can be useful when the archive contents are not homogeneous ELF object, and more information is desired than the summary output provides: % elffile -s detail mixed.a mixed.a: current ar archive, 32-bit symbol table mixed.a(dwarf.o): ELF 32-bit LSB relocatable 80386 Version 1 mixed.a(main.c): non-ELF content mixed.a(main.o): ELF 64-bit LSB relocatable AMD64 Version 1 [SSE]
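
    Because elffile writes one line per file, its summary output also composes well with the usual text tools. A hypothetical one-liner (paths are placeholders) that finds archives whose members already use the new 64-bit symbol table format:

        # List archives under /usr/lib that carry a 64-bit symbol table
        find /usr/lib -name '*.a' -exec elffile {} \; | grep ', 64-bit symbol table'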

    Read the article

< Previous Page | 12 13 14 15 16 17 18 19 20 21 22 23  | Next Page >