Search Results

Search found 10472 results on 419 pages for 'david hope ross'.

  • Java Oracle Installation /usr/bin/java: cannot execute binary file

    - by Dave
    Hi, I've been trying for a day to get Oracle Java running on Ubuntu. I have a PowerMac G5 with Ubuntu 12.04 ppc64.

    uname -a:

        Linux LK37 3.2.0-53-powerpc64-smp #81-Ubuntu SMP Thu Aug 22 21:17:14 UTC 2013 ppc64 ppc64 ppc64 GNU/Linux

    lspci:

        david@LK37:~$ sudo lspc
        [sudo] password for david:
        sudo: lspc: command not found
        david@LK37:~$ sudo lspci
        0000:00:0b.0 PCI bridge: Apple Inc. CPC945 PCIe Bridge
        0000:0a:00.0 VGA compatible controller: NVIDIA Corporation NV43 [GeForce 6600 LE] (rev a2)
        0001:00:00.0 Host bridge: Apple Inc. U4 HT Bridge
        0001:00:01.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-X bridge (rev a3)
        0001:00:02.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-X bridge (rev a3)
        0001:00:03.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-Express Bridge (rev a3)
        0001:00:04.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-Express Bridge (rev a3)
        0001:00:05.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-Express Bridge (rev a3)
        0001:00:06.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-Express Bridge (rev a3)
        0001:00:07.0 PCI bridge: Apple Inc. Shasta PCI Bridge
        0001:00:08.0 PCI bridge: Apple Inc. Shasta PCI Bridge
        0001:00:09.0 PCI bridge: Apple Inc. Shasta PCI Bridge
        0001:01:07.0 Unassigned class [ff00]: Apple Inc. Shasta Mac I/O
        0001:01:0b.0 USB controller: NEC Corporation OHCI USB Controller (rev 43)
        0001:01:0b.1 USB controller: NEC Corporation OHCI USB Controller (rev 43)
        0001:01:0b.2 USB controller: NEC Corporation uPD72010x USB 2.0 Controller (rev 04)
        0001:03:0c.0 IDE interface: Broadcom K2 SATA
        0001:03:0d.0 Unassigned class [ff00]: Apple Inc. Shasta IDE
        0001:03:0e.0 FireWire (IEEE 1394): Apple Inc. Shasta Firewire
        0001:05:04.0 Ethernet controller: Broadcom Corporation NetXtreme BCM5780 Gigabit Ethernet (rev 03)
        0001:05:04.1 Ethernet controller: Broadcom Corporation NetXtreme BCM5780 Gigabit Ethernet (rev 03)
        david@LK37:~$

    I have tried various ways to install Oracle Java, but I always end up with:

        bash: /usr/bin/java: cannot execute binary file

    At the moment I have installed jdk-7u25-linux-x64.tar.gz in /usr/lib/jvm/jdk1.7.0/bin/ as described in this post. I already tried the web install but I get a 404 error. I hope you can help me. I started using Ubuntu yesterday, so please give me the complete terminal commands; it will be a lot easier for me. For those who care: I want to play Minecraft, and with OpenJDK I got a java.lang error. That's why I want to install Oracle Java.
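
    The "cannot execute binary file" message usually points to an architecture mismatch: the file name above (linux-x64) is an x86-64 build, while the machine reports ppc64. A minimal sketch for confirming this, assuming the JDK really was unpacked to the path above:

        uname -m                              # reports the machine architecture (ppc64 here)
        file /usr/lib/jvm/jdk1.7.0/bin/java   # shows which architecture the binary was built for
        # As far as I know Oracle does not ship a ppc64 Linux JDK, so a PowerPC build
        # of OpenJDK (or IBM's Java) is the usual fallback:
        sudo apt-get install openjdk-7-jre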

    Read the article

  • How to capture footage from an analog TV?

    - by hope
    I have only an old analog TV with just RCA connectors and a coaxial cable in/out; it has no HDMI interface or the like. I would like to know the cheapest method by which the current video footage can be captured, converted into a digital video file, and then copied or transferred via USB. Are there any devices that can do this with an analog TV? Basically, how can I get something along these lines done, and cheaply? I do NOT want to stream; I want footage recorded off the TV and stored as a digitally encoded video file.

    Read the article

  • Article Sharing – Windows Azure Memcached Plugin

    - by Shaun
    I just found that David Aiken, a Windows Azure developer and evangelist, wrote a cool article about how to use Memcached in Windows Azure through the new Azure Plugin feature: http://www.davidaiken.com/2011/01/11/windows-azure-memcached-plugin/ I think the best solution for a distributed cache in Azure would be Windows Azure AppFabric Caching, but since it's only in CTP and available only in the US data center, David's solution would be the best for now. The only thing I'm concerned about is the stability of the Windows version of Memcached.

    Read the article

  • Java EE @ Developer Day Poland

    - by reza_rahman
    Oracle Poland held a Developer Day in Warsaw on November 28. The event was a great success with 100+ attendees thanks to great speakers like Simon Ritter and David Delabassee. David led a lab on JAX-RS, HTML 5 Server-Sent Events and WebSocket using GlassFish (this is the same hands-on lab presented at JavaOne). The lab went extremely well with a full-house, enthusiastic crowd. Read more details here!

    Read the article

  • Ubuntu Server 12.04 Unable to locate package dhcp3-server

    - by Alex Hope O'Connor
    I have done a fresh install of Ubuntu Server version 12.04 and am trying to install a DHCP server package using the following command:

        sudo apt-get install dhcp3-server

    However, I get this:

        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        E: Unable to locate package dhcp3-server

    Can someone tell me how to install this package on my server? Thanks, Alex.
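
    For context, a hedged note: on 12.04 the ISC DHCP server is packaged as isc-dhcp-server (dhcp3-server was the package name used by earlier releases), so the usual equivalent, assuming the package lists are current, is:

        sudo apt-get update
        sudo apt-get install isc-dhcp-server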

    Read the article

  • What is the best way to achieve an RPO of zero and the lowest possible RTO (less than 15 minutes) with SQL 2008 R2?

    - by Adrian Hope-Bailie
    We are running a payments (EFT transaction processing) application which processes high volumes of transactions 24/7, and we are currently investigating a better way of replicating the DB to our disaster recovery site. Our current and previous strategies have included using both DoubleTake and Redgate to replicate data to a warm stand-by. DoubleTake is the supported solution from the payments software vendor; however, their (DoubleTake's) support in South Africa is very poor. We had a few issues and simply couldn't ever resolve them, so we had to give up on DoubleTake. We have been using Redgate to manually read the data from the primary site (via queries) and write to the DR site, but this is:

    - A bad solution
    - Getting the software vendor hot and bothered whenever we have support issues, as it has a tendency to interfere with the payment application, which is very DB intensive.

    We recently upgraded the whole system to run on SQL 2008 R2 Enterprise, which means we should probably be looking at using some of the built-in replication features. The server has 2 fairly large databases with a mixture of tables containing highly volatile transactional data and pretty static configuration data. Replication would be done over a WAN link to a separate physical site and needs to achieve the following objectives:

    - RPO: Zero loss. This is transactional data with financial impact, so we can't lose anything.
    - RTO: Tending to zero. The business depends on our ability to process transactions; every minute we are down, we are losing money.

    I have looked at a few of the other questions/answers, but none meet our case exactly:

    - SQL Server 2008 failover strategy - Log shipping or replication?
    - How to achieve the following RTO & RPO with logshipping only using SQL Server?
    - What is the best of two approaches to achieve DB Replication?

    My current thinking is that we should use mirroring, but I am concerned that for RPO: 0 we will need to do delayed commits, and this could impact the performance of the primary DB, which is not an option. Our current DR process is to:

    1. Stop incoming traffic to the primary site and allow all in-flight transactions to complete.
    2. Allow the replication to DR to complete.
    3. Change network routing to route to the DR site.
    4. Start all applications and services on the secondary site (ideally we can change this to a warmer stand-by whereby the applications are already running but not processing any transactions).

    In other words, the DR database needs to catch up with the primary as quickly as possible and be ready for processing as the new primary. We would then need to be able to reverse this when we are ready to switch back. Is there a better option than mirroring (should we be doing log-shipping too), and can anyone suggest other considerations that we should keep in mind?
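
    For reference, synchronous ("high-safety") database mirroring with a witness is the built-in 2008 R2 feature aimed at exactly this combination: zero data loss on commit plus automatic failover, at the price of commit latency on the principal. A minimal sketch, assuming mirroring endpoints already exist on both instances and using placeholder database and server names:

        -- on the mirror (database restored WITH NORECOVERY): point it at the principal
        ALTER DATABASE Payments SET PARTNER = 'TCP://primary-server.example.com:5022';
        -- on the principal: complete the pairing, add a witness, keep commits synchronous
        ALTER DATABASE Payments SET PARTNER = 'TCP://dr-server.example.com:5022';
        ALTER DATABASE Payments SET WITNESS = 'TCP://witness-server.example.com:5022';
        ALTER DATABASE Payments SET PARTNER SAFETY FULL;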

    Read the article

  • Dynamic insert with PHP/MySQL and performance

    - by Ross
    I have a folder/array of images; there may be 1, up to a maximum of 12. What I need to do is add them dynamically so the images end up in an images table. At the moment I have:

        $directory = "portfolio_images/$id/Thumbs/";
        $images = glob("" . $directory . "*.jpg");
        for ($i = 0; $i < count($images); $i += 1) {
            mysql_query("INSERT INTO project_images (image_name, project_id) VALUES ('$images[$i]', '$id')")
                or die(mysql_error());
        }

    This works, but it does not feel right. How is this for performance? Is there a better way? The maximum number of images is only ever going to be 12. Thanks, Ross
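
    A sketch of one alternative, using mysqli prepared statements so the statement is parsed once and the values are bound rather than interpolated; $db is an assumed mysqli connection, not something from the original snippet:

        $stmt = $db->prepare("INSERT INTO project_images (image_name, project_id) VALUES (?, ?)");
        $stmt->bind_param('si', $image, $id);            // binds $image and $id by reference
        foreach (glob($directory . '*.jpg') as $image) {
            $stmt->execute();                            // one execution per image, at most 12
        }
        $stmt->close();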

    Read the article

  • How do you get the credentials (NetworkCredential) of the currently logged-in user?

    - by Ross
    Hi, I'm writing some code to utilise a 3rd party component, and I need to supply an object which implements ICredentials when I start to use it. If I write the following...

        var credential = new NetworkCredential("MyUsername", "MyPassword");

    ...and pass "credential", it's fine. But I would like to pass the credentials of the current user (it's a Windows service, so it runs as a specified user). I have tried both of the following, but neither appears to work (or return anything):

        NetworkCredential credential = System.Net.CredentialCache.DefaultCredentials;
        NetworkCredential credential = CredentialCache.DefaultNetworkCredentials;

    Can anyone suggest how to acquire an appropriate object which represents the credentials of the username that the service is running under? Thanks, Ross
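
    One likely explanation, offered as a sketch rather than a confirmed answer: DefaultCredentials is typed as ICredentials (so the first of those two lines will not compile as written), and DefaultNetworkCredentials does represent the account the service runs as, but its UserName and Password properties are intentionally empty, so inspecting them looks like "nothing" even though integrated NTLM/Kerberos authentication still works when the object is passed along.

        using System.Net;

        // Represents the service account's own identity; the secret is used by the
        // authentication stack but never exposed as plain text.
        ICredentials credential = CredentialCache.DefaultNetworkCredentials;
        // thirdPartyComponent.Initialize(credential);   // hypothetical call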

    Read the article

  • How to extract specific variables from a string?

    - by David
    Hi, let's say I have the following:

        $vars = "name=david&age=26&sport=soccer&birth=1984";

    I want to turn this into real PHP variables, but not all of them. For example, the functions that I need:

        $thename = getvar($vars, "name");
        $theage  = getvar($vars, "age");
        $newvars = cleanup($vars, "name,age");
        // Output: $vars = "name=david&age=26"

    How can I get only the variables that I need? And how do I clean up $vars, removing the other variables, if possible? Thanks
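
    A minimal sketch of the two helpers using PHP's built-in query-string functions; the function names simply mirror the example above:

        function getvar($vars, $name) {
            parse_str($vars, $parsed);                   // query string -> associative array
            return isset($parsed[$name]) ? $parsed[$name] : null;
        }

        function cleanup($vars, $keep) {                 // $keep is e.g. "name,age"
            parse_str($vars, $parsed);
            $wanted = array_intersect_key($parsed, array_flip(explode(',', $keep)));
            return http_build_query($wanted);            // e.g. "name=david&age=26"
        }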

    Read the article

  • Using the normalize-space XPath function from an SQL XML query?

    - by Ross Watson
    Hi, is it possible to run an SQL query, with an XPath "where" clause, and to trim trailing spaces before the comparison? I have an SQL XML column, in which I have XML nodes with attributes which contain trailing spaces. I would like to find a given record, which has a specified attribute value - without the trailing spaces. When I try, I get...

        There is no function '{http://www.w3.org/2004/07/xpath-functions}:normalize-space()'

    I have tried the following (query 1 works, query 2 doesn't). This is on SQL 2005.

        declare @test table (data xml)
        insert into @test values ('<thing xmlns="http://my.org.uk/Things" x="hello " />')

        -- query 1
        ;with xmlnamespaces ('http://my.org.uk/Things' as ns0)
        select * from @test where data.exist('ns0:thing[@x="hello "]') != 0

        -- query 2
        ;with xmlnamespaces ('http://my.org.uk/Things' as ns0)
        select * from @test where data.exist('ns0:thing[normalize-space(@x)="hello "]') != 0

    Thanks for any help, Ross
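
    One workaround sketch, given that the error above says this XQuery implementation has no normalize-space(): pull the attribute out with value() and let T-SQL do the trimming (the names follow the sample table):

        ;with xmlnamespaces ('http://my.org.uk/Things' as ns0)
        select *
        from   @test
        where  rtrim(data.value('(ns0:thing/@x)[1]', 'nvarchar(100)')) = 'hello'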

    Read the article

  • iPhone – Best method to import/draw UI graphic elements? CGContextDrawPDFPage?

    - by Ross
    Hello, what is the best way to use custom UI graphics on the iPhone? I've come across CGContextDrawPDFPage and Panic's Shrinkit. Should I be storing my vector UI graphics as PDFs and loading them with CGContextDrawPDFPage to draw them? I previously asked how Apple stores its UI graphics, and the answer was crushed PNG. The options as I see them are below, but I would really like to know what technique other people use. This question is for vector graphics only. I'm looking for what is standard / most effective / most efficient.

    - PNG (bitmapped image)
    - Custom UIView drawing code (generated from Opacity)
    - PDF (I've not used this method; is it with CGContextDrawPDFPage?)

    Many thanks, Ross
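
    For the PDF option, a minimal sketch of the CGContextDrawPDFPage route; the resource name is made up, and the context is flipped because PDF and UIKit use opposite y-axes:

        // inside a UIView's drawRect:, assuming a bundled file named "icon.pdf"
        NSURL *url = [[NSBundle mainBundle] URLForResource:@"icon" withExtension:@"pdf"];
        CGPDFDocumentRef doc = CGPDFDocumentCreateWithURL((__bridge CFURLRef)url); // drop __bridge if not using ARC
        CGPDFPageRef page = CGPDFDocumentGetPage(doc, 1);        // page numbers are 1-based
        CGContextRef ctx = UIGraphicsGetCurrentContext();
        CGContextTranslateCTM(ctx, 0, self.bounds.size.height);  // flip into PDF coordinates
        CGContextScaleCTM(ctx, 1.0, -1.0);
        CGContextDrawPDFPage(ctx, page);
        CGPDFDocumentRelease(doc);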

    Read the article

  • Adding values to a C# array

    - by Ross
    Probably a really simple one, this - I'm starting out with C# and need to add values to an array, for example:

        int[] terms;
        for (int runs = 0; runs < 400; runs++)
        {
            terms[] = value;
        }

    For those who have used PHP, here's what I'm trying to do in C#:

        $arr = array();
        for ($i = 0; $i < 10; $i++) {
            $arr[] = $i;
        }

    Thanks, Ross
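
    A sketch of the usual C# counterpart of that PHP pattern: a List<int> grows as items are added, and ToArray() produces a plain int[] at the end if one is required.

        using System.Collections.Generic;

        var terms = new List<int>();
        for (int runs = 0; runs < 400; runs++)
        {
            terms.Add(runs);              // stand-in for whatever 'value' should be
        }
        int[] termsArray = terms.ToArray();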

    Read the article

  • Accessing Singleton Instance Variable in Class Methods Throws Warning?

    - by Ross
    Hello, I've been using the Objective-C singleton from here at Stack Overflow. In the class method I access the singleton's instance variable, which works, but throws a compile warning. How should I be doing this? Is there a way to do this without accessing sharedInstance: in each class method? For example, here is my class method:

        + (NSString *)myClassMethods {
            [instanceDateFormatter setFormat:@"MM"];
            return [instanceDateFormatter stringWithDate:somedate];
        }

    Line 2 will have the compile warning. Thanks, Ross
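
    A sketch of the shape this usually takes, with made-up names: a class method has no instance of its own, so it reaches the state through the shared instance (here assumed to expose an NSDateFormatter via a dateFormatter property):

        + (NSString *)myClassMethod {
            MySingleton *shared = [MySingleton sharedInstance];
            [shared.dateFormatter setDateFormat:@"MM"];
            return [shared.dateFormatter stringFromDate:[NSDate date]];
        }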

    Read the article

  • Most effective way of drawing 4 lines in an iPhone app?

    - by Ross
    A basic question, but I'm unsure. Not looking for code as an answer. I want to draw 4 short 1px lines on a view. What is the best way to approach this task? Options:

    - Load an image of the line, then create 4 UIImageViews with it.
    - Create my own subclass of UIView that draws a line in the drawRect method.
    - Draw elsewhere on another view, another UIImageView that has a UIImage inside it (is this possible?)
    - Another way?

    Thanks, Ross
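
    For the subclass option, a minimal sketch of a drawRect: that strokes a 1px horizontal line with Core Graphics; the coordinates are placeholders, and the 0.5pt offset keeps a 1-pixel stroke from straddling two pixels:

        - (void)drawRect:(CGRect)rect {
            CGContextRef ctx = UIGraphicsGetCurrentContext();
            CGContextSetLineWidth(ctx, 1.0f);
            CGContextSetStrokeColorWithColor(ctx, [UIColor blackColor].CGColor);
            CGContextMoveToPoint(ctx, 0.0f, 10.5f);
            CGContextAddLineToPoint(ctx, self.bounds.size.width, 10.5f);
            CGContextStrokePath(ctx);
        }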

    Read the article

  • Unicode data with custom font doesn't work properly on iPad

    - by David Ohanyan
    I am using a custom font for a label whose string comes from Unicode characters, and the font is not changing. Here is a snippet of my code:

        NSString *str = @"\u05D0\u05D1\u05D2";
        [mMatchingLabel setText:str];
        mMatchingLabel.font = [UIFont fontWithName:@"David New Hebrew" size:26];

    But when I write, for example:

        NSString *str = @"label";
        [mMatchingLabel setText:str];
        mMatchingLabel.font = [UIFont fontWithName:@"David New Hebrew" size:26];

    the font effect is evident. Can someone explain what's wrong here?
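
    One hedged check rather than a diagnosis: if the font isn't registered under that exact name, or lacks the Hebrew glyphs, UIKit silently falls back to a system font for those characters, which looks exactly like "the font is not changing". Listing the installed fonts shows the exact name fontWithName: expects; custom fonts also need to be declared under the UIAppFonts key in Info.plist.

        for (NSString *family in [UIFont familyNames]) {
            NSLog(@"%@ -> %@", family, [UIFont fontNamesForFamilyName:family]);
        }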

    Read the article

  • MaxStartups and MaxSessions configuration parameters for ssh connections?

    - by Webby
    I am copying files from machineB and machineC onto machineA, where I am running the shell script below. If a file is not on machineB then it should definitely be on machineC, so I first try to copy it from machineB and, if that fails, I try the same file from machineC. I am copying the files in parallel using the GNU Parallel library and it is working fine. Currently I am copying 10 files in parallel. Below is my shell script:

        #!/bin/bash
        export PRIMARY=/test01/primary
        export SECONDARY=/test02/secondary
        readonly FILERS_LOCATION=(machineB machineC)
        export FILERS_LOCATION_1=${FILERS_LOCATION[0]}
        export FILERS_LOCATION_2=${FILERS_LOCATION[1]}
        PRIMARY_PARTITION=(550 274 2 546 278)               # this will have more file numbers
        SECONDARY_PARTITION=(1643 1103 1372 1096 1369 1568) # this will have more file numbers
        export dir3=/testing/snapshot/20140103

        find "$PRIMARY" -mindepth 1 -delete
        find "$SECONDARY" -mindepth 1 -delete

        do_Copy() {
            el=$1
            PRIMSEC=$2
            scp david@$FILERS_LOCATION_1:$dir3/new_weekly_2014_"$el"_200003_5.data $PRIMSEC/. || \
                scp david@$FILERS_LOCATION_2:$dir3/new_weekly_2014_"$el"_200003_5.data $PRIMSEC/.
        }
        export -f do_Copy

        parallel --retries 10 -j 10 do_Copy {} $PRIMARY ::: "${PRIMARY_PARTITION[@]}" &
        parallel --retries 10 -j 10 do_Copy {} $SECONDARY ::: "${SECONDARY_PARTITION[@]}" &
        wait

        echo "All files copied."

    Problem statement: with the above script, at some point I get this exception:

        ssh_exchange_identification: Connection closed by remote host
        ssh_exchange_identification: Connection closed by remote host
        ssh_exchange_identification: Connection closed by remote host

    I guess the error is typically caused by too many ssh/scp sessions starting at the same time, which leads me to believe /etc/ssh/sshd_config:MaxStartups and MaxSessions are set too low. But my question is: on which server are they too low - machineB and machineC, or machineA? And on which machines do I need to increase the numbers? On machineA this is what I can find:

        root@machineA:/home/david# grep MaxStartups /etc/ssh/sshd_config
        #MaxStartups 10:30:60
        root@machineA:/home/david# grep MaxSessions /etc/ssh/sshd_config

    And on machineB and machineC this is what I can find:

        [root@machineB ~]$ grep MaxStartups /etc/ssh/sshd_config
        #MaxStartups 10
        [root@machineB ~]$ grep MaxSessions /etc/ssh/sshd_config
        #MaxSessions 10
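
    A hedged sketch of the usual adjustment: sshd_config limits apply on the machines that accept the connections, which here are machineB and machineC (machineA is only the scp client, so its sshd settings do not matter for this script). The commented-out lines shown by grep are just the compiled-in defaults, so raising the limits means uncommenting and editing them, then reloading sshd. The values below are illustrative.

        # in /etc/ssh/sshd_config on machineB and machineC
        MaxStartups 30:30:100   # up to 30 pending unauthenticated connections before throttling starts
        MaxSessions 30          # sessions multiplexed over a single TCP connection

        # then reload the daemon (RHEL/CentOS style; 'service ssh reload' on Ubuntu)
        service sshd reload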

    Read the article

  • Duplicate content issue after URL-change with 301-redirects

    - by David
    We have the following problem: we changed all URLs on our site from oldURL.html to newURL.html and set up 301 redirects (ca. 600 URLs). Google re-crawled our site and indexed all the new URLs (newURL.html), but didn't crawl the old URLs (oldURL.html) again, as there were no internal links pointing at those URLs anymore after the change. This resulted in massive ranking drops, etc., because (i) Google thought oldURL.html had exactly the same content as newURL.html, causing duplicate content issues, and (ii) Google did not transfer the juice from oldURL to newURL, because the 301 redirect was never noticed. We have now reset all internal links to the old URLs again, which then redirect to the new URLs, in the hope that Google will re-crawl the pages once there are internal links pointing at them. This is partially happening, but at a really low speed, so it would take multiple months for all redirects to be noticed. I guess this is because Google thinks: "Aah, I already know oldURL.html, so no need to re-crawl it." Possible solutions we thought of are:

    - Submitting as many of the old URLs to the index as possible via Webmaster Tools, to manually trigger a crawl. We are doing that already.
    - Submitting a sitemap with all the old URLs - but we are not sure if that is a good idea, because Google does not seem to like 301 redirects in a sitemap.

    Both solutions are not perfect - and we cannot wait for three months just to regain our old rankings. What are your ideas? Best, David

    Read the article

  • The emergence of Atlassian's Bamboo (and a free SQL Source Control license offer!)

    - by David Atkinson
    The rise in demand for database continuous integration has forced me to skill up in various new tools and technologies, particularly build servers. We have been using JetBrains' TeamCity here at Red Gate for a couple of years now, having replaced the ageing CruiseControl.NET, so it was a natural choice for us to use this for our database CI demos. Most of our early adopter customers have also transitioned away from CruiseControl, the majority to TeamCity and Microsoft's TeamBuild. However, more recently, for reasons we've yet to fully comprehend, we've observed a significant surge in the number of evaluators for Atlassian's Bamboo. I installed this a couple of weeks back to satisfy myself that it works seamlessly with Red Gate tools. As you would expect, Bamboo's UI has the same clean feel found in any Atlassian tool (we use JIRA extensively here at Red Gate). In the coming weeks I will post a short step-by-step guide to setting up SQL Server continuous integration using the Red Gate command lines. To help us further optimize the integration between these tools, I'd be very keen to hear from any Bamboo users who also use Red Gate tools and might be willing to participate in usability tests and other similar research in exchange for Amazon vouchers. If you are interested in helping out, please contact me at David dot Atkinson at red-gate.com. I recently spoke with Sarah, the product marketing manager for Bamboo, and we ended up having a detailed conversation about database CI, which has been meticulously documented in the form of a blog post on Atlassian's website: http://blogs.atlassian.com/2012/05/database-continuous-integration-redgate/ We've also managed to persuade Red Gate marketing to provide a great free-tool offer: a free SQL Source Control or SQL Connect license to Atlassian users, provided it is claimed before the end of June! Full details are at the bottom of the post. Technorati Tags: sql server

    Read the article

  • Trying to format drive fails

    - by david
    Since I will be doing an internship for which I need to use Windows software, I have decided to ruin my day trying to remove my Ubuntu 12.04, install Win XP SP3 (since the dual-boot guide from Ubuntu suggests installing Windows first and then Ubuntu, because of bootloader problems if you do it the other way around) and then reinstall Ubuntu 12.04, since I would like to keep using it as my primary operating system, using WinXP exclusively for the internship. Other than that, I would like to have a partition for data which can be used by both Ubuntu and Windows. So far, I have used the Disk Utility run from an Ubuntu live CD to format my drive with a Master Boot Record (being conscious of the fact that this way I will lose all my data, which I saved to an external drive beforehand, and that my Ubuntu won't work anymore afterwards), creating partitions for Windows (NTFS), personal data (FAT, since as far as I know both Ubuntu and Windows can deal with this), a swap partition for Linux, and one partition for Ubuntu (ext4). Trying to install Win XP from CD gives me a blue screen, which stops the setup and tells me to remove all recently installed drives and to run CHKDSK. So I thought that maybe Windows doesn't like pre-partitioned drives for its installation, and thus I need to re-format my hard drive in order to have a completely "new" drive, which I can then partition during the Windows installation to create the partitions I need. Trying to do this, though, the Disk Utility run from the live CD gives me this warning:

        Error creating partition table: helper exited with exit code 1:
        In part_create_partition_table: device_file=/dev/sda, scheme=0
        got it
        got disk
        committed to disk
        BLKRRPART ioctl failed for /dev/sda: Device or resource busy

    I do not understand why it tells me that the hard drive is busy, because, as stated above, I am doing all this from a live CD. Thus, my questions are:

    1. How can I resolve the error given by the Disk Utility?
    2. Does it make sense to use four partitions in the way mentioned above? And if not, which partitions should I create?
    3. Can I, theoretically, partition my drive from an Ubuntu live CD in order to create the partitions I want, and then install first Windows and then Ubuntu?

    Thanks for any help, David
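
    On the "Device or resource busy" error specifically, a hedged guess with the usual checks: live sessions often auto-activate a swap partition or auto-mount an existing partition on /dev/sda, and either is enough to make the kernel refuse to re-read the partition table. A sketch of the commands to free the disk before retrying, run from the live session:

        sudo swapoff -a                      # release any swap partition on the disk
        mount | grep /dev/sda                # see whether anything on sda is still mounted
        sudo umount /dev/sda1                # unmount whatever the previous line showed (sda1 is an example)
        sudo parted /dev/sda mklabel msdos   # then try writing a fresh MBR partition table again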

    Read the article

  • Windows Intune, Cloud Desktop management

    - by David Nudelman
    As a part of Microsoft's cloud computing strategy, the Windows Intune beta was released today. Here's a quick overview of what customers and IT consultants can do with the cloud service component of Windows Intune:

    - Manage PCs through a web-based console: Windows Intune provides a web-based console for IT to administer their PCs. Administrators can manage PCs from anywhere.
    - Manage updates: Administrators can centrally manage the deployment of Microsoft updates and service packs to all PCs.
    - Protection from malware: Windows Intune helps protect PCs from the latest threats with malware protection built on the Microsoft Malware Protection Engine that you can manage through the web-based console.
    - Proactively monitor PCs: Receive alerts on updates and threats so that you can proactively identify and resolve problems with your PCs before they impact end users and your business.
    - Provide remote assistance: Resolve PC issues, regardless of where you or your users are located, with remote assistance.
    - Track hardware and software inventory: Track hardware and software assets used in your business to efficiently manage your assets, licenses, and compliance.
    - Set security policies: Centrally manage update, firewall, and malware protection policies, even on remote machines outside the corporate network.

    And here is a quick video about Windows Intune. For support and questions go to: TechNet Forums for Intune. Regards, David Nudelman

    Read the article

  • DNS nameserver, A record and CNAME records

    - by David
    Hi - I am inexperienced in the configuration of DNS and have an issue with my domain hosting setup. I have two domains, 'www.mydomain1.com' and 'www.mydomain2.com', with mydomain2 pointed at the same place as mydomain1. The domains were passed to me recently by the person who previously controlled them. I have an account with Fasthosts in the UK. When I accepted the domains I could not access the DNS settings, and I enquired with Fasthosts as to why. They replied saying: 'The delegate hosting option for both domains was enabled and this is the reason why you were unable to find the option to edit the advanced DNS records. I have now disabled the delegate hosting option so you can now edit the advanced DNS records for both domains in your account.' When I log into the Fasthosts control panel now I can access the DNS controls, but both domains have no A record or CNAME record set up. I am concerned that Fasthosts have wiped the previous nameserver entries and set me up on theirs, but without adding any records. 'www.mydomain1.com' currently still works, but 'www.mydomain2.com' does not find the site anymore. I am worried I will lose mydomain1 too as the DNS changes filter through the system. My web hosting is at 'xxx.xxx.xxx.xxx/mydomain1.com/' and this is where I want both domains to point. Any advice would be much appreciated. One thing which is confusing me is that because I am on a shared server I have to put 'xxx.xxx.xxx.xxx/mydomain1.com/' to get to my site rather than just 'xxx.xxx.xxx.xxx'. The form on Fasthosts for the A record only allows an IP to be entered - does it add the mydomain1.com/ onto the end itself? Thanks for any help given - I'm quite worried about this. David
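
    For orientation only, with placeholder values rather than the real records: an A record can hold nothing but an IP address and a CNAME nothing but another hostname; the '/mydomain1.com/' part of the address is a path handled by the web server's shared-hosting configuration, not by DNS, so it never goes into a DNS record.

        ; sketch of a typical zone, placeholder IP and names
        ; (mydomain2.com would get equivalent records in its own zone)
        mydomain1.com.        IN  A      203.0.113.10
        www.mydomain1.com.    IN  CNAME  mydomain1.com.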

    Read the article

  • Discover the solution for exploring structured and unstructured data

    - by David lefranc
    Explore and discover information… We are offering a discovery workshop to let you explore any type of data with the Oracle Endeca solution.

    When: 7 December 2012, from 9:30 to 12:30
    Where: Oracle, 15 Boulevard Charles de Gaulle, 92715 Colombes
    To register: David[email protected]

    Designed for business users, this half-day workshop will let you discover Oracle Endeca Information Discovery so that you can:

    - Understand and explore any information coming from different sources (Big Data, social networks, forums, surveys, blogs...)
    - Discover how and why OEID complements classic BI solutions
    - See, through simple and fast navigation, how easy it is to find answers to unanticipated questions using OEID without prior training
    - Use search and guided navigation to see how structured and unstructured information can be quickly brought together to reveal hidden value
    - Explore all your data in any format and from any source, including social media, documents, files...
    - Discover and explore your data without a repository, so users can be autonomous and analyse their own data quickly
    - Build a strategy to increase the value of the company's data while reducing the total cost of ownership
    - Discover the incredible performance of Endeca on Oracle Exalytics, the in-memory machine

    Agenda: after an introduction to the Oracle Endeca Information Discovery solution, followed by a workshop, you will see how easy it is to:

    - Use guided navigation and the search engine to explore structured and unstructured data
    - Quickly integrate new data sources such as social media
    - Build new user interfaces while discovering information
    - Respond quickly to the changing needs of businesses and data environments

    Read the article

  • pdflatex reads .eps files saved in OS X, but not in Ubuntu

    - by David B Borenstein
    Sorry if this is a stupid question; I'm a newbie. I am preparing a manuscript in LaTeX. The journal (Physical Biology, an IOP publication) requires that figures be saved in .eps format, so I am trying to do that. However, I cannot get my LaTeX file to build when I have generated the .eps files on my Ubuntu computer. If I save the images on my Mac, the file builds just fine. So far, I have tried saving images in ImageJ, FIJI and Inkscape. The same problem occurs in all three. When using Kile, I get the following error:

        /usr/share/texmf-texlive/tex/latex/oberdiek/epstopdf-base.sty:0: Shell escape feature is not enabled.

    In TeXworks, the error is different, but still there:

        Package pdftex.def Error: File `./figures4/figure4a-eps-converted-to.pdf' not found.

    Now, if I fire up Inkscape, FIJI or ImageJ on OS X, everything works fine. The Mac also can't build with the Ubuntu-saved images. The images generated on the Ubuntu machine open fine using Document Viewer. I am building the same LaTeX file on both computers, with the exact same results. The header of my LaTeX file is:

        \documentclass[12pt]{iopart}
        \usepackage{graphicx}
        \usepackage{epstopdf}
        \usepackage{parskip}
        \usepackage{color}
        \usepackage{iopams}

    And then the code for the figure is:

        \begin{figure}
        \center{\includegraphics[width=4in]{./figures4/figure4a.eps}}
        \footnotesize{\caption{\label{fig:4a} (4a) lorem ipsum dolor sic amet.}}
        \end{figure}

    I'd be happy to send an example of both .eps files. Again, sorry if this is a dumb question. I tried everything I could think of before posting here. Thanks, David
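
    The two errors quoted above are both about the automatic EPS-to-PDF conversion (the epstopdf package) not being allowed to run, so one thing worth trying, offered as a sketch rather than a diagnosis of the Ubuntu/OS X difference, is to let pdflatex call the converter or to convert the figures once by hand; the .tex file name is a placeholder:

        # allow epstopdf-base to call the converter during the build
        pdflatex -shell-escape manuscript.tex

        # or convert up front, producing the file name the epstopdf package looks for
        epstopdf --outfile=figures4/figure4a-eps-converted-to.pdf figures4/figure4a.eps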

    Read the article
