Search Results

Search found 7776 results on 312 pages for 'configure in'.

Page 194 of 312

  • What is the best way to restrict access to adult content on Ubuntu?

    - by Stephen Myall
    I bought my kids a PC and installed 12.04 (Unity) on it. The bottom line is, I want my children to use the computer unsupervised while I have confidence they cannot access anything inappropriate. What I have looked at: I was looking at Scrubit, a tool which allows me to configure my wifi router to block content; this solution would also protect my other PC and mobile devices, but it may be overkill as I just want the solution to work on one PC. I also did some Google searches and came across the application called Nanny (it seems to look the part). My experience of OSS is that the best solutions frequently never appear first in a Google search list, and in this case I need to trust the methods, so my question is very specific. I want to leverage your knowledge and experience to understand "What is the best way to restrict adult content on 12.04 LTS", as this is important to me. It may be a combination of things, so please don't just answer "try this or that" and point me to some PPA unless you can share your experience of how good it is and, of course, any constraints. Thanks in advance.

    Read the article

  • Enabling support of EUS and Fusion Apps in OUD

    - by Sylvain Duloutre
    Since the 11gR2 release, OUD supports Enterprise User Security (EUS) for database authentication as well as Fusion Apps. I plan to blog about that soon. Meanwhile, the R2 OUD graphical setup does not let you configure both EUS and Fusion Apps support at the same time. However, it can be done manually using the dsconfig command line. The simplest way to proceed is to select EUS in the setup tool, then manually add support for Fusion Apps using the dsconfig commands below. First, create a FA workflow element with eusWfe as its next element:

        dsconfig create-workflow-element \
            --set enabled:true \
            --set next-workflow-element:Eus0 \
            --type fa \
            --element-name faWfe

    Then modify the workflow so that it starts from your FA workflow element instead of Eus:

        dsconfig set-workflow-prop \
            --workflow-name userRoot0 \
            --set workflow-element:faWfe

    Note: the configuration changes may differ slightly if multiple databases/suffixes are configured on OUD.

    Read the article

  • Why is Maven so slow compared to automake?

    - by ???'Lenik
    I have a Maven project consisting of around 100 modules. I have reasons to decompose the project into that many modules, and I don't think I should merge them just to speed up the build process. I have looked at a lot of projects by other people, e.g. the Maven project itself, Apache Archiva, and the Hudson project; they all consist of a lot of modules, nearly 100, more or less. The problem is that building them all takes so much time: 3 hours for the first build (acceptable, because there are a lot of artifacts to download) and 15 minutes for the second build (not acceptable). With automake things are similar: the first time, you need to configure the project and prepare the magical config.h file, which is far more complex than what Maven does, yet it is still fast, maybe 10 seconds on my Debian box. After that, make install takes maybe 10 minutes for the first build. However, once everything is prepared and the .o object files are generated, they don't have to be rebuilt at all for the second build. (In Maven, everything is rebuilt every time.) I really wonder how people working on Maven projects can bear such a long time for each build; I just can't sit calmly through every Maven build, it takes too long, really.

    Read the article

  • Does Ubuntu 11.10 support ns2.29?

    - by nasser
    I need your help. I'm a beginner with NS2, and I'm trying to install ns2.29 on Ubuntu 11.10 32-bit but I can't. This message appears and the installation stops:

        Build tcl8.4.11
        ============================================================
        loading cache ./config.cache
        checking whether to use symlinks for manpages... no
        checking whether to compress the manpages... no
        checking whether to add a package name suffix for the manpages... no
        checking for gcc... gcc
        checking whether the C compiler (gcc ) works... yes
        checking whether the C compiler (gcc ) is a cross-compiler... no
        checking whether we are using GNU C... yes
        checking whether gcc accepts -g... yes
        checking for building with threads... no (default)
        checking if the compiler understands -pipe... yes
        checking how to run the C preprocessor... gcc -pipe -E
        checking for sin... no
        checking for main in -lieee... yes
        checking for main in -linet... no
        checking for net/errno.h... no
        checking for connect... yes
        checking for gethostbyname... yes
        checking how to build libraries... static
        checking for ranlib... ranlib
        checking if 64bit support is requested... no
        checking if 64bit Sparc VIS support is requested... no
        checking system version (for dynamic loading)... ./configure: 1: Syntax error: Unterminated quoted string
        tcl8.3.2 configuration failed! Exiting ...
        Tcl is not part of the ns project. Please see www.Scriptics.com
        to see if they have a fix for your platform.

    Can anyone help me?

    Read the article

  • Oracle Endeca Information Discovery 3.1 is Now Available

    - by p.anda
    Oracle Endeca Information Discovery (OEID) 3.1 is a major release that incorporates significant new self-service discovery capabilities for business users. These include agile data mashup, extended support for unstructured analytics, and even tighter integration with Oracle BI. This release is available for download from the Oracle Delivery Cloud and the Oracle Technology Network. Some of the What's New highlights:

    - Self-service data mashup: enables access to a wider variety of personal and trusted enterprise data sources, and lets you blend multiple data sets in a single app.
    - Agile discovery dashboards: allow users to easily create, configure, and securely share discovery dashboards with intelligent defaults, intuitive wizards and drag-and-drop configuration.
    - Deeper unstructured analysis: enables users to enrich text using term extraction and whitelist tagging while the data is live.
    - Enhanced integration with OBI: provides easier wizards for data selection and enables the OBI Server as a self-service data source.
    - Enterprise-class data discovery: offers faster performance, a trusted data connection library, improved auditing and increased data connectivity for Hadoop, web content and Oracle Data Integrator.

    Find out more: visit the OEID Overview page to download the What's New and related Data Sheet PDF documents. Have questions or want to share details about Oracle Endeca Information Discovery? The MOS Communities are a great first stop; you can drop by the MOS OEID Community.

    Read the article

  • Syncing SharePoint 2010 with Outlook

    - by uruit
    SharePoint offers the possibility to connect to content in a Document Library directly from Outlook, edit the documents offline and then sync when the connection is restored. This is very useful if we are working at home and want to access a shared document (e.g. VPN connection settings) or continue working directly on a file. Steps to configure the connection:

    1. Browse online to the SharePoint Document Library you want to connect and click on "Connect to Outlook".
    2. Click Allow to confirm.
    3. In Outlook you will see the documents as Outlook email items with the ability to preview them. When a document is updated, Outlook will notify you that you have unread items. If you want to edit a file, the corresponding Office tool (Word, Excel, PowerPoint) will ask if you want to update the server after saving a change; it is really straightforward.
    4. Finally, I recommend adding the IP address of your SharePoint server to the secure sites list in order to prevent Outlook from asking for your Windows credentials every time you open Outlook.

    Outlook is a great tool, letting you work in a really integrated way; don't miss this amazing feature. This feature is also available in SharePoint Online :)

    Post by: Marcelo Martinez, UruIT (www.uruit.com/sharepoint_outsourcing.html), Leaders in Nearshore Outsourcing from South America

    Read the article

  • apt-get doesn't download files from an NFS location

    - by Pravesh
    I switched to Unix three months ago and am trying to understand the install process, in particular apt-get. I am able to successfully download and install packages when I configure my repository with an HTTP location in /etc/apt/sources.list, e.g.:

        deb http://web.myspqce.com/u/eng/rose/debian-mirror-squeeze-amd64/mirror/ftp.us.debian.org/debian/ squeeze main contrib non-free

    With this, apt-get install will download the package (to /var/cache/apt/archives) and install it. When I change the source location to a file location (an NFS mount point) instead of HTTP, the package gets installed but NOT downloaded to /var/cache/apt/archives:

        deb file:/deb_repository/debian-mirror-squeeze-amd64/mirror/ftp.us.debian.org/debian/ squeeze main contrib non-free

    Please let me know if there is any configuration or setting I have to make so that apt-get both downloads and installs the package when I use file:/ (NFS) instead of http:/ in sources.list. To achieve this I can use apt-get --download-only and then apt-get install, i.e. download and install in two separate calls, but I want to know why the package is not getting downloaded with apt-get install, only installed, when file:/ is used in sources.list.

    Read the article

  • Why does Eclipse keep crashing and dpkg erroring out?

    - by Sanju Sony Kurian
    I am getting the error "E: Sub-process /usr/bin/dpkg returned an error code (1)" while trying to upgrade my PC. I use Eclipse and it keeps on crashing... Please help. I attach the error:

        sanju@sanju-Dell-System-XPS-L502X:~$ upgrade
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        The following packages have been kept back:
          aisleriot gir1.2-rb-3.0 gir1.2-totem-1.0 gnome-disk-utility gnome-keyring
          libgck-1-0 libgcr-3-1 libtotem0 linux-generic linux-headers-generic
          linux-image-generic rhythmbox rhythmbox-data rhythmbox-mozilla
          rhythmbox-plugin-cdrecorder rhythmbox-plugin-magnatune
          rhythmbox-plugin-zeitgeist rhythmbox-plugins seahorse totem totem-common
          totem-mozilla totem-plugins
        0 upgraded, 0 newly installed, 0 to remove and 23 not upgraded.
        1 not fully installed or removed.
        After this operation, 0 B of additional disk space will be used.
        Do you want to continue [Y/n]? Y
        Setting up grub-pc (1.99-21ubuntu3.1) ...
        /var/lib/dpkg/info/grub-pc.config: 35: /etc/default/grub: Syntax error: EOF in backquote substitution
        dpkg: error processing grub-pc (--configure):
         subprocess installed post-installation script returned error exit status 2
        Errors were encountered while processing:
         grub-pc
        E: Sub-process /usr/bin/dpkg returned an error code (1)

    Read the article

  • Correct configuration of multiple Analytics trackers per page, spanning domains and subdomains

    - by Eliot Shepard
    My company publishes sites on a somewhat convoluted domain structure, and we're having trouble getting accurate numbers in Analytics when we have multiple trackers on the page. We publish under two brands (A, B). Each brand has a "national" site at A.com, B.com, as well as per-city "local" sites at e.g. ny.A.com, la.A.com, sf.A.com, etc. Right now we're trying to track in these dimensions:

    - Full network (A.com, ny.A.com, B.com, la.B.com, etc.)
    - All sites in a brand (A.com, ny.A.com, la.A.com, etc.)
    - Individual site (ny.A.com)

    Here are the commands we're using on an individual site:

        _gaq.push(
            ['t0._setAccount', 'UA-XXXXXX-1'], // full network
            ['t0._setDomainName', 'none'],
            ['t0._setAllowLinker', true],
            ['t0._trackPageview'],
            ['t1._trackPageLoadTime'],
            ['t1._setAccount', 'UA-XXXXXX-2'], // brand
            ['t1._setDomainName', 'none'],
            ['t1._setAllowLinker', true],
            ['t1._trackPageview'],
            ['t1._trackPageLoadTime'],
            ['t2._setAccount', 'UA-XXXXXX-3'], // individual
            ['t2._setDomainName', 'none'],
            ['t2._setAllowLinker', true],
            ['t2._trackPageview'],
            ['t2._trackPageLoadTime']
        );

    We send the same commands to each account because we've had strange results when trackers were configured differently in the past. However, right now we're seeing inflated numbers for uniques on all three trackers. What is the correct way to configure this setup? Thanks for your time.

    Read the article

  • Ubuntu 12.04 VPN default route issue when using the UI

    - by Pieter Pabst
    When setting up a VPN connection using the UI (System Settings > Network), the VPN connection is used as the default route after connecting to it. How can I avoid this without assigning manual addresses? I want the connection to use DHCP. It should keep using the eth* interfaces for my default route, as they are used before making the connection; the VPN route should only be used when connecting to addresses in the ppp0 adapter's range. If possible using the UI only; not that I'm not used to editing config files, but the point of using Ubuntu for me was the "it just works" concept (which is true 99% of the time; keep up the good work). I tried setting the "Method" combobox in the "Configure..." dialogue to "Automatic (VPN) addresses only", but that doesn't make a difference. (Another thing I noticed: the UI shows my connection as a "Wireless connection" in the list control on the left. I don't know why; it has the right icon, but that's probably something for the bug tracker.)

    Read the article

  • What is the proper way to install the 3.4 kernel?

    - by Marcelo Ruiz
    Kernel 3.2 has an annoying bug for my wireless card (rtl8192se-b) that makes the connection drop and/or prevents the card from connecting to the wireless router. Dealing with it was very frustrating until I found out the bug was fixed in 3.4. I downloaded:

        linux-headers-3.4.0-030400_3.4.0-030400.201205210521_all.deb
        linux-headers-3.4.0-030400_3.4.0-030400.201205210521_all.deb
        linux-image-3.4.0-030400-generic_3.4.0-030400.201205210521_amd64.deb

    and installed them with:

        sudo dpkg -i *

    Now the wireless works fine, but I have two problems that I cannot solve. The first one is minor: Plymouth will not start at all, though if I boot with the 3.2 kernel it works fine. The second one is serious: sometimes the computer won't shut down or reboot. The X server terminates, but the computer shows part of my GRUB background and stays there forever using 100% of the CPU. I have a Toshiba Qosmio with a Core i7 and an NVIDIA graphics card (using nvidia-current). During one shutdown, I briefly read a message saying that the VirtualBox module couldn't be unloaded from the kernel. I tried to solve this by removing and purging VirtualBox and installing it back. I don't see the message anymore, but sometimes the computer still won't shut down or reboot. Am I missing something to properly configure the new kernel? Thanks!

    Read the article

  • Design mode in Visual Studio 2010 SP1 Beta

    - by anirudha
    In MVC3 Razor there is no way to view the design in Visual Studio the way we can switch to design mode for an .aspx file. There is a little trick to work around this: if you have Expression Web or VS 2008, open the file there. In Expression Web 4 you need to register the .cshtml extension and open those files as HTML, just as you open others; in Visual Studio 2008 you need to add the .cshtml extension and set it to open as HTML. It is no big trouble if the design view does not work, but in cases where you do need it, configure it as follows. In Expression Web 4: go to Tools > Application Options > Configure Editors, click the new extension icon at the far left, enter cshtml in the window shown, and choose Expression Web (Open as HTML); after that you can see the design in Expression Web's design mode. By default Expression Web does not have cshtml as a known extension, so you need to add it; without it, Expression Web will never handle cshtml files and will hand them off to Visual Studio. After this setting, EW4 opens cshtml files as HTML and shows the design in design mode. In Visual Studio 2008 you can use the same trick: go to Options > Text Editors > File Extension, enter cshtml in the textbox, select HTML Editor from the dropdown, and click Add and then OK. This will open your cshtml files as HTML.

    Read the article

  • Dell XPS 15 L502x and Ubuntu 11.04 - HDMI output

    - by Jones
    Recently I bought my dream notebook, a Dell XPS 15, but since then this dream has become a kind of endless nightmare. I'm almost going crazy trying to make my graphics card driver work properly, but it seems to be impossible. Yes, I have a 2GB NVIDIA GeForce GT 540M (Optimus) in it! It simply doesn't work. Every time I generate the xorg.conf, Ubuntu hangs while starting up, which forces me to remove the file to be able to start the notebook with the standard graphics settings. Another problem is that the Dell XPS 15 does NOT have a VGA output, only HDMI. So, to be able to use a second monitor I have to configure it through the NVIDIA X Server Settings, which only works if the driver is properly initialized via xorg.conf. I've also tried to make it work with Bumblebee, but unfortunately it didn't help much with the HDMI output. Do you guys have any idea how to solve this deadlock? Is there any way for me to use my second monitor?

    Read the article

  • When not to use Spring to instantiate a bean?

    - by Rishabh
    I am trying to understand what the correct usage of Spring would be; not syntactically, but in terms of its purpose. If one is using Spring, should Spring code replace all bean instantiation code? When should Spring be used, and when not, to instantiate a bean? Maybe the following code sample will help you understand my dilemma:

        List<ClassA> caList = new ArrayList<ClassA>();
        for (String name : nameList) {
            ClassA ca = new ClassA();
            ca.setName(name);
            caList.add(ca);
        }

    If I configure Spring, it becomes something like:

        List<ClassA> caList = new ArrayList<ClassA>();
        for (String name : nameList) {
            ClassA ca = (ClassA) SomeContext.getBean(BeanLookupConstants.CLASS_A);
            ca.setName(name);
            caList.add(ca);
        }

    I personally think using Spring here is unnecessary overhead, because: the code is simpler to read/understand without it, and it isn't really a good place for dependency injection, as I am not expecting multiple or varied implementations of ClassA that I would want the freedom to swap out via Spring configuration at a later point. Am I thinking correctly? If not, where am I going wrong?
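
    For illustration only, here is a minimal, self-contained sketch of the two approaches the question contrasts. Everything in it beyond the question's own snippets is a hypothetical stand-in: the bean id "classA", the file beans.xml, and the ClassA value holder. It assumes Spring 3+ and a beans.xml that declares classA with scope="prototype" so each getBean call returns a fresh instance.

        // Minimal sketch; assumes Spring 3+ on the classpath and a hypothetical
        // beans.xml declaring <bean id="classA" class="ClassA" scope="prototype"/>.
        import java.util.ArrayList;
        import java.util.Arrays;
        import java.util.List;
        import org.springframework.context.ApplicationContext;
        import org.springframework.context.support.ClassPathXmlApplicationContext;

        public class BeanLookupSketch {
            public static void main(String[] args) {
                List<String> nameList = Arrays.asList("alpha", "beta");

                // Plain instantiation, as in the first snippet: no container involved.
                List<ClassA> plainList = new ArrayList<ClassA>();
                for (String name : nameList) {
                    ClassA ca = new ClassA();
                    ca.setName(name);
                    plainList.add(ca);
                }

                // Container lookup, as in the second snippet: each iteration asks the
                // Spring context for a new prototype instance instead of calling "new".
                ApplicationContext ctx = new ClassPathXmlApplicationContext("beans.xml");
                List<ClassA> springList = new ArrayList<ClassA>();
                for (String name : nameList) {
                    ClassA ca = ctx.getBean("classA", ClassA.class);
                    ca.setName(name);
                    springList.add(ca);
                }
            }
        }

        // Simple value holder used by both loops.
        class ClassA {
            private String name;
            public void setName(String name) { this.name = name; }
            public String getName() { return name; }
        }

    The lookup variant only pays off when the implementation behind "classA" needs to vary by configuration; for a plain value holder like this, direct instantiation is usually the simpler choice, which is essentially the point the question is making.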

    Read the article

  • Configuring WS-Security with PeopleSoft Web Services

    - by Dave Bain
    I was speaking with a customer a few days ago about PeopleSoft web services. The customer created a web service, but when they went to deploy it, they had so many problems configuring WS-Security that they pulled the service. They spent several days trying to get it working but never did, so they've put it on hold until they have time to work through the issues. Having gone through the process of configuring WS-Security myself, I understand the complexity. There is no magic 'easy' button to push. If you are not familiar with all the moving parts, like policies, certificates, public and private keys, credential stores, and so on, it can be a daunting task. PeopleBooks documentation is good but does not offer a step-by-step example to follow. Fear not: for those who want more help, there is a place to go. PeopleSoft released a Mobile Inventory Management application over a year ago. It is a mobile app built with the Oracle Fusion Application Development Framework (ADF) that accesses PeopleSoft content through standard web services. Part of the installation of this app is configuring WS-Security for the web services used in the application. Appendix A of the PeopleSoft FSCM91 Mobile Inventory Management Installation Guide is called Configuring WS-Security for Mobile Inventory Management. It is a step-by-step guide to configuring WS-Security between a server running Oracle Web Services Manager (OWSM) and PeopleSoft Integration Broker. Your environment might be different, but the steps will be similar, and on the PeopleSoft side, Integration Broker will remain a constant. You can find the installation guide on Oracle Support: sign in to https://support.us.oracle.com and search for document 1290972.1. Read through Appendix A for more details about how to set up WS-Security with PeopleSoft web services.

    Read the article

  • How to install rgdal on Ubuntu 12.10?

    - by radek
    I'm struggling to install the rgdal library on Ubuntu 12.10. Installation from within R results in an error:

        Error: gdal-config not found
        The gdal-config script distributed with GDAL could not be found.
        If you have not installed the GDAL libraries, you can download the source from http://www.gdal.org/
        If you have installed the GDAL libraries, then make sure that gdal-config is in your path.
        Try typing gdal-config at a shell prompt and see if it runs.
        If not, use: --configure-args='--with-gdal-config=/usr/local/bin/gdal-config' with appropriate values for your installation.
        ERROR: configuration failed for package ‘rgdal’
        * removing ‘/home/rdk/R/x86_64-pc-linux-gnu-library/2.15/rgdal’
        Warning in install.packages :
          installation of package ‘rgdal’ had non-zero exit status

    R-sig-Geo, these two SE questions, and other websites pointed me to the requirement for libgdal1-dev. But when I tried sudo apt-get install libgdal1-dev, I ended up with another error message:

        Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming.
        The following information may help to resolve the situation:
        The following packages have unmet dependencies:
         libgdal1-dev : Depends: libgdal-dev but it is not going to be installed
        E: Unable to correct problems, you have held broken packages.

    Again, when I try to install libgdal-dev, another dependency error shows up:

        The following packages have unmet dependencies:
         libgdal-dev : Depends: libgeos-dev but it is not going to be installed
                       Depends: libspatialite-dev but it is not going to be installed

    Again, trying libgeos-dev gives the message:

        Depends: libgeos-c1 (= 3.3.3-1.1) but 3.3.3-2~precise2 is to be installed
        E: Unable to correct problems, you have held broken packages.

    and libspatialite-dev:

        Depends: libspatialite3 (= 3.1.0~rc2-1ubuntu1) but 3.1.0~rc2-2~precise1 is to be installed

    Is there any way to tame those dependencies and get rgdal running on Ubuntu? My sessionInfo():

        R version 2.15.1 (2012-06-22)
        Platform: x86_64-pc-linux-gnu (64-bit)

    Read the article

  • A Generic RIDC Test Program

    - by Kevin Smith
    Many times I have found it useful to use a Java program that communicates with WebCenter Content (WCC) using RIDC for testing. I might not have access to the web GUI, or I might need to test a service running as a specific user. In the past I created a number of "one off" programs that submitted specific services, e.g. GET_SEARCH_RESULTS, DOCINFO, etc. Recently I decided to create a generic RIDC test program that can submit any service with the desired parameters based on a configuration file. The program gets the following information from the configuration file:

    - WCC connection information (host, port)
    - User to use to run the service
    - Service to run
    - Any parameters for the service

    The program makes a connection to the WCC server, sends the service request, and prints the results of the service call using the getResponseAsString() method. Here is a sample configuration file:

        ridc.host=localhost
        ridc.port=4444
        ridc.user=sysadmin
        ridc.idcservice=GET_SEARCH_RESULTS
        idcservice.QueryText=dDocType <matches> `Document`
        idcservice.SortField=dDocName
        idcservice.SortDesc=ASC

    There is a readme file included in the zip with instructions for how to configure and run the program. The program takes one command line argument, the configuration file name. The configuration file name is optional and defaults to config.properties. If you have any suggestions for improvements, let me know. Right now it only submits a single service call each time you run it. One enhancement I have already thought about would be to allow you to specify multiple services to run in the configuration file. You can do that with the current program by having multiple configuration files and running the program multiple times, each with a different configuration file. You can download the program here.
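
    As a rough idea of what such a program involves (this is not the author's downloadable code), here is a minimal sketch using the standard oracle.stellent.ridc client API. The property names match the sample configuration above; the class name GenericRidcTest and the absence of error handling and connection cleanup are my own simplifications, and the RIDC client jar is assumed to be on the classpath.

        import java.io.FileInputStream;
        import java.util.Properties;
        import oracle.stellent.ridc.IdcClient;
        import oracle.stellent.ridc.IdcClientManager;
        import oracle.stellent.ridc.IdcContext;
        import oracle.stellent.ridc.model.DataBinder;
        import oracle.stellent.ridc.protocol.ServiceResponse;

        public class GenericRidcTest {
            public static void main(String[] args) throws Exception {
                // The configuration file name is optional and defaults to config.properties.
                String configFile = (args.length > 0) ? args[0] : "config.properties";
                Properties props = new Properties();
                props.load(new FileInputStream(configFile));

                // Connect to the WCC server over the IDC protocol using ridc.host and ridc.port.
                String url = "idc://" + props.getProperty("ridc.host") + ":" + props.getProperty("ridc.port");
                IdcClientManager manager = new IdcClientManager();
                IdcClient client = manager.createClient(url);
                IdcContext userContext = new IdcContext(props.getProperty("ridc.user"));

                // The service name plus every idcservice.* parameter go into the binder.
                DataBinder binder = client.createBinder();
                binder.putLocal("IdcService", props.getProperty("ridc.idcservice"));
                for (String key : props.stringPropertyNames()) {
                    if (key.startsWith("idcservice.")) {
                        binder.putLocal(key.substring("idcservice.".length()), props.getProperty(key));
                    }
                }

                // Send the request and print the raw response, as the post describes.
                ServiceResponse response = client.sendRequest(userContext, binder);
                System.out.println(response.getResponseAsString());
            }
        }

    Treat the above as an outline of the RIDC calls involved; the real program presumably adds validation, error handling, and resource cleanup.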

    Read the article

  • Workaround for an Xcode/iOS SDK Issue...

    - by Joe Huang
    Hi, everyone: When you are doing ADF Mobile development and you need to deploy the application to an iOS device, you need to compile/deploy the app with iOS app certificates and a provisioning profile. This means you would use "Deploy to Package" or "Deploy to iTunes" during deployment and configure JDeveloper with the proper certificates/profiles. In some instances (the exact combination is still not clear), deploying and signing the application to generate the ipa file may fail with an error message similar to the following at the end of the deployment log:

        [01:04:45 PM] Deployment failed due to one or more errors returned by '/usr/bin/xcrun'. The following is a summary of the returned error(s):
        Command-line execution failed (Return code: 1)
        error: /usr/bin/codesign --force --preserve-metadata=identifier,entitlements,resource-rules --sign iPhone Distribution: Oracle Corporation --resource-rules=/var/folders/x7/21sjrpx13qj9tq20z14s3j_w0000gn/T/tkROhP11qU/Payload/HelloWorld.app/ResourceRules.plist --entitlements /var/folders/x7/21sjrpx13qj9tq20z14s3j_w0000gn/T/tkROhP11qU/entitlements_plistEINPBkIG /var/folders/x7/21sjrpx13qj9tq20z14s3j_w0000gn/T/tkROhP11qU/Payload/HelloWorld.app failed with error 1.
        Output: /var/folders/x7/21sjrpx13qj9tq20z14s3j_w0000gn/T/tkROhP11qU/Payload/HelloWorld.app: replacing existing signature
        Program /usr/bin/codesign returned 1 : [/var/folders/x7/21sjrpx13qj9tq20z14s3j_w0000gn/T/tkROhP11qU/Payload/HelloWorld.app: replacing existing signature

    This issue is a known issue and is not related to ADF Mobile. The workaround is discussed in this article from StackOverflow. The article refers to the old location of Xcode, so you would need to adjust the paths accordingly: for Xcode 4.3 and above, the path to the script file would be something like /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/PackageApplication. To modify it, you probably can't use Text Editor; I ended up opening a terminal session, changing the file permission, and using vi to update it.

    Thanks, Oracle ADF Mobile Product Management Team

    Read the article

  • ArchBeat Link-o-Rama for 10-19-2012

    - by Bob Rhubart
    One Week to Go: OTN Architect Day Los Angeles - Oct 25. Oracle Technology Network Architect Day in Los Angeles happens in one week. Register now to make sure you don't miss out on a rich schedule of expert technical sessions and peer interaction covering the use of Oracle technologies in cloud computing, SOA, and more. Even better: it's all free. Register now! When: October 25, 2012, 8:30am - 5:00pm. Where: Sofitel Los Angeles, 8555 Beverly Boulevard, Los Angeles, CA 90048.

    Moving your APEX app to the Oracle Cloud | Dimitri Gielis. Oracle ACE Director (and OSN Developer Challenge co-winner) Dimitri Gielis shares the steps in the process as he moves his "DGTournament" application, along with all of its data, onto the Oracle Cloud.

    A brief note for customers running SOA Suite on AIX platforms | A-Team - SOA. "When running Oracle SOA Suite with IBM JVMs on the AIX platform, we have seen performance slowdowns and/or memory leaks," says Christian, an architect on the Oracle Fusion Middleware A-Team. "On occasion, we have even encountered some OutOfMemoryError conditions and the concomitant Java coredump. If you are experiencing this issue, the resolution may be to configure -Dsun.reflect.inflationThreshold=0 in your JVM startup parameters."

    Introducing the New Face of Fusion Applications | Misha Vaughan. Oracle ACE Directors Debra Lilly and Floyd Teter have already blogged about the new face of Oracle Fusion Applications. Now Applications User Experience Architect Misha Vaughan shares a brief overview of how the Oracle Applications User Experience (UX) team developed the new look.

    ADF Essentials Security Implementation for Glassfish Deployment | Andrejus Baranovskis. According to Oracle ACE Director Andrejus Baranovskis, Oracle ADF Essentials includes all the key ADF technologies, save one: ADF Security. In this post he illustrates a solution for filling that gap.

    Thought for the Day: "Why are video games so much better designed than office software? Because people who design video games love to play video games. People who design office software look forward to doing something else on the weekend." — Ted Nelson (source: softwarequotes.com)

    Read the article

  • Sharing swap space between Windows and Ubuntu

    - by Leftium
    This Linux Swap Space Mini-HOWTO describes how to share swap space between Windows and Linux. Do these instructions still apply to Ubuntu in 2011? How should I modify the steps for Ubuntu? Is there a better approach to sharing swap space? Based on the HOWTO, it seems best to create a dedicated NTFS swap partition: dedicated so the swap file will be contiguous and remain unfragmented, and NTFS so both Windows and Ubuntu can read/write to it (or is FAT32 better for this purpose?). Then, configure Ubuntu to prepare the swap space for use by Linux on startup and by Windows on shutdown. I want to dual boot Ubuntu and Windows 7 on my X301 laptop. However, my laptop only has a 64 GB SSD, so I would like to conserve as much disk space as possible. Update: there is an alternate method using a special driver for Windows that lets you use a Linux swap partition for temporary storage like a RAM disk, but it doesn't seem to be as good...

    Read the article

  • How do I add unmet dependencies for unity-lens-music autogen.sh?

    - by nickform
    I would like to build unity-lens-music on my newly upgraded Ubuntu 12.10 machine. I followed these instructions from the Unity website to get the code. The README was empty, but I guessed that ./autogen.sh would be a sensible place to start. Unfortunately it exits with the following error:

        checking for LENS_DAEMON... no
        configure: error: Package requirements (glib-2.0 >= 2.27 gobject-2.0 >= 2.27 gio-2.0 >= 2.27 gio-unix-2.0 >= 2.27 dee-1.0 >= 1.0.7 sqlite3 >= 3.7.7 gee-1.0 json-glib-1.0 unity >= 6.90.0 unity-extras >= 6.90.0 tdb >= 1.2.6) were not met:
        No package 'dee-1.0' found
        No package 'sqlite3' found
        No package 'gee-1.0' found
        No package 'json-glib-1.0' found
        No package 'unity' found
        No package 'unity-extras' found
        No package 'tdb' found

    When I attempt to satisfy the dependencies that aren't found using apt-get install, I either find that there is no exact match (e.g. 'dee-1.0', which matches several packages), that I already have the latest version (e.g. sqlite3, unity), or that there is no match at all (e.g. unity-extras and tdb). There is a later suggestion to modify PKG_CONFIG_PATH if I have installed software in a non-standard location, but to my knowledge I have not. How should I proceed?

    Read the article

  • Oracle VM Server for x86 Training Schedule

    - by Antoinette O'Sullivan
    Learn about Oracle VM Server for x86 to see how you can accelerate enterprise application deployment and simplify lifecycle management with fully integrated support from physical to virtual servers, including applications. Oracle offers a 3-day course, Oracle VM Administration: Oracle VM Server for x86. This course teaches you how to:

    - Build a virtualization platform using Oracle VM Manager and Oracle VM Server for x86.
    - Deploy and manage highly configurable, inter-connected virtual machines.
    - Install and configure Oracle VM Server for x86, including details of network and storage configuration, pool and repository creation, and virtual machine management.

    You can take this course as follows:

    Live Virtual Class: take the course from your own desktop, accessing a live teach by top Oracle instructors and extensive hands-on exercises. No travel necessary. Over 300 events are currently scheduled in many timezones across the world. See the full schedule by going to the Oracle University Portal and clicking on Virtualization.

    In Classroom: scheduled events include those shown below.

        Location                   Date               Delivery Language
        Warsaw, Poland             6 August 2012      Polish
        Istanbul, Turkey           10 September 2012  Turkish
        Dusseldorf, Germany        6 August 2012      German
        Munich, Germany            10 September 2012  German
        Paris, France              17 October 2012    French
        Denver, Colorado, US       30 July 2012       English
        Roseville, Minnesota, US   23 July 2012       English
        Sydney, Australia          3 September        English

    For more information on this course and these events, or to search for additional events or register your interest in an additional event/location, please visit the Oracle University Portal and click on Virtualization.

    Read the article

  • .htaccess URL rewriting problem

    - by letsworktogether
    I'm kind of stuck at this part and was hoping I could get some assistance. I'm building a highscores page in PHP; that's going great, it works. However, I dislike the idea of "index.php?skill=name" and therefore wanted a bit of SEO in this. I have successfully replaced the URL with a more friendly version: "highscores/skill/name". And this is where the problem starts: I have added pagination to the highscores, and the page is read from the HTTP GET page variable ($_GET['page']). I dislike the idea of "highscores/skill/name&page=2" and was hoping you could help me make the URLs look like the following:

    - Page 1, accessing the file without declaring the page number: DOMAIN.TLD/highscores/skill/name
    - Page 2, where the page variable is now needed: DOMAIN.TLD/highscores/skill/name/2

    As you can tell, the "2" defines page 2 and should load the correct data for page 2. However, I'm having much trouble configuring my .htaccess file this way. This is my latest attempt to get it to work:

        RewriteRule ^highscores\/skill\/(.*?)(\/(.*?)*)$ highscores/skills.php?skill=$1&page=$2 [L] # Skills page

    Unfortunately it does not work: it makes the page look horrible (the CSS doesn't load) and it doesn't go to the page specified in the URL. I hope you understand my issue, thank you!

    Read the article

  • .NET and SMTP Configuration

    - by koevoeter
    Sometimes I feel stupid about discovering .NET features that have been there since an old release (2.0 in this case)... Apparently you can just use the configSection "mailSettings" and never have to configure your SmtpClient instance in code again (no, not hard-coded):

        <system.net>
            <mailSettings>
                <smtp deliveryMethod="Network" from="My Display Name &lt;[email protected]&gt;">
                    <network host="mail.server.com" />
                </smtp>
            </mailSettings>
        </system.net>

    Now you can go all like:

        new SmtpClient().Send(mailMessage);

    ...and everything is configured for you, even the from address (which you can obviously override).

    Read the article

  • "From Russia with Love" - My Oracle Russian Experience

    - by cwarticki
    Two weeks ago, I traveled to Moscow, Russia. I had the pleasure of meeting with many of our Oracle partners and customers in the region. I also worked with our Oracle Russia team throughout the week, building many new friendships. The showcase for the week was an Oracle Support Strategy event for our Oracle partners and customers, held at the Kateria-City Hotel, Moscow. The Oracle marketing team did an amazing job registering 100+ for the event, and nearly 100 were in attendance. During the event, I spoke about many different topics. Part was a hands-on workshop to personalize your MOS Dashboard and configure Hot Topics email alerts. Customers learned how to subscribe to newsletters and other Oracle information. It covered a multitude of Support Best Practices. Additionally, I presented Platinum Services to the audience, and my colleague Kristophe Hermans, from Oracle Belgium, spoke on Proactive Support. In addition, I had the distinct privilege of meeting one-on-one with our customers representing OJSC VimpelCom, MTC-Rus and Sberbank. Pictured with me is Valery Yourinsky, Director of Technology Consulting Dept, FORS Distribution (Oracle Platinum Partner). Finally, I spent 2.5 days with my Oracle colleagues from Oracle Russia. They are super, hard-working, dedicated, customer-service professionals. All of them! I owe them all a debt of gratitude. Next time, we meet in Florida, ok? I am very appreciative of all our Oracle partners, customers and colleagues. Thanks for hosting me and showing me a wonderful time in your country. I look forward to my return. Sincerely, Chris Warticki, Global Customer Management

    Read the article
