Search Results

Search found 11734 results on 470 pages for 'refer to itself'.

Page 93/470 | < Previous Page | 89 90 91 92 93 94 95 96 97 98 99 100  | Next Page >

  • WebCenter 11.1.1.8 Certified with E-Business Suite 12.2

    - by Steven Chan (Oracle Development)
    Oracle WebCenter Suite is an integrated suite of Fusion Middleware 11gR1 tools used to create web sites and portals using service-oriented architecture (SOA). Application adapters are also available. WebCenter Portal 11.1.1.8 is now certified with Oracle E-Business Suite Release 12.2. This complements our existing certifications of WebCenter Portal 11.1.1.8 with EBS 12.0 and 12.1. WebCenter Portal 11.1.1.8 is part of Oracle Fusion Middleware 11g Release 1 Version 11.1.1.8.0, also known as FMW 11gR1 Patchset 7.

    Certified Platforms: Oracle WebCenter Portal is certified to run on any operating system for which Oracle WebLogic Server 11g is certified. For information on operating systems supported by Oracle WebLogic Server 11g and Oracle WebCenter Portal, refer to the 'Oracle Fusion Middleware on WebLogic Server - System Certification' in the Oracle Fusion Middleware 11g Release 1 (11.1.1.x) Certification Matrix. Integration with Oracle WebCenter Portal involves components spanning several different suites of Oracle products. There are no restrictions on the platform on which any particular component may be installed, so long as the platform is supported for that component.

    Migrating to Oracle WebCenter: If you're currently using Oracle Portal, you should be aware that Portal is now in maintenance mode. Updates with bug fixes will continue to be produced, but you should consider migrating to Oracle WebCenter for ongoing new features.

    References: Using WebCenter 11.1.1 with Oracle E-Business Suite Release 12.2 (Note 1332645.1); WebCenter Portal 11g Release 1 (11.1.1.8) Documentation

    Related Articles: Oracle E-Business Suite 12.2 Now Available; WebCenter Portal 11.1.1.8 Certified with E-Business Suite 12

    Read the article

  • Any empirical evidence on the efficacy of CMMI?

    - by mehaase
    I am wondering if there are any studies that examine the efficacy of software projects in CMMI-oriented organizations. For example, are CMMI organizations more likely to finish projects on time and/or on budget than non-CMMI organizations? Edit for clarification: CMMI stands for "Capability Maturity Model Integration". It's developed by the Software Engineering Institute at Carnegie-Mellon University (SEI-CMU). It's not a certification, but there are various companies that will "appraise" your organization to various levels of CMMI, such as level 2 and level 3. (I believe CMMI level 1 is an animalistic, Hobbesian free-for-all that nobody aspires to. In other words, everybody is at least CMMI level 1, even if you've never heard of CMMI before.) I'm definitely not an expert, but I believe that an organization can be appraised for CMMI levels within different scopes of work: i.e. service delivery, software development, foobaring, etc. My question is focused on the software development appraisal: is an organization that has been appraised to CMMI Level X for software projects more likely to finish a software project on time and on budget than another organization that has not been appraised to CMMI Level X? However, in the absence of hard data about software-oriented CMMI, I'd be interested in the effect that CMMI appraisals have on other activities as well. I originally asked the question because I've seen various studies conducted on software (e.g. the essays in The Mythical Man Month refer to numerous empirical studies, as does McConnell's Code Complete), so I know that there are organizations performing empirical studies of software development.

    Read the article

  • How to remove the last character from StringBuilder

    - by hmloo
    We usually use StringBuilder to append strings in loops and build a single string with each item separated by a delimiter, but you always end up with an extra delimiter at the end. This code sample shows how to remove the last delimiter from a StringBuilder.

        using System;
        using System.Collections.Generic;
        using System.Text;
        using System.Linq;

        class Program
        {
            static void Main()
            {
                var list = Enumerable.Range(0, 10).ToArray();
                StringBuilder sb = new StringBuilder();
                foreach (var item in list)
                {
                    sb.Append(item).Append(",");
                }
                sb.Length--; // Just reduce the length of the StringBuilder, it's so easy
                Console.WriteLine(sb);
            }
        }
        // Output: 0,1,2,3,4,5,6,7,8,9

    Alternatively, we can use string.Join for the same result; please refer to the code sample below.

        using System;
        using System.Collections.Generic;
        using System.Text;
        using System.Linq;

        class Program
        {
            static void Main()
            {
                var list = Enumerable.Range(0, 10).Select(n => n.ToString()).ToArray();
                string str = string.Join(",", list);
                Console.WriteLine(str);
            }
        }

    Read the article

  • How-To: Run CMSDK against a RAC cluster

    - by frank.closheim
    Using CMSDK in a production environment often requires a robust, reliable and failover-enabled repository. When using Oracle Real Application Clusters (RAC) with your CMSDK repository, you need to have a specific configuration in place to support such a setup. This post will explain the configuration steps required when running CMSDK 9.0.4.6 with Oracle WebLogic Server (WLS). In the previous CMSDK 9.0.4.2 version, a RAC-enabled connect string looked like this:

        (DESCRIPTION =
          (ADDRESS = (PROTOCOL = TCP)(HOST = rac1)(PORT = 1521))
          (ADDRESS = (PROTOCOL = TCP)(HOST = rac2)(PORT = 1521))
          (LOAD_BALANCE = NO)
          (FAILOVER = ON)
          (CONNECT_DATA =
            (SERVICE_NAME = rac)
            (failover_mode = (type=select)(method=basic))
          )
        )

    CMSDK 9.0.4.6 makes use of data sources to connect to the underlying database. These data sources are configured inside your application server, such as Oracle WebLogic Server. In Oracle WebLogic Server 10.3.4, a single data source implementation has been introduced to support a RAC cluster. It responds to Fast Application Notification (FAN) events to provide Fast Connection Failover (FCF), Runtime Connection Load-Balancing (RCLB), and RAC instance graceful shutdown. XA affinity is supported at the global transaction ID level. The new feature is called WebLogic Active GridLink for RAC, which is implemented as the GridLink data source within WebLogic Server. This GridLink data source also works with Oracle Single Client Access Name (SCAN). SCAN is a feature used in RAC environments that provides a single name for clients to access any Oracle Database running in a cluster. You can think of SCAN as a cluster alias for databases in the cluster. The benefit is that the client's connect information does not need to change if you add or remove nodes or databases in the cluster. The CMSDK 9.0.4.6 documentation describes how to create a regular JDBC data source named jdbc/OracleDS. Please refer to the following document, which describes in detail how to create a GridLink data source in WLS.
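
    Once the data source is in place, application code typically obtains connections through JNDI rather than through a raw connect string. The snippet below is a minimal, illustrative sketch only: it assumes a data source bound at the jdbc/OracleDS JNDI name mentioned above and uses only standard JNDI/JDBC APIs; it is not taken from the CMSDK documentation.

        import java.sql.Connection;
        import javax.naming.InitialContext;
        import javax.sql.DataSource;

        public class CmsdkDataSourceLookup {

            // Called from code running inside WebLogic Server, where the
            // GridLink (or generic) data source is bound under jdbc/OracleDS.
            public static Connection getRepositoryConnection() throws Exception {
                InitialContext ctx = new InitialContext();
                DataSource ds = (DataSource) ctx.lookup("jdbc/OracleDS");

                // Connections are borrowed from the WLS pool; FCF/RCLB behavior
                // comes from the GridLink configuration, not from this code.
                return ds.getConnection();
            }
        }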

    Read the article

  • OSB, Service Callouts and OQL

    - by Sabha
    Oracle Fusion Middleware customers use Oracle Service Bus (OSB) for virtualizing service endpoints and implementing stateless service orchestrations. Behind the performance and speed of OSB, there are a couple of key design implementations that can affect application performance and behavior under heavy load. One of the heavily used features in OSB is the Service Callout pipeline action, for message enrichment and for invoking multiple services as part of one single orchestration. Overuse of this feature, without understanding its internal implementation, can lead to serious problems. This series will delve into OSB internals, the problem associated with usage of Service Callout under high loads, diagnosing it via thread dump and heap dump analysis using tools like ThreadLogic and OQL (Object Query Language), and resolving it. The first section in the series will mainly cover the threading model used internally by OSB for implementing Route vs. Service Callout actions. The second section of the "OSB, Service Callouts and OQL" blog posting will delve into thread dump analysis of the OSB server and detecting threading issues relating to Service Callout, using heap dumps and OQL to identify the related proxies and business services involved. The final section of the series will focus on the corrective action to avoid Service Callout related OSB server hangs. Before we dive into the solution, we need to briefly discuss Work Managers in WLS. Please refer to the blog posting for more details.

    Read the article

  • Time Zone on WebLogic Server

    - by adejuanc
    In order to configure the time zone with WebLogic Server, use the following JVM startup argument: -Duser.timezone=<timezone>. For example, in the Java arguments in the admin console at Environments -> Servers -> Servername -> Server Start tab, configure the startup settings that Node Manager will use to start the particular server, for example: -Duser.timezone='America/Phoenix'. There are many different time zones, each with its own identifier. For a complete list please refer to: http://en.wikipedia.org/wiki/List_of_zoneinfo_time_zones

    For testing, you can run the following code on WLS within a JSP, a servlet, or by deploying the class:

        import java.util.Calendar;
        import java.util.TimeZone;

        public class TestTimeZone {

            public static void main(String[] args) {
                Calendar calendar = Calendar.getInstance();
                TimeZone timeZone = calendar.getTimeZone();
                System.out.println(" your Current TimeZone is : " + timeZone.getDisplayName());
                System.out.println(" Time Zone id : " + timeZone.getID());
            }
        }

    Read the article

  • Cannot connect to wireless network

    - by smr
    I am quite new to using Ubuntu 12.10. All I remember while installing Ubuntu 12.10 is that I didn't check any box specifically to enable wireless connectivity. I tried looking into the various questions posted here - most of them refer to certain commands (frequently with sudo) and things of that sort. I am quite new and wasn't sure I understood what those commands were trying to do. The network configuration settings show different options, like 'Wireless', 'Wired', 'IPv4', 'IPv6', etc., and I am not sure what sort of settings are to be made - for example, what sort of 'WEP' settings. Although I wasn't able to connect to the wireless network, I was able to connect to the internet once I plugged in the ethernet cable. Can someone help me with:
    1) Understanding how to see whether Ubuntu is configured to connect to a wireless network.
    2) If so, getting it configured for a wireless network (setting up a new one).
    3) Where in Ubuntu I should look to see if there are any hardware devices (such as a Broadcom wireless adapter), similar to the way we use the 'Device Manager' in Windows to look at adapter settings.
    4) Reference material to learn and understand the various Ubuntu commands (things like lshw etc.).

    Read the article

  • i18n and L10n (1)

    - by Aaron Li
    Internationalization (i18n) is a way of designing and developing a software product to function in multiple locales. This process involves identifying the locales that must be supported, designing features which support those locales, and writing code that functions equally well in any of the supported locales.

    Localization (L10n) is the process of modifying or adapting a software product to fit the requirements of a particular locale. This process includes (but may not be limited to) translating the user interface, documentation and packaging, changing dialog box geometries, customizing features (if necessary), and testing the translated product to ensure that it still works (at least as well as the original). i18n is a prerequisite for L10n.

    A resource is (1) any part of a program which can appear to the user or be changed or configured by the user, or (2) any piece of the program's data, as opposed to its code.

    The core product is the language-independent portion of a software product (as distinct from any particular localized version of that product - including the English-language version). Sometimes, however, this term is used to refer to the English product as opposed to other localizations.

    Useful links
    http://www.mozilla.org/docs/refList/i18n/
    http://www.w3.org/International/
    http://hub.opensolaris.org/bin/view/Community+Group+int_localization/
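
    As a concrete illustration of keeping resources out of code, the sketch below uses Java's standard ResourceBundle/Locale mechanism. It is a minimal, generic example and is not taken from the article above; the bundle name messages and the greeting key are assumptions for illustration, and would live in per-locale files such as messages_en.properties and messages_fr.properties.

        import java.util.Locale;
        import java.util.ResourceBundle;

        public class GreetingDemo {
            public static void main(String[] args) {
                // The same code path works for any supported locale (i18n);
                // translators only touch the per-locale properties files (L10n).
                for (Locale locale : new Locale[] { Locale.ENGLISH, Locale.FRENCH }) {
                    ResourceBundle messages = ResourceBundle.getBundle("messages", locale);
                    System.out.println(locale + ": " + messages.getString("greeting"));
                }
            }
        }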

    Read the article

  • LDoms with Solaris 11

    - by Orgad Kimchi
    Oracle VM Server for SPARC (LDoms) release 2.2 came out on May 24. You can get the software, see the release notes, reference manual, and admin guide here on the Oracle VM for SPARC page. Oracle VM Server for SPARC enables you to create multiple virtual systems on a single physical system. Each virtual system is called a logical domain and runs its own instance of Oracle Solaris 10 or Oracle Solaris 11. The version of the Oracle Solaris OS software that runs on a guest domain is independent of the Oracle Solaris OS version that runs on the primary domain. So, if you run the Oracle Solaris 10 OS in the primary domain, you can still run the Oracle Solaris 11 OS in a guest domain, and if you run the Oracle Solaris 11 OS in the primary domain, you can still run the Oracle Solaris 10 OS in a guest domain. In addition, starting with the Oracle VM Server for SPARC 2.2 release you can migrate a guest domain even if the source and target machines have different processor types. You can migrate a guest domain from a system with UltraSPARC T2+ or SPARC T3 CPUs to a system with a SPARC T4 CPU. To enable cross-CPU migration, the guest domain on the source and target systems must run Solaris 11, and you also need to change the cpu-arch property value on the source system. For more information about Oracle VM Server for SPARC (LDoms) with Solaris 11 and cross-CPU migration, refer to the following white paper.

    Read the article

  • Classes as a compilation unit

    - by Yannbane
    If "compilation unit" is unclear, please refer to this. However, what I mean by it will be clear from the context. Edit: my language allows for multiple inheritance, unlike Java. I've started designing+developing my own programming language for educational, recreational, and potentially useful purposes. At first, I've decided to base it off Java. This implied that I would have all the code be written inside classes, and that code compiles to classes, which are loaded by the VM. However, I've excluded features such as interfaces and abstract classes, because I found no need for them. They seemed to be enforcing a paradigm, and I'd like my language not to do that. I wanted to keep the classes as the compilation unit though, because it seemed convenient to implement, familiar, and I just liked the idea. Then I noticed that I'm basically left with a glorified module system, where classes could be used either as "namespaces", providing constants and functions using the static directive, or as templates for objects that need to be instantiated ("actual" purpose of classes in other languages). Now I'm left wondering: what are the benefits of having classes as compilation units? (Also, any general commentary on my design would be much appreciated.)
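
    To make the two roles concrete, here is a small, hypothetical sketch in Java (the language in question is not Java, so this is illustrative only): one class acting purely as a namespace for constants and functions via static members, and one acting as a template for instantiated objects.

        // Class used as a "namespace": never instantiated, only static members.
        final class MathUtils {
            static final double GOLDEN_RATIO = 1.6180339887;

            static double square(double x) {
                return x * x;
            }

            private MathUtils() {} // prevent instantiation
        }

        // Class used as a template for objects: instantiated, carries state.
        class Point {
            final double x;
            final double y;

            Point(double x, double y) {
                this.x = x;
                this.y = y;
            }

            double distanceToOrigin() {
                return Math.sqrt(MathUtils.square(x) + MathUtils.square(y));
            }
        }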

    Read the article

  • Trouble installing Java

    - by BRKsays
    I am running Ubuntu 12.04 LTS. I wanted to install Java, and so I downloaded the 32-bit self-extracting .bin file from http://www.java.com and tried to install it according to their instructions. First I made the file executable. Then I created /usr/java/. After that I had to run this command: ./jre-7u<version>-linux-i586.bin. But I'm stuck here. My Java version is Java 6 u32. When I enter the command it says "no such file or directory". What to do? Please help. Also, I'm trying to install 32-bit Java on my 64-bit Precise. Could that possibly be the problem? I tried to follow the second answer by Jonas Christensen. I tried to open it, but it says the file is of an unknown type. I tried the terminal command: ./jre-6u31-linux-i586.bin. But it gave this:
    Unpacking...
    Checksumming...
    Extracting...
    ./jre-6u32-linux-i586.bin: 86: ./jre-6u32-linux-i586.bin: ./install.sfx.5736: not found
    Failed to extract the files. Please refer to the Troubleshooting section of the Installation Instructions on the download page for more information.

    Read the article

  • VMware Player 5.0 or VMware Workstation 9.0 after upgrade to Ubuntu 12.10

    The upgrade process: Upgrading Ubuntu 12.04 to the latest version 12.10 - aka Quantal Quetzal - is straightforward and you only need to follow the official upgrade instructions. The short version on the console looks like this:

        sudo do-release-upgrade

    This will update the repository entries and start the upgrade process. After some minutes or hours of download and installation, you have to reboot your system once to get the new kernel loaded. At the time of writing, I'm on '3.5.0-17-generic'. And as with any modification of the kernel version, you have to compile the necessary kernel modules to get VMware Player or Workstation up and running. Usually, this happens the first time you try to start your VMware software, and that's it. Well, again, not so this time.

    Getting the kernel patch: Luckily, the community over at VMware is very active and you can get a new kernel patch in the online forums here. Get the download and put it in a folder where you have write permissions. Then you extract the archive on the console like so:

        tar -xjvf vmware9_kernel35_patch.tar.bz2

    Then you change into the newly created folder:

        cd vmware9_kernel3.5_patch/

    And you execute the available shell script as root (superuser) like so:

        sudo ./patch-modules_3.5.0.sh

    This will stop any running instances of VMware software, patch the source files, and run the compile process for your active environment. This might take some time depending on your machine, and once completed you can start VMware Player or Workstation as before. In case you apply the patch again, the script will simply quit with the following output:

        /usr/lib/vmware/modules/source/.patched found. You have already patched your sources. Exiting

    You might remove the .patched file in case you upgraded/changed your kernel and need to apply the patch again.

    Disclaimer: The patch is "as-is" and the patcher was originally created by Artem S. Tashkinov, and later modified by An_tony. Please refer to the VMware forum in case of questions or problems. There are also patches available for older versions of VMware Player or Workstation.

    Read the article

  • Internationalization messages based in views or in model entities

    - by SJuan76
    I have a small webapp in Java and I am adding internationalization support, replacing texts with labels that are defined in dictionary files. While some texts are obviously unique to each view (e.g. the HTML title), others refer to concepts from the model (e.g. a ticket, or the location or status of such a ticket). As usual, some terms will appear many times in different locations (e.g. in the edit page, in the search page and in the listings I have a "ticketLocation" label). My question is: can I organize the labels around the model concepts (so I have a ticket.location label and use it everywhere such a field is labeled), or should I make a different label for each (so form.ticketLocation, filter.ticketLocation and list.ticketLocation)? I would go for the first option; I have searched for tips and the only thing I see that could hinder me is the length of the string disrupting the design, and even for that I would prefer adding a ticket.locationShort for places where there is not much space. What is your opinion/tips/experience?
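
    A minimal sketch of the first option, with keys organized around model concepts rather than views. The bundle name messages and the key ticket.location.short are assumptions added for illustration; only ticket.location comes from the question itself.

        import java.util.Locale;
        import java.util.ResourceBundle;

        public class LabelDemo {
            public static void main(String[] args) {
                // messages_en.properties (illustrative content):
                //   ticket.location=Ticket location
                //   ticket.location.short=Location
                ResourceBundle labels = ResourceBundle.getBundle("messages", Locale.ENGLISH);

                // The same model-centric key is reused by the edit form,
                // the search filter and the listing view.
                String formLabel   = labels.getString("ticket.location");
                String filterLabel = labels.getString("ticket.location");
                // Space-constrained listings fall back to a shorter variant.
                String listLabel   = labels.getString("ticket.location.short");

                System.out.println(formLabel + " / " + filterLabel + " / " + listLabel);
            }
        }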

    Read the article

  • Why does printf report an error on all but three (ASCII-range) Unicode Codepoints, yet is fine with all others?

    - by fred.bear
    The 'printf' I refer to is the standard-issue "program" (not the built-in): /usr/bin/printf. I was testing printf out as a viable method of converting a Unicode codepoint hex-literal into its Unicode character representation. It was looking good, and seemed flawless (btw, the built-in printf can't do this at all, I think)... I then thought to test it at the lower extreme end of the code-spectrum, and it failed with an avalanche of errors, all in the ASCII range (= 7 bits). The strangest thing was that 3 values printed normally; they are:
    $ \u0024
    @ \u0040
    ` \u0060
    I'd like to know what is going on here. The ASCII character set is most definitely part of the Unicode codepoint sequence... I am puzzled, and still without a good way to bash-script this particular conversion. Suggestions are welcome. To be entertained by that same avalanche of errors, paste the following code into a terminal...

        # Here is one of the error messages:
        #   /usr/bin/printf: invalid universal character name \u0041
        # ...to see them all, run the following script
        (
          for nib1 in {0..9} {A..F}; do
            for nib0 in {0..9} {A..F}; do
              [[ $nib1 < A ]] && nl="\n" || nl=" "
              $(type -P printf) "\u00$nib1$nib0$nl"
            done
          done
          echo
        )

    Read the article

  • Using 2d collision with 3d objects

    - by Lyise
    I'm planning to write a fairly basic scrolling shoot 'em up; however, I have run into a question with regards to checking for collisions. I plan to have a fixed top-down view, where the player and enemies are all 3D objects on a fixed plane, and when the enemy or player fires at the other, their shots will also be along this fixed plane. In order to handle the collision, I have read up a bit on collision detection in 3D, as it is not something I have looked into previously, but I'm not sure what would be ideal for this situation. My options appear to be:
    - Sphere collision; however, this lacks the pixel precision I would like.
    - Detection using all vertices and planes of each object, but this seems overly convoluted for a fixed plane of play.
    - Rendering the play screen in black and white (where white is an object, black is empty space), once for enemies and once for the player, and checking for collisions that way (if a pixel is white on both, there is a collision).
    Which of these would be the best approach, or is there another option that I am missing? I have done this previously using 2D sprites, however I can't use the same thinking here as I don't have the image to refer to.
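
    For the first option, a sketch of how sphere collision reduces to a circle test when everything sits on one fixed plane. The class and field names are assumptions added for illustration, not from the question, and as noted above this trades pixel precision for simplicity.

        // Circle-vs-circle overlap test on the fixed gameplay plane.
        // Each entity is approximated by a center (x, y) and a radius.
        final class Collider {
            final float x, y, radius;

            Collider(float x, float y, float radius) {
                this.x = x;
                this.y = y;
                this.radius = radius;
            }

            boolean intersects(Collider other) {
                float dx = x - other.x;
                float dy = y - other.y;
                float r = radius + other.radius;
                // Compare squared distances to avoid a square root per test.
                return dx * dx + dy * dy <= r * r;
            }
        }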

    Read the article

  • 2012 EC Election Results

    - by heathervc
    The 2012 Fall Executive Committee Election process is now complete. The ballot closed at midnight pacific time on Monday, 29 October. Congratulations to Cinterion Wireless Modules GmbH, Credit Suisse, Fujitsu Limited, Hewlett-Packard (all four candidates were ratified), and CloudBees and London Java Community (two elected candidates) as the new and re-elected merged EC Members. For more information regarding the merged EC, please refer to JSR 355 and the JCP 2.9 Process Document.

    Ratified Seats: Cinterion Wireless Modules GmbH, Credit Suisse, Fujitsu Limited and Hewlett-Packard
    Open Election Seats: CloudBees and London Java Community
    Newly elected EC Members take their seats on Tuesday, 13 November 2012.

    Detailed Election Results:
    Number of Eligible Voters: 1131
    Percent of Eligible Members Casting Votes: 23.70%

    Ratified Seats:
    Candidate                          Yes Votes (%)   No Votes (%)   Abstentions
    Cinterion Wireless Modules GmbH    156 (76)        48 (24)        64
    Credit Suisse                      205 (85)        35 (15)        28
    Fujitsu Limited                    195 (85)        35 (15)        38
    Hewlett-Packard                    182 (81)        44 (19)        42

    Open Election Seats: The top two candidates have been elected.
    Candidate                  Votes (%)
    Cisco Systems              37 (7)
    CloudBees                  101 (20)
    Giuseppe Dell'Abate        6 (1)
    Liferay, Inc.              35 (7)
    London Java Community      164 (33)
    MoroccoJUG                 46 (9)
    North Sixty-One Ltd        26 (5)
    Software AG                19 (4)
    ZeroTurnaround             64 (13)
    None of the above          5 (1)

    For more information about the candidates who ran for the 2012 Executive Committee Election, please visit the 2012 Executive Committee Elections Nominees page.

    Read the article

  • Does (should?) changing the URI scheme name change the semantics?

    - by Doug
    If we take: http://example.com/foo is it fair to say that: ftp://example.com/foo ... points to the same resource, just using a different mechanism for resolving it (and of course possibly a different representation, but perhaps not)? This came to light in a discussion we were having surrounding some internal tooling with Git. We have to process some Git repositories, and they come to us as "git@{authority}/{path}"; however, the library we're using to interface with them doesn't support the git protocol. I suggested that we should make the service robust in that it tries to use HTTP or SSH, in essence discovering what protocols/schemes are supported for resolving the repository at {path} under each {authority}. This was met with some criticism: "We don't know if that's the same repository". My response was: "It had better be!" Looking at RFC 3986, I see this excerpt:

        URI "resolution" is the process of determining an access mechanism and the appropriate parameters necessary to dereference a URI; this resolution may require several iterations. To use that access mechanism to perform an action on the URI's resource is to "dereference" the URI.

    Which makes me think that the resolution process is permitted to try different protocols, because:

        Although many URI schemes are named after protocols, this does not imply that use of these URIs will result in access to the resource via the named protocol.

    The only concern I have, I guess, is that I only see reference to the notion of changing protocols when it comes to traversing relationships:

        it is possible for a single set of hypertext documents to be simultaneously accessible and traversable via each of the "file", "http", and "ftp" schemes if the documents refer to each other with relative references.

    I'm inclined to think I'm wrong in my initial beliefs, because the Normalization and Comparison section of said RFC doesn't mention any way of treating two URIs as equivalent if they use different schemes. It seems like schemes named/based on IP protocols ought to have this notion, at least?

    Read the article

  • Phishing attack stuck with jsp loginAction.do page? [closed]

    - by user970533
    I'm testing a phishing website on a staged replica of a JSP web application. I'm doing the usual attack, which involves changing the post and action fields of the source code to divert to my own JSP script, capture the logins, and redirect the victim to the original website. It looks easy, but trust me, it has been more than 2 weeks and I cannot write the logins to the text file. I have tested the JSP page on my local WAMP server and it works fine. On the staged site, when I click the OK button for the user/password fields, I'm taken to the loginAction.do script. I checked this using the Tamper Data add-on in Firefox. The only way I was able to make my script run was to use Burp proxy, intercept the request, and change the action parameter to refer to my uploaded script. I want to know: what does loginAction.do do? I have googled it - it's quite common to see it in JSP applications. I have checked the code; there is nothing that tells me why the page always points to the .do script instead of mine. Is there some kind of redirection in the Tomcat configuration? I'd like to know. I'm unable to exploit this attack vector, and I need the community's help.

    Read the article

  • determine an application's process name on linux (ubuntu)

    - by Jacob
    This is the situation: working on (the next version of) a Unity quicklist editor, I would like to add a reliable way of "restarting" launcher icons. To do so, I need to remove the icon (editing gsettings) and replace it in the same position. So far, no problem. However, if the application in question is running, the user will possibly lose data, as the application will quit when its icon is removed from the launcher. What I need is a reliable way to find an application's process name, to let the editor check in the list of running processes whether the application is running, and send a warning message to the user that the icon cannot be restarted while the application is running. What I did so far is make the editor look into the desktop file to read the command, also read the command stripped of its directory section, and furthermore look into possible remote scripts the desktop file command might refer to, looking for strings starting with "./". Although the method seems to work well with all applications I tested it on, I have the feeling there must be an easier way to cover the problem in an "all in one" way... Is there? Also, suggestions to catch more exceptional situations are welcome!
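
    One generic way to check whether a command is currently running on Linux is to scan /proc/*/cmdline for the command name taken from the desktop file. The sketch below is illustrative only and makes assumptions: it is written in Java even though the editor itself may use another language, and it matches on a plain substring of the command, which can produce false positives.

        import java.io.IOException;
        import java.nio.file.DirectoryStream;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;

        public class ProcessCheck {

            // Returns true if any /proc/<pid>/cmdline contains the given command name.
            static boolean isRunning(String command) throws IOException {
                try (DirectoryStream<Path> pids =
                         Files.newDirectoryStream(Paths.get("/proc"), "[0-9]*")) {
                    for (Path pid : pids) {
                        try {
                            // cmdline is NUL-separated; replace NULs before matching.
                            String cmd = new String(Files.readAllBytes(pid.resolve("cmdline")))
                                    .replace('\0', ' ');
                            if (cmd.contains(command)) {
                                return true;
                            }
                        } catch (IOException ignored) {
                            // the process may have exited between listing and reading
                        }
                    }
                }
                return false;
            }

            public static void main(String[] args) throws IOException {
                System.out.println(isRunning(args.length > 0 ? args[0] : "gedit"));
            }
        }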

    Read the article

  • how to use a batch file to delete a line of text in a bunch of text files? [on hold]

    - by wbt
    I have a bunch of .txt files in my D drive which are placed randomly in different locations. Some files also contain symbols. I want a batch file so that I can delete a specific line from all of them at the same time, without doing it one by one for each file. Please refer me to code which does not create a new text file at some other location with the changes incorporated, i.e. I do not want the input.txt and output.txt thing. I just need the original files to be replaced with the changes as soon as I run the batch file. E.g. for D:\abc\1.txt and D:\xyz\2.txt, I want both of their 3rd lines erased completely with a single click, and the new files must be saved with the same names in the same locations, i.e. the new changed text files must replace the old text files with their respective lines removed. Maybe some sort of *.txt thing, i.e. I should be able to change all the files with the .txt extension in a drive via a single batch file, perhaps placed in another drive, rather than placing my batch file into each and every folder separately and then running them. Alternatively, a VBS file is also welcome. Sorry for the long and thorough message, but I've been wandering all over the internet for the last two days just for this one batch file. All the information I get is a sort of jargon to me, as I am not a geek with scripting. Please describe the code too. Your help is much appreciated.

    Read the article

  • Fix: Windows Live Mail Error ID 0x8004108D

    - by Ken Cox [MVP]
    Over the last few days Windows Live Mail stopped fetching my Hotmail. I assumed it was a problem with Microsoft’s NNTP Web service, so I went back to using the browser to check my Hotmail. Well, it didn’t right itself, so I started probing and discovered the fix… no default connection. To repair it: From the Tools menu, click Accounts. On the Accounts screen select Hotmail and then click Properties. On the Connection tab select Local Area Network and check the Always Connect to This Account Using...(read more)

    Read the article

  • Can't install libopenssl-ruby

    - by Oetzi
    I'm trying to install Ruby on Rails and I can't seem to install libopenssl-ruby. I'm on a VPS and I've installed Jaunty, as the newer releases don't seem to work very nicely. When I do:

        apt-get install libopenssl-ruby

    I get:

        E: Package libopenssl-ruby has no installation candidate

    Originally it simply said that it couldn't find the package, but after wget'ing a deb from here: http://linuxappfinder.com/package/libopenssl-ruby and trying to install it using dpkg, I get this new error. Dpkg itself said that it couldn't install my deb as it depended on 'libopenssl-ruby'. Currently my sources.list is this:

        ###### Ubuntu Main Repos
        deb http://us.archive.ubuntu.com/ubuntu/ hardy main restricted universe multiverse
        deb-src http://us.archive.ubuntu.com/ubuntu/ hardy main restricted universe multiverse

    Does anyone know what might be wrong?

    Read the article

  • Stark Expo Needs You

    - by [email protected]
    Train to Become a Master Cloud Operative Can't wait until September to get your Oracle fix? Then come visit us at the Stark Expo now. Marvel Entertainment has turned itself into one of the hottest media companies of the digital age, and at the heart of Marvel's growth and transformation is Oracle technology. Now, this successful collaboration finds its way to the big screen, as Oracle joins forces with Marvel to launch a special showcase Website and movie trailer for the upcoming Iron Man 2. In Iron Man 2, Oracle is a proud sponsor of Stark Expo, a world-class tradeshow that depends on a cloud computing architecture to ensure that systems are free from overload. Starting today, visitors to the showcase Website are invited to become Master Cloud Operatives and keep Stark Expo up and running. Complete your training, test your troubleshooting skills in the Oracle Pavilion, and qualify to receive a free movie poster.

    Read the article

  • The Mysterious ARR Server Farm to URL Rewrite link

    Application Request Routing (ARR) is a reverse proxy plug-in for IIS7+ that does many things, including functioning as a load balancer. For this post, I'm assuming that you already have an understanding of ARR. Today I wanted to find out how the mysterious link between ARR and URL Rewrite is maintained. Let me explain: ARR is unique in that it doesn't work by itself. It sits on top of IIS7 and uses URL Rewrite. As a result, ARR depends on URL Rewrite to catch the traffic...

    Read the article

  • Status of stack based languages

    - by Andrea
    I have recently become curious about Factor, which, as far as I understand, is the most practical stack-based language around. Forth seems not to be used much these days - I think it is because it was meant to be used on its own, instead of inside an operating system, although ports of course exist. It is also pretty low-level. Joy is essentially dead, as the author stated that it does not make sense to maintain it and suggested adopting Factor instead. The fact is that Factor itself does not seem to be much developed today. The GitHub repo does not seem very active, and a lot of stuff languishes unmaintained. So, are there any other languages of this type that are more actively maintained? Are any in production use?

    Read the article
