Search Results

Search found 20168 results on 807 pages for 'service'.

Page 236/807

  • pound: multiple domains

    - by niklassaers
    Hi guys, I've been using pound to run mydomain.dk. Now I've bought some other domains and SSL certificates that are mydomain.no, mydomain.se and mydomain.eu. My old config looked roughly like this:

        ListenHTTPS
            Address 81.19.246.120
            Port    443
            Cert    "/usr/local/etc/pound.keys/mydomain.dk.pem"
            Service
                BackEnd
                    Address 10.0.10.10
                    Port    8080
                End
            End
        End

    At places like here I've seen that I can use HeadRequire in the Service part, but I want the Host header to go together with the Cert, ideally something like:

        ListenHTTPS
            Address 81.19.246.120
            Port    443
            HostAndCert "mydomain.dk" "/usr/local/etc/pound.keys/mydomain.dk.pem"
            HostAndCert "mydomain.se" "/usr/local/etc/pound.keys/mydomain.se.pem"
            HostAndCert "mydomain.no" "/usr/local/etc/pound.keys/mydomain.no.pem"
            HostAndCert "mydomain.eu" "/usr/local/etc/pound.keys/mydomain.eu.pem"
            Service
                BackEnd
                    Address 10.0.10.10
                    Port    8080
                End
            End
        End

    Any suggestions or clues to how I can accomplish this? Cheers Nik
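    A hedged sketch of how this is commonly approached. It assumes a Pound build with SNI support (roughly 2.6 or later) and that HeadRequire is available in your version; check pound(8) for your release. Note that the HostAndCert directive above is the poster's wish, not a real keyword:

        ListenHTTPS
            Address 81.19.246.120
            Port    443
            # With SNI-capable Pound, several Cert lines may be listed and the
            # certificate is chosen from the client-supplied server name.
            Cert "/usr/local/etc/pound.keys/mydomain.dk.pem"
            Cert "/usr/local/etc/pound.keys/mydomain.se.pem"
            Cert "/usr/local/etc/pound.keys/mydomain.no.pem"
            Cert "/usr/local/etc/pound.keys/mydomain.eu.pem"

            # Per-host routing, if needed, via HeadRequire on the Host header
            Service
                HeadRequire "Host:.*mydomain\.(dk|se|no|eu).*"
                BackEnd
                    Address 10.0.10.10
                    Port    8080
                End
            End
        End

    Without SNI, the usual fallback is one ListenHTTPS block (and one IP address) per certificate, each with its own Cert line and the shared Service/BackEnd definition.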

    Read the article

  • Oracle OpenWorld 2013: First glimpses of the new SOA Suite 12c by Lucas Jellema

    - by JuergenKress
    During this week’s Oracle OpenWorld Conference, we were given some sneak peeks into the short term future of the Oracle SOA Suite. During various roadmap sessions, on the demo grounds as well as in the keynote session by Thomas Kurian (the replay of which you can see here, new features were described and demonstrated, allowing us to get a fairly good overview of what is going to come for SOA Suite - later in 2013 and sometime in 2014 (probably the first half of that year). The SOA Suite plays an important part in the three themes Thomas Kurian set down for the Fusion Middleware suite of products: support for mobility, cloud and business user empowerment. Some of the highlighted new aspects of Oracle SOA Suite are: Adapters to connect from on-premise to in-the-cloud – specifically targeting SalesForce, RightNow and also providing an SDK to create custom integrations into the cloud (the first cloud adapters will be released on 11g, before the end of the year) Mobile enablement by exposing RESTful services that communicate using JSON as well as adding the capability to call out to such services (12c functionality) Enhanced functionality on Exalogic (of course it runs faster on Exalogic, up to 20 times) Modular runtime with a lighter footprint. A brief demonstration of the Cloud Adapter was given by Demed L’Her during said keynote. The next screenshot shows the Adapter wizard for the Cloud Adapter. It allows the developer to pick a specific operation for a specific business object exposed by RightNow (or SalesForce) (the adapter knows about the APIs exposed by RightNow and SalesForce): This next screenshot shows the adapter that is used in SOA Suite 12c to expose a RESTful service on top of an SCA Composite or a Service Bus service: Read the full article here. SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member in the SOA & BPM Partner Community for registration please visit www.oracle.com/goto/emea/soa (OPN account required) If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Facebook Wiki Mix Forum Technorati Tags: Amis,Lucas Jellema,SOA Suite 12c,SOA Community,Oracle SOA,Oracle BPM,Community,OPN,Jürgen Kress

    Read the article

  • Ubuntu reboots suddenly

    - by Gladiator
    It's the second day I have this issue, and Ubuntu still reboots suddenly. Nothing significant in syslog:

        salim@SalimPC:~$ tail -f /var/log/syslog
        Nov 7 12:34:53 SalimPC dbus[873]: [system] Successfully activated service 'com.ubuntu.SystemService'
        SalimPC dbus[873]: [system] Activating service name='org.freedesktop.PackageKit' (using servicehelper)
        SalimPC AptDaemon: INFO: Initializing daemon
        SalimPC AptDaemon.PackageKit: INFO: Initializing PackageKit compat layer
        SalimPC dbus[873]: [system] Successfully activated service 'org.freedesktop.PackageKit'
        SalimPC AptDaemon.PackageKit: INFO: Initializing PackageKit transaction
        SalimPC AptDaemon.Worker: INFO: Simulating trans:/org/debian/apt/transaction/6933b4b977d944fa8714898c01bfeae4
        SalimPC AptDaemon.Worker: INFO: Processing transaction org/debian/apt/transaction/6933b4b977d944fa8714898c01bfeae4
        SalimPC AptDaemon.PackageKit: INFO: Get updates()
        Nov 7 12:34:58 SalimPC AptDaemon.Worker: INFO: Finished transaction /org/debian/apt/transaction/6933b4b977d944fa8714898c01bfeae4

    --------------------------------- Previous post ------------------

    Hi. My Ubuntu has rebooted suddenly (2 times till now in one hour). After login, a crash was indicated in /usr/sbin/ntop. Below are the syslog and a screenshot of the crash.

        salim@SalimPC:~$ tail /var/log/syslog
        Nov 6 18:25:38 SalimPC ntop[1630]: **WARNING** packet truncated (9642->8232)
        Nov 6 18:25:38 SalimPC ntop[1630]: **WARNING** packet truncated (8274->8232)
        Nov 6 18:25:38 SalimPC ntop[1630]: **WARNING** packet truncated (11010->8232)
        Nov 6 18:25:38 SalimPC ntop[1630]: **WARNING** packet truncated (17850->8232)
        Nov 6 18:25:38 SalimPC ntop[1630]: **WARNING** packet truncated (8274->8232)
        Nov 6 18:25:39 ntop[1630]: last message repeated 2 times
        Nov 6 18:25:39 SalimPC ntop[1630]: **WARNING** packet truncated (16482->8232)
        Nov 6 18:25:40 SalimPC ntop[1630]: **WARNING** packet truncated (11010->8232)
        Nov 6 18:25:43 SalimPC ntop[3075]: THREADMGMT[t3063068672]: ntop RUNSTATE: PREINIT(1)
        Nov 6 18:25:43
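    A hedged note for readers hitting the same symptom: when syslog simply stops and resumes at the next boot, the reset usually happened below the OS (power, overheating, failing hardware, or a kernel panic that never reached disk). A few standard places to look, assuming a stock Ubuntu install (lm-sensors may need to be installed first):

        # Was it a clean shutdown or a hard reset?
        last -x reboot shutdown | head

        # Any panic/oops/machine-check/thermal messages that did make it to disk?
        grep -iE 'panic|oops|mce|thermal' /var/log/kern.log*

        # Current temperatures and fan readings (package: lm-sensors)
        sensors

    If these turn up nothing, running memtest86+ from the GRUB menu and checking the power supply and temperatures are the usual next steps.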

    Read the article

  • Cannot activate Windows 7

    - by Charlie
    I can't activate Windows 7, I get an error saying DNS name does not exist. Is something configured incorrectly somewhere? PS: I need the answer within 13 days ;) UPDATE: I had upgraded from my company's Windows Vista build, and when I connect to the company VPN and activate I get a different error: The Software Licensing Service reported that the computer could not be activated. No Key Management Service (KMS) could be contacted. Please see the Application Event Log for additional information. The Application Event Log contains this (I took out the server name, it's one of the company servers): The client has sent an activation request to the key management service machine. Info: 0xC004F042, 0x00000000, xxxxxxxxxxxxxxxxxxxxxxxxx.com:1688, 7fbdc9b7-d654-49ed-80a7-81a34408f8dc, 2009/09/01 10:59, 0, 2, 17880, ae2ee509-1b34-41c0-acb7-6d4650168915, 25
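    For readers seeing the same 0xC004F042 / "DNS name does not exist" pair, a hedged sketch of the usual checks, run from an elevated command prompt while connected to the corporate network or VPN (the host name below is a placeholder for your company's KMS server):

        REM Can the KMS SRV record be resolved at all?
        REM (append your AD domain if the DNS suffix search list does not supply it)
        nslookup -type=srv _vlmcs._tcp

        REM Point the client at a known KMS host explicitly, then retry activation
        cscript %windir%\system32\slmgr.vbs /skms kms.example.com:1688
        cscript %windir%\system32\slmgr.vbs /ato

        REM Detailed licensing status, including the KMS host actually used
        cscript %windir%\system32\slmgr.vbs /dlv

    Since this machine was upgraded from a company Vista build, it most likely carries a KMS client key and can only activate against the company's KMS host; activating outside the VPN would need a MAK key instead.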

    Read the article

  • Designing application flow

    - by Umesh Awasthi
    I am creating a web application in Java where I need to model the following flow. When a user triggers a certain process (add product to cart), I need to pass through the following steps:

        1. Check the HTTP session to see if the user is logged in.
        2. Check the HTTP session to see if a shopping cart is already there.
        3. If the user exists in the HTTP session but his/her cart does not, get the user's cart from the database, add the item to it, save it to the HTTP session and update the cart in the DB.
        4. If the cart does not exist in the DB, create a new cart and save it in the HTTP session.

    Though I left out a lot of use cases here (I do not want the question length to increase a lot), most of the flow will be the same as described in the steps above. My flow will start from the Controller, go towards the Service layer and then end up in the DAO layer. Since there will be a lot of use cases where I need to check the HTTP session and, based on that, call the Service layer, I was planning to add a Facade layer responsible for doing this for me, i.e. checking the session and interacting with the Service layer. Please suggest whether this is a valid approach or whether a better approach could be used here. One more point where I am confused is how to handle the HTTP session in the facade layer: do I need to pass the HTTP session object each time I call my Facade, or can another approach be used here?
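    A minimal sketch of one common shape for such a facade, passing the HttpSession in as a parameter (one answer to the last question). Every type and attribute name here is a hypothetical placeholder, not something from the original question:

        import javax.servlet.http.HttpSession;

        // Placeholder collaborators so the sketch compiles; the real ones would
        // live in the poster's service/DAO layers.
        interface Cart { void add(long productId); }
        interface CartService {
            Cart loadOrCreateCart(long userId);
            void save(Cart cart);
        }

        public class CartFacade {
            private final CartService cartService;

            public CartFacade(CartService cartService) {
                this.cartService = cartService;
            }

            /** Adds an item, keeping the HttpSession and the database in sync. */
            public Cart addItem(HttpSession session, long productId) {
                Long userId = (Long) session.getAttribute("userId");
                if (userId == null) {
                    throw new IllegalStateException("User is not logged in");
                }
                Cart cart = (Cart) session.getAttribute("cart");
                if (cart == null) {
                    cart = cartService.loadOrCreateCart(userId); // DB copy, or a brand new cart
                }
                cart.add(productId);
                cartService.save(cart);             // update the DB
                session.setAttribute("cart", cart); // refresh the session copy
                return cart;
            }
        }

    The controller then only extracts the HttpSession from the request and delegates, which keeps session handling out of the service and DAO layers.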

    Read the article

  • links for 2011-02-28

    - by Bob Rhubart
    Apache Tuscany : SCA Java 2.x Releases (tags: ping.fm) Richard Veryard on Architecture: Modernism and Enterprise Architecture "Underlying conventional enterprise architecture theory and practice are some implicit assumptions that could be loosely characterized as modernist. Several people are offering more or less radical departures from conventional enterprise architecture..." - Richard Veryard (tags: ping.fm entarch) Java / Oracle SOA blog: Building an asynchronous web service with OSB "A few weeks ago I made a blogpost over how you can build an asynchronous web service with JAX-WS. In this blogpost I will do the same in the Oracle Service Bus." - Oracle ACE Edwin Biemond (tags: oracle otn oracleace servicebus esb osb webservices soa) Enterprise Software Development with Java: GlassFish 3.1 arrived! Yes sir, we do cluster now! "GlassFish 3.1 is finally there. As promised by Oracle back in March last year! And it is an exciting release. It brings back all the clustering and high availability support we were missing since 2.x into the Java EE 6 world." - Oracle ACE Director Markus Eisele (tags: oracle otn oracleace glassfish)

    Read the article

  • Can a Shadow Copy of SQL 2000 databases files be used as a restore?

    - by Keith Sirmons
    Howdy, I have a SQL 2000 instance (version 8.00.760) that is on a drive that gets regular shadow copies. Can a shadow copy be used to restore the database? It seems possible to stop the SQL service, restore the Data folder from the shadow copy (it includes msdb, master, model, temp, and the user databases), then restart the service. Would the files be in a crash-consistent state in the worst case? If so, when restarting the service wouldn't it recover as if the power were pulled from the server? Thank you, Keith
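    The procedure the poster describes, spelled out as a rough sketch (a default instance named MSSQLSERVER and default SQL 2000 paths are assumed; the shadow-copy source is a placeholder exposed via the Previous Versions tab or a VSS tool):

        REM Stop SQL Server so the data files are closed
        net stop MSSQLSERVER

        REM Copy the .mdf/.ldf files back from the shadow copy (placeholder path)
        xcopy "<shadow-copy path>\Data\*.*" "C:\Program Files\Microsoft SQL Server\MSSQL\Data\" /Y

        REM Restart; SQL Server then runs crash recovery on each database
        net start MSSQLSERVER

    Whether the copied files are usable depends on how the shadow copy was taken: if SQL Server had the files open and no VSS writer quiesced it, the copy is at best crash-consistent, so startup recovery behaves much like after a power pull, which is exactly the risk the poster is weighing.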

    Read the article

  • Supporting HR Transformation with HelpDesk for Human Resources

    - by Robert Story
    Upcoming Webcast

        Title: Supporting HR Transformation with HelpDesk for Human Resources
        Date: May 13, 2010
        Time: 9:00 am PDT, 10 am MDT, 17:00 GMT
        Product Family: PeopleSoft HCM & EBS HRMS

    Summary: HR transformation is a strategic initiative at many companies where world-class employee HR service delivery and a reduction of HR operating costs are top priorities. Having a centralized service delivery model and providing employees with tools to better help themselves can be very key to this initiative. This session shares how Oracle's PeopleSoft HelpDesk for Human Resources provides the technology foundation and best practices for this transformation. HelpDesk for Human Resources now integrates with both PeopleSoft HCM and E-Business Suite HRMS. This one-hour session is recommended for technical and functional users who want to understand what is new in PeopleSoft Help Desk for Human Resources 9.1 and how it benefits both PeopleSoft HCM and E-Business Suite HRMS customers. Topics will include:

        - Understand the latest features and functionality
        - Gain insight into future product direction
        - Plan for implementation or upgrade of this module in your current system

    A short, live demonstration (only if applicable) and question and answer period will be included. Click here to register for this session.

    The above webcast is a service of the E-Business Suite Communities in My Oracle Support. For more information on other webcasts, please reference the Oracle Advisor Webcast Schedule. Click here to visit the E-Business Communities in My Oracle Support. Note that all links require access to My Oracle Support.

    Read the article

  • Connecting the Dots (.NET Business Connector)

    - by ssmantha
    Recently, one of my colleagues was experimenting with Reporting Server on DAX 2009. Whenever he viewed a report in SQL Server Reporting Manager he was welcomed with an error: “Error during processing Ax_CompanyName report parameter. (rsReportParameterProcessingError)” The Event Log had the following entry:

        Dynamics Adapter LogonAs failed.
        Microsoft.Dynamics.Framework.BusinessConnector.Session.Exceptions.FatalSessionException
        at Microsoft.Dynamics.Framework.BusinessConnector.Session.DynamicsSession.HandleException(String message, Exception exception, HandleExceptionCallback callback)

    We later found out that this was due to an incorrect Business Connector account; from past experience I know this is a very common mistake people make during EP and Reporting installations. Remember that the reports need to connect to the Dynamics AX server to run the AxQueries, which needs to pass through the .NET Business Connector. To ensure everything works fine, please check the following settings:

        1) Your Report Server service account should be the same as the .NET Business Connector proxy account.
        2) Ensure that on the server which has Reporting Services installed, the client configuration utility for Business Connector points to the correct proxy account.
        3) And finally, the AX instance you are connecting to has a service account specified for the .NET Business Connector (administration –> Service accounts –> .NET Business Connector).

    These simple checkpoints can help with most Business Connector related errors, which I believe are mostly due to incorrect configuration settings. Happy DAXing!!

    Read the article

  • Update KB951847 (3.5 SP1 and Family Update) fails on Windows Server 2003 repeatedly

    - by Ducain
    We have a Win2K3 server that is perpetually stuck on the following update: Microsoft .NET Framework 3.5 Service Pack 1 and .NET Framework 3.5 Family Update for .NET versions 2.0 through 3.5 (KB951847) x86. It fails every time, and I'm at a loss on what to do about it. WHAT I'VE TRIED: Saw an article that said to turn off the update service, rename C:\Windows\SoftwareDistribution\ and then restart the update service. Did this, ran the update, and it failed on this same one. I tried to run the update manually from C:\Windows\SoftwareDistribution\Downloads\Install\ and again it fails. It says, "None of the products that are addressed by this software update are installed on this computer. Click Cancel to exit setup." Because of this issue, none of the previous updates will work either. I will give one large pink-glazed donut with sprinkles to anyone who can help me slay this beast.
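    The reset the poster describes, spelled out as a small sketch (an elevated prompt is assumed; the backup folder name is arbitrary):

        net stop wuauserv
        ren C:\Windows\SoftwareDistribution SoftwareDistribution.old
        net start wuauserv

    As a hedged aside, the "None of the products ... are installed" message when the package is run by hand usually means the installer and the detected .NET product state disagree, so checking the .NET 2.0/3.0/3.5 entries in Add or Remove Programs (and the setup logs that .NET installers drop in %TEMP%) is a reasonable next step; that suggestion is not from the original post.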

    Read the article

  • Plugin 'InnoDB' registration as a STORAGE ENGINE failed on Windows 7

    - by NimChimpsky
    I have had to reinstall MySQL; however, the service is failing to start with the above cause listed in Event Viewer. One solution is apparently to delete a couple of files prefixed with 'ib_logfile' which represent any old databases. However I do not have these files, and my service is still failing to start... When I say I don't have these files: I did a search using Windows search with zero results, and they are definitely not present in my MySQL install directory. And I don't have the "Documents and Settings/Application Data" folder referenced in the link. In fact I have only one MySQL install directory, and I know where that is - what do I need to delete/change? The instance is configured OK, I ran that as administrator and it is listed in Services, but the service itself fails to start. Any tips, other than going over to PostgreSQL?
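    A hedged pointer for the search: the ib_logfile and ibdata files normally live in the data directory, not the install directory, and on Windows 7 the default data directory is a hidden folder under C:\ProgramData rather than under Documents and Settings. The version number in the paths below is a placeholder; adjust it to your install:

        REM my.ini names the data directory; it usually sits in the install directory
        findstr /i "datadir" "C:\Program Files\MySQL\MySQL Server 5.5\my.ini"

        REM Typical default data directory on Windows 7 (hidden folder)
        dir "C:\ProgramData\MySQL\MySQL Server 5.5\data\ib_logfile*"
        dir "C:\ProgramData\MySQL\MySQL Server 5.5\data\ibdata1"

    The .err file in that same data folder is the MySQL error log and usually states the exact reason InnoDB refuses to register, which is worth reading before deleting anything.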

    Read the article

  • links for 2010-06-02

    - by Bob Rhubart
    @eelzinga: Oracle Service Bus 11g communication with Oracle SOA Suite 11g, DirectBindings, part1 - Oracle ACE Eric Elzinga launches a series of posts in which he will describe how to develop various Oracle Service Bus 11g to Oracle SOA Suite process flows. (tags: oracle otn oracleace soa servicebus)
    @Atul_Kumar: Integrate UCM (ECM/Content Server) with Microsoft Active Directory as LDAP Provider - Atul Kumar's step-by-step instructions. (tags: oracle otn enterprise2.0 ucm ecm ldap)
    Stefan Hinker: Is my application a good fit for CMT? - "The first and most important criterion for suitability is always the service time of your application," says Stefan Hinker. "If this is sufficient, then the application is OK on CMT. If it is not, and the reason is actually the CPU and not some other high-latency component (like a remote database), you will need to test on other CPU architectures." (tags: oracle sun cpu cmt sparc solaris)
    @deltalounge: Definitions of Services and Processes - Peter Paul shares a collection of useful definitions gathered from the works of many of the big thinkers in the SOA space. (tags: oracle otn soa businessprocess)
    OTN TechCast: Oracle Solaris Virtualization - Oracle Solaris Video - Joost Pronk, CTO for Oracle Solaris Product Management, provides an overview of the robust virtualization functionality built into the Oracle Solaris OS. (tags: oracle otn solaris virtualization)

    Read the article

  • Print spooler consumes over 1GB of memory

    - by Stephen Jennings
    Suddenly, on a Windows Vista Business workstation I manage, the Windows print spooler service is consuming over 1GB of memory. I got the call this morning that the user could not print. I discovered all printers were missing from the Printers applet in Control Panel. I rebooted the machine, and at first the printers were still missing, but after a few minutes (and much banging my head against the wall) they suddenly appeared. I stopped worrying about it until later today it happened again to the same workstation. To my knowledge, nothing has changed on the computer. No new printers have been added, no new print drivers would have been installed, and no new software is being used. I tried clearing out the spooler folder (C:\Windows\System32\spooler\printers) which did have four print jobs from this morning, but the problem persists after restarting the spooler service. When starting the service, it starts out using 824 KB of memory, then after about 20 seconds it starts creeping up about 10MB each second until it stabilizes around 1.8GB.
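    The clean-out the poster describes, as a short sketch from an elevated prompt (note the stock spool folder is normally %SystemRoot%\System32\spool\PRINTERS):

        net stop spooler
        del /q "%SystemRoot%\System32\spool\PRINTERS\*.*"
        net start spooler

    As a hedged aside, a spooler that balloons to gigabytes with no jobs queued is very often a faulty third-party printer driver or port monitor rather than stuck jobs, so reviewing the installed drivers in the Print Management console is a common next step.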

    Read the article

  • Nagios - NagWin - Send notification with gmail

    - by Attila Bujáki
    I would like to send Nagios notifications using my Gmail account. I have already set up the hosts and services I want to monitor. What is the simplest way to accomplish this using NagWin on a Windows Server 2012 installation? As far as I know I must change some of these configuration settings:

        # 'notify-host-by-email' command definition
        define command{
            command_name    notify-host-by-email
            command_line    /usr/bin/printf "%b" "***** Nagios *****\n\nNotification Type: $NOTIFICATIONTYPE$\nHost: $HOSTNAME$\nState: $HOSTSTATE$\nAddress: $HOSTADDRESS$\nInfo: $HOSTOUTPUT$\n\nDate/Time: $LONGDATETIME$\n" | /bin/blat - -to $CONTACTEMAIL$ -f nagios@localhost -subject "** $NOTIFICATIONTYPE$ Host Alert: $HOSTNAME$ is $HOSTSTATE$ **" -server ???
        }

        # 'notify-service-by-email' command definition
        define command{
            command_name    notify-service-by-email
            command_line    /usr/bin/printf "%b" "***** Nagios *****\n\nNotification Type: $NOTIFICATIONTYPE$\n\nService: $SERVICEDESC$\nHost: $HOSTALIAS$\nAddress: $HOSTADDRESS$\nState: $SERVICESTATE$\n\nDate/Time: $LONGDATETIME$\n\nAdditional Info:\n\n$SERVICEOUTPUT$\n" | /bin/blat - -to $CONTACTEMAIL$ -f nagios@localhost -subject "** $NOTIFICATIONTYPE$ Service Alert: $HOSTALIAS$/$SERVICEDESC$ is $SERVICESTATE$ **" -server ???
        }

    What should I use for the SMTP server? Is it possible to send my notifications directly to the Gmail server?
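    A hedged note for readers with the same question: Gmail only accepts mail on authenticated, encrypted connections (smtp.gmail.com, ports 465/587), which plain blat cannot speak by itself, so a common pattern is to run a local TLS wrapper such as stunnel and point blat at it. A minimal stunnel.conf section might look like this (the account, password and addresses involved are placeholders, and the exact blat authentication switches vary by build, so check blat -help):

        [gmail-smtp]
        client  = yes
        accept  = 127.0.0.1:25
        connect = smtp.gmail.com:465

    With that running, -server 127.0.0.1 can replace the ??? in both command definitions, and blat authenticates with the Gmail address and an application-specific password. Be aware that Gmail rewrites the From address to the authenticated account and applies its own daily sending limits, so a dedicated notification mailbox is usually used.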

    Read the article

  • Solaris: What comes next?

    - by alanc
    As you probably know by now, a few months ago, we released Solaris 11 after years of development. That of course means we now need to figure out what comes next - if Solaris 11 is “The First Cloud OS”, then what do we need to make future releases of Solaris be, to be modern and competitive when they're released? So we've been having planning and brainstorming meetings, and I've captured some notes here from just one of those we held a couple weeks ago with a number of the Silicon Valley based engineers. Now before someone sees an idea here and calls their product rep wanting to know what's up, please be warned what follows are rough ideas, and as I'll discuss later, none of them have any committment, schedule, working code, or even plan for integration in any possible future product at this time. (Please don't make me force you to read the full Oracle future product disclaimer here, you should know it by heart already from the front of every Oracle product slide deck.) To start with, we did some background research, looking at ideas from other Oracle groups, and competitive OS'es. We examined what was hot in the technology arena and where the interesting startups were heading. We then looked at Solaris to see where we could apply those ideas. Making Network Admins into Socially Networking Admins We all know an admin who has grumbled about being the only one stuck late at work to fix a problem on the server, or having to work the weekend alone to do scheduled maintenance. But admins are humans (at least most are), and crave companionship and community with their fellow humans. And even when they're alone in the server room, they're never far from a network connection, allowing access to the wide world of wonders on the Internet. Our solution here is not building a new social network - there's enough of those already, and Oracle even has its own Oracle Mix social network already. What we proposed is integrating Solaris features to help engage our system admins with these social networks, building community and bringing them recognition in the workplace, using achievement recognition systems as found in many popular gaming platforms. For instance, if you had a Facebook account, and a group of admin friends there, you could register it with our Social Network Utility For Facebook, and then your friends might see: Alan earned the achievement Critically Patched (April 2012) for patching all his servers. Matt is only at 50% - encourage him to complete this achievement today! To avoid any undue risk of advertising who has unpatched servers that are easier targets for hackers to break into, this information would be tightly protected via Facebook's world-renowned privacy settings to avoid it falling into the wrong hands. A related form of gamification we considered was replacing simple certfications with role-playing-game-style Experience Levels. Instead of just knowing an admin passed a test establishing a given level of competency, these would provide recruiters with a more detailed level of how much real-world experience an admin has. Achievements such as the one above would feed into it, but larger numbers of experience points would be gained by tougher or more critical tasks - such as recovering a down system, or migrating a service to a new platform. (As long as it was an Oracle platform of course - migrating to an HP or IBM platform would cause the admin to lose points with us.) Unfortunately, we couldn't figure out a good way to prevent (if you will) “gaming” the system. 
For instance, a disgruntled admin might decide to start ignoring warnings from FMA that a part is beginning to fail or skip preventative maintenance, in the hopes that they'd cause a catastrophic failure to earn more points for bolstering their resume as they look for a job elsewhere, and not worrying about the effect on your business of a mission critical server going down. More Z's for ZFS Our suggested new feature for ZFS was inspired by the worlds most successful Z-startup of all time: Zynga. Using the Social Network Utility For Facebook described above, we'd tie it in with ZFS monitoring to help you out when you find yourself in a jam needing more disk space than you have, and can't wait a month to get a purchase order through channels to buy more. Instead with the click of a button you could post to your group: Alan can't find any space in his server farm! Can you help? Friends could loan you some space on their connected servers for a few weeks, knowing that you'd return the favor when needed. ZFS would create a new filesystem for your use on their system, and securely share it with your system using Kerberized NFS. If none of your friends have space, then you could buy temporary use space in small increments at affordable rates right there in Facebook, using your Facebook credits, and then file an expense report later, after the urgent need has passed. Universal Single Sign On One thing all the engineers agreed on was that we still had far too many "Single" sign ons to deal with in our daily work. On the web, every web site used to have its own password database, forcing us to hope we could remember what login name was still available on each site when we signed up, and which unique password we came up with to avoid having to disclose our other passwords to a new site. In recent years, the web services world has finally been reducing the number of logins we have to manage, with many services allowing you to login using your identity from Google, Twitter or Facebook. So we proposed following their lead, introducing PAM modules for web services - no more would you have to type in whatever login name IT assigned and try to remember the password you chose the last time password aging forced you to change it - you'd simply choose which web service you wanted to authenticate against, and would login to your Solaris account upon reciept of a cookie from their identity service. Pinning notes to the cloud We also all noted that we all have our own pile of notes we keep in our daily work - in text files in our home directory, in notebooks we carry around, on white boards in offices and common areas, on sticky notes on our monitors, or on scraps of paper pinned to our bulletin boards. The contents of the notes vary, some are things just for us, some are useful for our groups, some we would share with the world. For instance, when our group moved to a new building a couple years ago, we had a white board in the hallway listing all the NIS & DNS servers, subnets, and other network configuration information we needed to set up our Solaris machines after the move. Similarly, as Solaris 11 was finishing and we were all learning the new network configuration commands, we shared notes in wikis and e-mails with our fellow engineers. Users may also remember one of the popular features of Sun's old BigAdmin site was a section for sharing scripts and tips such as these. Meanwhile, the online "pin board" at Pinterest is taking the web by storm. So we thought, why not mash those up to solve this problem? 
We proposed a new BigAddPin site where users could “pin” notes, command snippets, configuration information, and so on. For instance, once they had worked out the ideal Automated Installation manifest for their app server, they could pin it up to share with the rest of their group, or choose to make it public as an example for the world. Localized data, such as our group's notes on the servers for our subnet, could be shared only to users connecting from that subnet. And notes that they didn't want others to see at all could be marked private, such as the list of phone numbers to call for late night pizza delivery to the machine room, the birthdays and anniversaries they can never remember but would be sleeping on the couch if they forgot, or the list of automatically generated completely random, impossible to remember root passwords to all their servers. For greater integration with Solaris, we'd put support right into the command shells — redirect output to a pinned note, set your path to include pinned notes as scripts you can run, or bring up your recent shell history and pin a set of commands to save for the next time you need to remember how to do that operation. Location service for Solaris servers A longer term plan would involve convincing the hardware design groups to put GPS locators with wireless transmitters in future server designs. This would help both admins and service personnel trying to find servers in todays massive data centers, and could feed into location presence apps to help show potential customers that while they may not see many Solaris machines on the desktop any more, they are all around. For instance, while walking down Wall Street it might show “There are over 2000 Solaris computers in this block.” [Note: this proposal was made before the recent media coverage of a location service aggregrator app with less noble intentions, and in hindsight, we failed to consider what happens when such data similarly falls into the wrong hands. We certainly wouldn't want our app to be misinterpreted as “There are over $20 million dollars of SPARC servers in this building, waiting for you to steal them.” so it's probably best it was rejected.] Harnessing the power of the GPU for Security Most modern OS'es make use of the widespread availability of high powered GPU hardware in today's computers, with desktop environments requiring 3-D graphics acceleration, whether in Ubuntu Unity, GNOME Shell on Fedora, or Aero Glass on Windows, but we haven't yet made Solaris fully take advantage of this, beyond our basic offering of Compiz on the desktop. Meanwhile, more businesses are interested in increasing security by using biometric authentication, but must also comply with laws in many countries preventing discrimination against employees with physical limations such as missing eyes or fingers, not to mention the lost productivity when employees can't login due to tinted contacts throwing off a retina scan or a paper cut changing their fingerprint appearance until it heals. Fortunately, the two groups considering these problems put their heads together and found a common solution, using 3D technology to enable authentication using the one body part all users are guaranteed to have - pam_phrenology.so, a new PAM module that uses an array USB attached web cams (or just one if the user is willing to spin their chair during login) to take pictures of the users head from all angles, create a 3D model and compare it to the one in the authentication database. 
While Mythbusters has shown how easy it can be to fool common fingerprint scanners, we have not yet seen any evidence that people can impersonate the shape of another user's cranium, no matter how long they spend beating their head against the wall to reshape it. This could possibly be extended to group users, using modern versions of some of the older phrenological studies, such as giving all users with long grey beards access to the System Architect role, or automatically placing users with pointy spikes in their hair into an easy use mode. Unfortunately, there are still some unsolved technical challenges we haven't figured out how to overcome. Currently, a visit to the hair salon causes your existing authentication to expire, and some users have found that shaving their heads is the only way to avoid bad hair days becoming bad login days. Reaction to these ideas After gathering all our notes on these ideas from the engineering brainstorming meeting, we took them in to present to our management. Unfortunately, most of their reaction cannot be printed here, and they chose not to accept any of these ideas as they were, but they did have some feedback for us to consider as they sent us back to the drawing board. They strongly suggested our ideas would be better presented if we weren't trying to decipher ink blotches that had been smeared by the condensation when we put our pint glasses on the napkins we were taking notes on, and to that end let us know they would not be approving any more engineering offsites in Irish themed pubs on the Friday of a Saint Patrick's Day weekend. (Hopefully they mean that situation specifically and aren't going to deny the funding for travel to this year's X.Org Developer's Conference just because it happens to be in Bavaria and ending on the Friday of the weekend Oktoberfest starts.) They recommended our research techniques could be improved over just sitting around reading blogs and checking our Facebook, Twitter, and Pinterest accounts, such as considering input from alternate viewpoints on topics such as gamification. They also mentioned that Oracle hadn't fully adopted some of Sun's common practices and we might have to try harder to get those to be accepted now that we are one unified company. So as I said at the beginning, don't pester your sales rep just yet for any of these, since they didn't get approved, but if you have better ideas, pass them on and maybe they'll get into our next batch of planning.

    Read the article

  • Partner Webcast – Implementing Web Services & SOA Security with Oracle Fusion Middleware - 20 September 2012

    - by Thanos
    Security was always one of the main pain points for the IT industry, and new security challenges have been introduced with the proliferation of the service-oriented approach to building modern software. Oracle Fusion Middleware provides a wide variety of features that ease the building of service-oriented solutions, but how can these services be secured? Should we implement the security features in each and every service, or is there a better way? During the webinar we are going to show how to implement non-intrusive declarative security for your SOA components by introducing the Oracle product portfolio in this area, such as Oracle Web Services Manager and Oracle IDM. Agenda:

        - SOA & Web Services basics: quick refresher
        - Building your SOA with Oracle Fusion Middleware: product review
        - Common security risks in the Web Services world
        - SOA & Web Services security standards
        - Implementing Web Services Security with the Oracle products
        - Web Services Security with Oracle – the big picture
        - Declarative end point security with Oracle Web Services Manager
        - Perimeter Security with Oracle Enterprise Gateway
        - Utilizing the other Oracle IDM products for the advanced scenarios
        - Q&A session

    Delivery Format: This FREE online LIVE eSeminar will be delivered over the Web. Registrations received less than 24 hours prior to start time may not receive confirmation to attend. Thursday, September 20, 2012 - 10:00 AM to 11:00 AM CET (GMT/UTC+1). Duration: 1 hour. Register Now. Send your questions and migration/upgrade requests to [email protected]. Visit our ISV Migration Center blog regularly or follow us @oracleimc to learn more on Oracle Technologies, upcoming partner webcasts and events. All content is made available through our YouTube - SlideShare - Oracle Mix.

    Read the article

  • Implementing SOA & Security with Oracle Fusion Middleware in your solution – partner webcast September 20th 2012

    - by JuergenKress
    Security was always one of the main pain points for the IT industry, and new security challenges have been introduced with the proliferation of the service-oriented approach to building modern software. Oracle Fusion Middleware provides a wide variety of features that ease the building of service-oriented solutions, but how can these services be secured? Should we implement the security features in each and every service, or is there a better way? During the webinar we are going to show how to implement non-intrusive declarative security for your SOA components by introducing the Oracle product portfolio in this area, such as Oracle Web Services Manager and Oracle Enterprise Gateway. Agenda:

        - SOA & Web Services basics: quick refresher
        - Building your SOA with Oracle Fusion Middleware: product review
        - Common security risks in the Web Services world
        - SOA & Web Services security standards
        - Implementing Web Services Security with the Oracle products
        - Web Services Security with Oracle – the big picture
        - Declarative end point security with Oracle Web Services Manager
        - Perimeter Security with Oracle Enterprise Gateway
        - Utilizing the other Oracle IDM products for the advanced scenarios
        - Q&A session

    Delivery Format: This FREE online LIVE eSeminar will be delivered over the Web. Registrations received less than 24 hours prior to start time may not receive confirmation to attend. Duration: 1 hour. Register Now. Send your questions and migration/upgrade requests to [email protected]. Visit our ISV Migration Center blog regularly or follow us @oracleimc to learn more on Oracle Technologies, upcoming partner webcasts and events. All content is made available through our YouTube - SlideShare - Oracle Mix. SOA & BPM Partner Community: For regular information on Oracle SOA Suite become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Technorati Tags: ISV migration center,SOA,IDM,SOA Community,Oracle SOA,Oracle BPM,BPM,Community,OPN,Jürgen Kress

    Read the article

  • How to Search for (and Find) Solaris Docs

    - by rickramsey
    Just the other day, I went to the recently-released Oracle Solaris 11 library to search for information about the print service changes. I knew there had been changes in Oracle Solaris 11, but could not remember the new approach to printing. So, being the optimist that I (never) am, I went to the Oracle Solaris 11 Information Library on docs.oracle.com and typed "print service" into the search box. Imagine my surprise when the response back was: We did not find any search results for: print service site:download.oracle.com url:/docs/cd/E23824_01. OMG! WTF? Are you kidding me? After throwing a few stuffed animals at my computer screen, I tried again. Is search broken? Well, sort of (and I'm trying to get it fixed). In the meantime, however, there is a reasonably simple user workaround. Possibly unnoticed by most people, there is a Within drop-down menu on the Oracle search results page. If you simply open the Within menu, select Documentation, and click the little magnifying glass again, you (should) get the expected results. Is it perfect? No, but at least it's an improvement over being completely broken. - Janice Critchlow, Information Architect, Systems Website Newsletter Facebook Twitter

    Read the article

  • How to structure an application that combines WCF and WPF

    - by CiaranG
    I'm in the process of learning how to use WCF (Windows Communication Foundation) to allow a client/server desktop application to communicate. The application's UI will be implemented using WPF, and we will probably use SQL Server for our database. What I'm struggling with, is understanding how to structure such an application. From what I've read, there are three components of a WCF application (which in the examples I've seen have existed as separate projects): A WCF service A WCF service host A WCF service client My question then, is - should these projects solely implement the functionality of sending/receiving data from the client/server? Would it make better sense this way? Would it make sense to create a separate WPF (Windows Presentation Foundation) project to implement the UI for the application? And so, when I need to send/receive data from the client/server, I could simply invoke the operations provided in the WCF projects that I have created? For anyone who has built similar applications previously, perhaps you could explain what worked best for you in terms of structuring your application? For example, if I create a user registration page. When the user clicks the 'Register' button, the client application will need to send the data to the server. In this case, could I just invoke the methods provided in the WCF projects to send the data? Also, what data structures worked best for you when sending/receiving data? My initial thought is sending/receiving XML containing the data. Is this an option that is easy to implement? I realise that answers to this question may well be a matter of opinion - unless there are specific best practices that I'm not aware of. Thank you
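    A compact sketch of the split described above: contracts and the service in one library, a host process, and the WPF client calling through a channel. Every type, address and project boundary here is a hypothetical placeholder, not something prescribed by WCF. Note also that with WCF you normally do not hand-build XML; you pass typed data contracts and the DataContractSerializer handles the wire format.

        // --- Shared contract assembly (referenced by service, host and client) ---
        using System;
        using System.Runtime.Serialization;
        using System.ServiceModel;

        [DataContract]
        public class RegistrationRequest
        {
            [DataMember] public string UserName { get; set; }
            [DataMember] public string Email { get; set; }
        }

        [ServiceContract]
        public interface IRegistrationService
        {
            [OperationContract]
            bool Register(RegistrationRequest request);
        }

        // --- Service implementation assembly ---
        public class RegistrationService : IRegistrationService
        {
            public bool Register(RegistrationRequest request)
            {
                // Business logic / database access would sit behind this call.
                return !string.IsNullOrEmpty(request.UserName);
            }
        }

        // --- Host (console app here; a Windows service or IIS hosts the same way) ---
        public static class Program
        {
            public static void Main()
            {
                using (var host = new ServiceHost(typeof(RegistrationService),
                           new Uri("http://localhost:8000/registration")))
                {
                    host.AddServiceEndpoint(typeof(IRegistrationService), new BasicHttpBinding(), "");
                    host.Open();
                    Console.WriteLine("Service running; press Enter to stop.");
                    Console.ReadLine();
                }
            }
        }

        // --- Inside the WPF client, e.g. the Register button handler ---
        // var factory = new ChannelFactory<IRegistrationService>(
        //     new BasicHttpBinding(), new EndpointAddress("http://localhost:8000/registration"));
        // IRegistrationService proxy = factory.CreateChannel();
        // bool ok = proxy.Register(new RegistrationRequest { UserName = "demo", Email = "demo@example.com" });

    The WPF project only references the contract assembly (or a generated service reference), so UI code never deals with serialization or transport details directly.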

    Read the article

  • Google Updates Picasa Web Albums; Emphasis on Sharing and Showcasing

    - by ETC
    Google has dusted off the Picasa Web interface and updated it with an emphasis on highlighting your photos and the photos of those you’re interested in. The new interface gives you speedy access to all the new photos you’ve uploaded and all the photos your friends, family, and others you’re following are sharing. Mixed in with that are popular photos from talented photographers across the service. It’s a nice change from the previously dull web interface and a definite step towards capturing some of the social power photo sharing site Flickr wields. Hit up the link below to read more. Showcasing Photos From People You Care About [The Official Google Photos Blog]

    Read the article

  • Is there a security risk in allowing people to set their DNS so their own subdomains can be routed to my server?

    - by DantheMan
    Let's say that I have a web application, built in Django and deployed with Nginx. Is it a good idea to offer a service that allows customers to request that a subdomain of theirs be pointed at it? My reasoning is this: if I don't allow it, then some companies won't want to access the service from http://mydjangoappmadeupname.com/bigcorporation/; they would rather access it through http://service.bigcorporation.com, which effectively masks that they are using an outside resource. Is there a significant risk that I am overlooking? Also, do you think it would be easier to set things up in Django to handle it, letting Nginx accept all domains and pass them through to Django, which would filter out whether they are allowed or not, or would it be better to update my Nginx config each time a client wanted this changed?
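    A minimal sketch of the "Nginx accepts everything, Django decides" variant mentioned above (the upstream address is a placeholder for your gunicorn/uwsgi backend):

        # Catch-all server block: any Host name pointed at this IP is proxied to Django
        server {
            listen 80 default_server;
            server_name _;

            location / {
                proxy_set_header Host $host;            # let Django see the original domain
                proxy_set_header X-Real-IP $remote_addr;
                proxy_pass http://127.0.0.1:8000;       # placeholder upstream
            }
        }

    On the Django side, request.get_host() then gives the customer domain, which can be checked against a table of approved domains; depending on the Django version, the ALLOWED_HOSTS setting also has to be configured to accept those host names.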

    Read the article

  • WCF Versioning, Naming and Endpoint URL

    - by Vinothkumar VJ
    I have a WCF service and a main class library (Lib1). Say I have a Save Profile service: WCF gets data (with a predefined data contract) from the client, passes it to the main class library, generates a response and sends it back to the client. WCF method: SaveProfile(ProfileDTO profile).

        Current version 1.0: ProfileDTO has UserName, Password, FirstName, DOB (string, yyyy-mm-dd), CreatedDate (string, yyyy-mm-dd)
        Next version (2.0): ProfileDTO has UserName, Password, FirstName, DOB (Unix timestamp), CreatedDate (Unix timestamp)
        Version 3.0: ProfileDTO has the same fields, with a change in the UserName and Password length validation

    In short, we have data contract and workflow changes between each version.

        1. How do I name the methods in the WCF service and the main class library?
        2. Is there a specific pattern I should follow to ease development and maintenance?
        3. Do I have to have different endpoints for different versions?

    In the above example I have a method named “SaveProfile”. Do I have to name the methods like “SaveProfile1.0”, “SaveProfile2.0”, etc.? If so, when there is no change between version 3.0 and 4.0 there will be maintenance difficulties. I’m looking for an approach that will help ease maintenance.
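    One widely used pattern, sketched here with every name and URI as a placeholder: keep the operation name stable and version the contracts through their XML namespaces, exposing each breaking version on its own endpoint address (for example .../profile/v1 and .../profile/v2) while non-breaking changes such as the 3.0 to 4.0 case reuse the existing one. This is a hedged illustration, not the only valid approach:

        using System.Runtime.Serialization;
        using System.ServiceModel;

        namespace Profiles.V1
        {
            [DataContract(Namespace = "http://schemas.example.com/profile/v1")]
            public class ProfileDTO
            {
                [DataMember] public string UserName { get; set; }
                [DataMember] public string Password { get; set; }
                [DataMember] public string FirstName { get; set; }
                [DataMember] public string DOB { get; set; }         // "yyyy-MM-dd"
                [DataMember] public string CreatedDate { get; set; } // "yyyy-MM-dd"
            }

            [ServiceContract(Namespace = "http://schemas.example.com/profile/v1")]
            public interface IProfileService
            {
                [OperationContract]
                void SaveProfile(ProfileDTO profile);   // the name stays "SaveProfile" in every version
            }
        }

        namespace Profiles.V2
        {
            [DataContract(Namespace = "http://schemas.example.com/profile/v2")]
            public class ProfileDTO
            {
                [DataMember] public string UserName { get; set; }
                [DataMember] public string Password { get; set; }
                [DataMember] public string FirstName { get; set; }
                [DataMember] public long DOB { get; set; }           // Unix timestamp
                [DataMember] public long CreatedDate { get; set; }   // Unix timestamp
            }

            [ServiceContract(Namespace = "http://schemas.example.com/profile/v2")]
            public interface IProfileService
            {
                [OperationContract]
                void SaveProfile(ProfileDTO profile);
            }
        }

    A single service class can implement both interfaces and be published on two endpoints in configuration, with each version mapping to the matching entry point in the main class library, so clients keep calling SaveProfile and only their endpoint address tells the versions apart.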

    Read the article

  • New cloud development workflow using Github, Cloud9ide and CloudFoundry.

    - by weng
    So times are changing towards cloud development/computing. I'm trying to work out the new "cloud" workflow based on the services I'm going to use: Github, Cloud9ide and CloudFoundry. Here is what is on my mind: Github acts as the central (main) repo, just like yesterday's local filesystem, and every service bases its work on this main repo. Workflow:

        1. Github: I create a new Github repo that serves as the main repo for the project.
        2. Cloud9ide: I open my Github repo and write my tests and implementation (BDD/TDD). When I'm ready I save (commit) it to the main repo on Github.
        3. X: A running instance of Jenkins detects that someone has committed, fetches the latest commit, builds, deploys, tests (yeti and/or selenium) and reports whether the tests passed or not. If not, I make another commit until all tests are passing.
        4. X: I run the CloudFoundry commands to push the main Github repo to CloudFoundry's server and it will deploy my app automatically.

    What I'm still confused about is where this X environment will be. On a local server where I have to install Jenkins? Or could I install it on Cloud9ide (when Java is supported), or will it be on another cloud service? Also, that X environment has to be able to fetch (clone) the Github repo and run the build scripts. And since the concept of Cloud9ide is very new and has no real predecessors, I really wonder what the workflow will look like. We all know Github's workflow. We now know CloudFoundry's workflow (deploy/scale with a RESTful API/command line tool). But how Cloud9ide will operate is still somewhat unclear to me. Someone at Cloud9ide mentioned that there will be buttons like deploy so I can deploy with one click, but I guess that will depend on what services that deploy process hooks into, etc. Could someone shed light on this cloud workflow topic and fill in the gaps? Thanks.

    Read the article

  • Row Number Transformation

The Row Number Transformation calculates a row number for each row, and adds this as a new output column to the data flow. The column number is a sequential number, based on a seed value. Each row receives the next number in the sequence, based on the defined increment value. The final row number can be stored in a variable for later analysis, and can be used as part of a process to validate the integrity of the data movement. The Row Number transform has a variety of uses, such as generating surrogate keys, or as the basis for a data partitioning scheme when combined with the Conditional Split transformation.

Properties

    Seed (Int32) - The first row number or seed value.
    Increment (Int32) - The value added to the previous row number to make the next row number.
    OutputVariable (String) - The name of the variable into which the final row number is written post execution. (Optional)

The three properties have been configured to support expressions, or they can be set directly in the normal manner. Expressions on components are only visible on the hosting Data Flow task, not at the individual component level. Sometimes the data type of the property is incorrectly set when the properties are created; see the Troubleshooting section below for details on how to fix this.

Installation

The component is provided as an MSI file which you can download and run to install it. This simply places the files on disk in the correct locations and also installs the assemblies in the Global Assembly Cache as per Microsoft's recommendations. You may need to restart the SQL Server Integration Services service, as this caches information about what components are installed, as well as restarting any open instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages.

For 2005/2008 only - finally you will have to add the transformation to the Visual Studio toolbox manually. Right-click the toolbox, and select Choose Items.... Select the SSIS Data Flow Items tab, and then check the Row Number transformation in the Choose Toolbox Items window. This process has been described in detail in the related FAQ entry for How do I install a task or transform component?

We recommend you follow best practice and apply the current Microsoft SQL Server Service Pack to your SQL Server servers and workstations, and this component requires a minimum of SQL Server 2005 Service Pack 1.

Downloads

The Row Number Transformation is available for SQL Server 2005, SQL Server 2008 (includes R2) and SQL Server 2012. Please choose the version to match your SQL Server version, or you can install multiple versions and use them side by side if you have more than one version of SQL Server installed.

    Row Number Transformation for SQL Server 2005
    Row Number Transformation for SQL Server 2008
    Row Number Transformation for SQL Server 2012

Version History

    SQL Server 2012
    Version 3.0.0.6 - SQL Server 2012 release. Includes upgrade support for both 2005 and 2008 packages to 2012. (5 Jun 2012)

    SQL Server 2008
    Version 2.0.0.5 - SQL Server 2008 release. (15 Oct 2008)

    SQL Server 2005
    Version 1.2.0.34 - Updated installer. (25 Jun 2008)
    Version 1.2.0.7 - SQL Server 2005 RTM Refresh. SP1 Compatibility Testing. Added the ability to reuse an existing column to hold the generated row number, as an alternative to the default of adding a new column to the output. (18 Jun 2006)
    Version 1.0.0.0 - Public Release for SQL Server 2005 IDW 15 June CTP (29 Aug 2005)

Screenshot

Code Sample

The following code sample demonstrates using the Data Generator Source and Row Number Transformation programmatically in a very simple package.

    Package package = new Package();
    package.Name = "Data Generator & Row Number";

    // Add the Data Flow Task
    Executable taskExecutable = package.Executables.Add("STOCK:PipelineTask");

    // Get the task host wrapper, and the Data Flow task
    TaskHost taskHost = taskExecutable as TaskHost;
    MainPipe dataFlowTask = (MainPipe)taskHost.InnerObject;

    // Add Data Generator Source
    IDTSComponentMetaData100 componentSource = dataFlowTask.ComponentMetaDataCollection.New();
    componentSource.Name = "Data Generator";
    componentSource.ComponentClassID = "Konesans.Dts.Pipeline.DataGenerator.DataGenerator, Konesans.Dts.Pipeline.DataGenerator, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b2ab4a111192992b";
    CManagedComponentWrapper instanceSource = componentSource.Instantiate();
    instanceSource.ProvideComponentProperties();
    instanceSource.SetComponentProperty("RowCount", 10000);

    // Add Row Number Tx
    IDTSComponentMetaData100 componentRowNumber = dataFlowTask.ComponentMetaDataCollection.New();
    componentRowNumber.Name = "FlatFileDestination";
    componentRowNumber.ComponentClassID = "Konesans.Dts.Pipeline.RowNumberTransform.RowNumberTransform, Konesans.Dts.Pipeline.RowNumberTransform, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b2ab4a111192992b";
    CManagedComponentWrapper instanceRowNumber = componentRowNumber.Instantiate();
    instanceRowNumber.ProvideComponentProperties();
    instanceRowNumber.SetComponentProperty("Increment", 10);

    // Connect the two components together
    IDTSPath100 path = dataFlowTask.PathCollection.New();
    path.AttachPathAndPropagateNotifications(componentSource.OutputCollection[0], componentRowNumber.InputCollection[0]);

    #if DEBUG
    // Save package to disk, DEBUG only
    new Application().SaveToXml(String.Format(@"C:\Temp\{0}.dtsx", package.Name), package, null);
    #endif

    package.Execute();

    foreach (DtsError error in package.Errors)
    {
        Console.WriteLine("ErrorCode : {0}", error.ErrorCode);
        Console.WriteLine(" SubComponent : {0}", error.SubComponent);
        Console.WriteLine(" Description : {0}", error.Description);
    }

    package.Dispose();

Troubleshooting

Make sure you have downloaded the version that matches your version of SQL Server. We offer separate downloads for SQL Server 2005, SQL Server 2008 and SQL Server 2012.

If you get an error when you try and use the component along the lines of "The component could not be added to the Data Flow task. Please verify that this component is properly installed. ... The data flow object "Konesans ..." is not installed correctly on this computer", this usually indicates that the internal cache of SSIS components needs to be updated. This is held by the SSIS service, so you need to restart the SQL Server Integration Services service. You can do this from the Services applet in Control Panel or Administrative Tools in Windows. You can also restart the computer if you prefer. You may also need to restart any current instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages.

Once installation is complete you need to manually add the task to the toolbox before you will see it and be able to add it to packages - How do I install a task or transform component?

Please also make sure you have installed a minimum of SP1 for SQL 2005. The IDtsPipelineEnvironmentService was added in SQL Server 2005 Service Pack 1 (SP1) (see http://support.microsoft.com/kb/916940). If you get an error "Could not load type 'Microsoft.SqlServer.Dts.Design.IDtsPipelineEnvironmentService' from assembly 'Microsoft.SqlServer.Dts.Design, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'." when trying to open the user interface, it implies that your development machine has not had SP1 applied.

Very occasionally we get a problem to do with the properties not being created with the correct data type. Since there is no way to programmatically define the data type of a pipeline component property, it can only infer it. Whilst we set an integer value as we create the property, sometimes SSIS decides to define it as a decimal. This is often highlighted when you use a property expression against the property and get an error similar to "Cannot convert System.Int32 to System.Decimal". Unfortunately this is beyond our control and there appears to be no pattern as to when this happens. If you do have more information we would be happy to hear it. To fix this issue you can manually edit the package file. In Visual Studio right-click the package file in the Solution Explorer and select View Code, which will open the package as raw XML. You can now search for the properties by name or the component name. You can then change the incorrect property data types highlighted below from Decimal to Int32.

    <component id="37" name="Row Number Transformation" componentClassID="{BF01D463-7089-41EE-8F05-0A6DC17CE633}" … >
        <properties>
            <property id="38" name="UserComponentTypeName" …>
            <property id="41" name="Seed" dataType="System.Int32" ...>10</property>
            <property id="42" name="Increment" dataType="System.Decimal" ...>10</property>
            ...

If you are still having issues then contact us, but please provide as much detail as possible about the error, as well as which version of the task you are using and details of the SSIS tools installed.

    Read the article

  • Email notification and mail server

    - by Jerr Wu
    I am building a web application with email notifications just like Facebook, which will be hosted at http://www.linode.com/. When user A comments on a post, the poster will get an email notification from '[email protected]' with the comment message written by user A. (Not spam.) I really like Google Apps, but they have a sending limit of 2000 messages per day, which does not suit my case because I cannot have sending limits; there will be many email notifications. http://support.google.com/a/bin/answer.py?hl=en&answer=166852 I also need company email accounts for team members to use, for which I prefer Google Apps. My web application will be hosted on Linode, and I am considering "Amazon Simple Notification Service" for the email notifications. My questions are: Is there another email service provider you would recommend that suits my case? Can I bind company email accounts (e.g. [email protected]) to Google Apps and bind [email protected] to another email service provider?

    Read the article
