Search Results

Search found 27858 results on 1115 pages for 'managed service provider'.


  • 12.04 - How do I Fix Grub Error 15 on New Dual Boot Install

    - by Garth
    I just installed 12.04 to dual boot (separate partitions) with an existing Win 7 install. Upon reboot after the install, things freeze after GRUB stage 1.5 with a GRUB Error 15 message. Is there any easy way to fix this? (I am posting this from my second computer.) UPDATE: I managed to boot into both 12.04 and Win 7 using the BIOS boot menu. Selecting the disk with the Win 7 'C' partition resulted in the same error message. I rebooted and tried the disk with the Ubuntu partitions: the GRUB menu loaded and I managed to boot 12.04. I then rebooted, used the BIOS menu again, and managed to boot Win 7. So I have access to my computer again (through the BIOS), but this has been a pretty crappy install experience. I used the final-release 12.04 Ubuntu install disk, reformatted all Linux partitions, and expected a simple clean install. Other than specifying the Ubuntu partitions, I did a basic install of 12.04, and I didn't do anything unusual that should have caused this error. I have no idea why my install resulted in a GRUB Error 15. CLOSED - Answered my own question: I burned a Rescatux disk and used it to recover GRUB 2 (the simplest and easiest way for me): http://www.supergrubdisk.org/category/download/rescatuxdownloads/ Garth

    Read the article

  • Best ways to collect location-based user input

    - by user359650
    I'm working on a website where users will be able to register and provide information about their location. In order to prevent users from inputting incorrect data, we don't want users to provide free-text information but instead choose from predefined values as much as possible. We believe there are two ways of providing those values: use an API to an external service provider, or create your own local database.
    APIs
    Some resources:
    - https://developers.facebook.com/docs/reference/ads-api/get-autocomplete-data/
    - http://developer.yahoo.com/geo/geoplanet/
    Pros:
    - accuracy and completeness of data
    - no maintenance related to updates of the data, as this is taken care of by the API provider
    - easier/faster to get started (no need to create a local database, just implement the API)
    Cons:
    - degradation of performance when there are availability issues with the external API
    - outages due to changes to the external API (until your code is updated to reflect those changes)
    - lock-in with the external provider
    Local database
    Some resources:
    - http://developer.yahoo.com/geo/geoplanet/data/
    - http://www.maxmind.com/app/geolitecity
    - http://download.geonames.org/export/dump/
    Pros:
    - no external dependency: improved stability and performance
    Cons:
    - more work to get started (you need to create the database and the code to interact with it)
    - risk of inaccurate/incomplete data, either initially or over time
    - more maintenance work to keep the database up to date
    Assuming the depth of information requested from users is as follows:
    - country: interested in the value; also used to narrow down the list of regions
    - region (state in the US, county in the UK...): not interested in the value itself, only used to narrow down the list of cities
    - city: interested in the value (which can be used to work out the related region should we need regional statistics)
    - address: interested in the value, although OPTIONAL
    Which option (API or local database) would you choose? What tips would you give for the implementation? What other resources can you share?
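    To make the local-database option concrete, here is a minimal C# sketch of a city autocomplete lookup against a table imported from one of the dumps above. This is only an illustration: the cities table, its column names, and the connection string are invented for this sketch, not part of the original question.

    using System.Collections.Generic;
    using System.Data.SqlClient;

    public static class CityLookup
    {
        // Hypothetical schema: a "cities" table imported from the GeoNames dump,
        // with columns CountryCode and Name (both names are assumptions).
        public static List<string> SuggestCities(string connectionString, string countryCode, string prefix)
        {
            var results = new List<string>();
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT TOP 20 Name FROM cities WHERE CountryCode = @country AND Name LIKE @prefix ORDER BY Name",
                connection))
            {
                // Parameterized to avoid SQL injection from user-typed prefixes.
                command.Parameters.AddWithValue("@country", countryCode);
                command.Parameters.AddWithValue("@prefix", prefix + "%");
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        results.Add(reader.GetString(0));
                    }
                }
            }
            return results;
        }
    }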

    Read the article

  • Working with QuickBooks using LINQPad

    - by dataintegration
    The RSSBus ADO.NET Providers can be used from many applications and development environments. In this article, we show how to use LINQPad to connect to QuickBooks using the RSSBus ADO.NET Provider for QuickBooks. Although this example uses the QuickBooks Data Provider, the same process applies to any of our ADO.NET Providers.
    Create the Data Model
    Step 1: Download and install both the Data Provider from RSSBus and LINQPad (available at www.linqpad.net).
    Step 2: Create a new project in Visual Studio and create a data model for it using the ADO.NET Entity Data Model wizard.
    Step 3: Create a new connection by clicking "New Connection", specify the connection string options, and click Next.
    Step 4: Select the desired tables and views and click Finish to create the data model.
    Step 5: Right-click on the entity diagram and select 'Add Code Generation Item'. Choose the 'ADO.NET DbContext Generator'.
    Step 6: Now build the project. The generated files can be used to create a QuickBooks connection in LINQPad.
    Create the Connection to QuickBooks in LINQPad
    Step 7: Open LINQPad and click 'Add New Connection'.
    Step 8: Choose 'Entity Framework DbContext POCO'.
    Step 9: Choose the data model assembly ('.dll') created by Visual Studio as the 'Path to Custom Assembly'. Choose the name of the custom DbContext, the path to the config file, and assign a name to the connection that will allow you to recognize its purpose.
    Step 10: Congratulations! Now you have a connection to QuickBooks, and you can query data through LINQPad.
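    Once the connection exists, queries can be written directly against the generated DbContext. Below is a minimal sketch of a LINQPad "C# Statements" query; the Invoices entity set and its columns are placeholders that depend on which tables you selected in Step 4, so adjust the names to your own data model.

    // Runs inside LINQPad against the QuickBooks connection created above.
    var overdueInvoices =
        from invoice in Invoices                  // entity set name is an assumption
        where invoice.IsPaid == false
        orderby invoice.DueDate
        select new { invoice.RefNumber, invoice.CustomerName, invoice.Balance };

    overdueInvoices.Take(10).Dump();              // Dump() is LINQPad's built-in result grid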

    Read the article

  • Hijacking ASP.NET Sessions

    - by Ricardo Peres
    So, you want to be able to access other users' session state from the session id, right? Well, I don't know if you should, but you definitely can do that! Here is an extension method for that purpose. It uses a bit of reflection, which means it may not work with future versions of .NET (I tested it with .NET 4.0/4.5).

    using System;
    using System.Reflection;
    using System.Web;
    using System.Web.SessionState;

    public static class HttpApplicationExtensions
    {
        // Grab the private "_store" field of SessionStateModule via reflection.
        private static readonly FieldInfo storeField = typeof(SessionStateModule).GetField("_store", BindingFlags.NonPublic | BindingFlags.Instance);

        public static ISessionStateItemCollection GetSessionById(this HttpApplication app, String sessionId)
        {
            var module = app.Modules["Session"] as SessionStateModule;

            if (module == null)
            {
                return (null);
            }

            // The session state store provider holds the items for all sessions.
            var provider = storeField.GetValue(module) as SessionStateStoreProviderBase;

            if (provider == null)
            {
                return (null);
            }

            Boolean locked;
            TimeSpan lockAge;
            Object lockId;
            SessionStateActions actions;

            var data = provider.GetItem(HttpContext.Current, sessionId.Trim(), out locked, out lockAge, out lockId, out actions);

            if (data == null)
            {
                return (null);
            }

            return (data.Items);
        }
    }

    As you can see, it extends the HttpApplication class; that is because we need to access the modules collection to get at the Session module. Use with care!
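    As a usage sketch (not from the original post), the extension can be called from Global.asax, since that class derives from HttpApplication. The session id and item key below are made up for illustration.

    using System;
    using System.Web.SessionState;

    public class Global : System.Web.HttpApplication
    {
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            // A hypothetical session id, e.g. read from a request parameter.
            string otherSessionId = "2pqxtewhviqkw145kufdoe55";

            ISessionStateItemCollection items = this.GetSessionById(otherSessionId);

            if (items != null)
            {
                object cart = items["ShoppingCart"];   // invented key; reads a value the other session stored
                // ... use with care, as the author says!
            }
        }
    }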

    Read the article

  • Removing spam external links after pharma hack?

    - by Beatchef
    Back in February my work's site was attacked by a pharma hack at the shared hosting end. I managed to find the placed file and the reference to run it in one of our files. I deleted this file, deleted and redownloaded all of the plugins and themes, and reinstalled WordPress. However, I could never find the database entries, no matter what I read up on: searching for known entries, for drug names backwards, and so on. On the Google and Bing end I have managed to deny and delete the entries and cache of most if not all of the bad links that the hack managed to instantly SEO to death (why don't these guys work legit and make more money?). However, the one thing remaining is external links on the homepage that are invisible except when the site is viewed in Google's cache or scanned with unmaskparasites.com (which says that the external links are safe even though they're obviously not!). http://www.UnmaskParasites.com/security-report/?page=kmcharityteam.co.uk All sorts of website scans say there's nothing wrong with it, and I can't find the source of the links in the header or footer or anywhere in the theme. I've searched for the links in the database but no use there either, and they change every day, so really I'd have to be looking for a generator? Does anybody have any advice or a solution for removing these links? Thanks!

    Read the article

  • Notification framework for object lifecycle

    - by rlandster
    I am looking for an application, framework, or library that would help us with "object life-cycle management". There are many things that are created for users, departments, and services that, all too often, are left unmanaged. Some examples:
    - user accounts
    - groups
    - SSL certificates
    - access rights
    - databases
    - software license provisioning
    - storage
    - list-serve accounts
    These objects are created and managed by a wide variety of applications and systems. Typically, a user (person) requests (either explicitly or implicitly) one of these objects. A centralized management tool would help us manage such administration chores as:
    - What objects does user X currently own/manage?
    - Move the ownership of object P to user X; move all objects owned by user X (who has just been fired) to user Y.
    - For all objects of type T that have expired, be sure the objects have been disabled or deleted by their provider.
    - How many active (expired, about-to-expire) objects of type P are there?
    - Send periodic notifications to all users who own active objects of type P reminding them of what they own.
    - There is a security alert for objects of type P; send a notification to all users who own these types of objects to take a specific remedial action.
    - Delete or disable a set of objects based on expiration (or some other criteria).
    These objects are directly managed through their own applications (Active Directory, MySQL, file systems, etc.) and may even have their own notification systems, but I want to centralize this into an "object management system". The OMS should allow:
    - association with an external identity provider that defines who the users and groups are (e.g., LDAP, Active Directory)
    - creation of objects
    - association of an object with a specific user and/or group
    - association with an expiration date
    - creation of flexible reporting, including letting users know what objects they currently own and their expiration dates
    - integration with an external object "provider" via a plug-in
    We could write something from scratch, but I am hoping there is something already out there that will help, either an entire application or a set of libraries that provide much of what is needed. Any ideas?
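    For illustration only, here is a rough C# sketch of the kind of core model such an OMS might expose; every type and member name in it is invented, purely to make the requirements above concrete.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public enum ManagedObjectType { UserAccount, Group, SslCertificate, Database, License, Storage }

    public class ManagedObject
    {
        public string Name { get; set; }
        public ManagedObjectType Type { get; set; }
        public string Owner { get; set; }        // identity defined by LDAP/Active Directory
        public DateTime? ExpiresOn { get; set; }
    }

    public static class ObjectRegistry
    {
        // "Move all objects owned by user X (who has just been fired) to user Y."
        public static void Reassign(IEnumerable<ManagedObject> objects, string fromUser, string toUser)
        {
            foreach (var o in objects.Where(o => o.Owner == fromUser))
                o.Owner = toUser;
        }

        // "How many active objects of type P are there?"
        public static int CountActive(IEnumerable<ManagedObject> objects, ManagedObjectType type)
        {
            return objects.Count(o => o.Type == type &&
                                      (o.ExpiresOn == null || o.ExpiresOn > DateTime.UtcNow));
        }
    }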

    Read the article

  • Convert collections of enums to collection of strings and vice versa

    - by Michael Freidgeim
    Recently I needed to convert collections of strings that represent enum names to collections of enums, and the opposite: to convert collections of enums to collections of strings. I didn't find standard LINQ extensions. However, in our big collection of helper extensions I found what I needed, just with different names:

    /// <summary>
    /// Safe conversion; ignores any unexpected strings.
    /// Consider naming this as a Convert extension.
    /// </summary>
    /// <typeparam name="EnumType"></typeparam>
    /// <param name="stringsList"></param>
    /// <returns></returns>
    public static List<EnumType> StringsListAsEnumList<EnumType>(this List<string> stringsList)
        where EnumType : struct, IComparable, IConvertible, IFormattable
    {
        List<EnumType> enumsList = new List<EnumType>();
        foreach (string sProvider in stringsList)
        {
            EnumType provider;
            if (EnumHelper.TryParse<EnumType>(sProvider, out provider))
            {
                enumsList.Add(provider);
            }
        }
        return enumsList;
    }

    /// <summary>
    /// Convert each element of a collection to a string.
    /// </summary>
    /// <typeparam name="T"></typeparam>
    /// <param name="objects"></param>
    /// <returns></returns>
    public static IEnumerable<string> ToStrings<T>(this IEnumerable<T> objects)
    {
        // from http://www.c-sharpcorner.com/Blogs/997/using-linq-to-convert-an-array-from-one-type-to-another.aspx
        return objects.Select(en => en.ToString());
    }
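    A quick usage sketch, using the built-in DayOfWeek enum purely as an example (note that EnumHelper.TryParse is the author's own helper; on .NET 4 the built-in Enum.TryParse could play the same role):

    var names = new List<string> { "Monday", "Tuesday", "NotADay" };
    List<DayOfWeek> days = names.StringsListAsEnumList<DayOfWeek>();   // "NotADay" is silently skipped
    IEnumerable<string> backToStrings = days.ToStrings();              // yields "Monday", "Tuesday"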

    Read the article

  • SharePoint 2010 – SQL Server has an unsupported version 10.0.2531.0

    - by Jeff Widmer
    I am trying to perform a database attach upgrade to SharePoint Foundation 2010. At this point I am trying to attach the content database to a Web application by using Windows PowerShell:
    Mount-SPContentDatabase -Name <DatabaseName> -DatabaseServer <ServerName> -WebApplication <URL> [-Updateuserexperience]
    I am following the directions from this TechNet article: Attach databases and upgrade to SharePoint Foundation 2010. When I go to mount the content database I am receiving this error:
    Mount-SPContentDatabase : Could not connect to [DATABASE_SERVER] using integrated security: SQL server at [DATABASE_SERVER] has an unsupported version 10.0.2531.0. Please refer to "http://go.microsoft.com/fwlink/?LinkId=165761" for information on the minimum required SQL Server versions and how to download them.
    At first this did not make sense, because the default SharePoint Foundation 2010 website was running just fine. But then I realized that the default SharePoint Foundation site runs off of SQL Server Express and that I had just installed SQL Server Web Edition (since the database is greater than 4GB) and restored the database to this version of SQL Server. Checking the documentation link above, I see that SharePoint Server 2010 requires a 64-bit edition of SQL Server, with the minimum required SQL Server versions as follows:
    - SQL Server 2008 Express Edition Service Pack 1, version number 10.0.2531
    - SQL Server 2005 Service Pack 3 cumulative update package 3, version number 9.00.4220.00
    - SQL Server 2008 Service Pack 1 cumulative update package 2, version number 10.00.2714.00
    The version of SQL Server 2008 Web Edition with Service Pack 1 (the version I installed on this machine) is 10.0.2531.0.
    SELECT @@VERSION: Microsoft SQL Server 2008 (SP1) - 10.0.2531.0 (X64) Mar 29 2009 10:11:52 Copyright (c) 1988-2008 Microsoft Corporation Web Edition (64-bit) on Windows NT 6.1 <X64> (Build 7600: ) (VM)
    But I had to read the article several times, since the minimum version number for SQL Server Express is 10.0.2531.0. At first I thought I was good with the version of SQL Server 2008 Web that I had installed, also 10.0.2531.0. But then I read further to see that there is a cumulative update (hotfix) for SQL Server 2008 SP1 (NOT the Express edition) that is required for SharePoint 2010 and will bump the version number to 10.0.2714.00.
    So the solution was to install Cumulative update package 2 for SQL Server 2008 Service Pack 1 on my SQL Server 2008 Web Edition to allow SharePoint 2010 to work with SQL Server 2008 (other than the SQL Server 2008 Express version).
    SELECT @@VERSION (after installing Cumulative update package 2): Microsoft SQL Server 2008 (SP1) - 10.0.2714.0 (X64) May 14 2009 16:08:52 Copyright (c) 1988-2008 Microsoft Corporation Web Edition (64-bit) on Windows NT 6.1 <X64> (Build 7600: ) (VM)

    Read the article

  • E: Sub-process /usr/bin/dpkg returned an error code (1) seems to be choking on kde-runtime-data version issue

    - by BMT
    12.04 LTS, on a Dell Mini 10. The install was stable until about a week ago. I updated about once a week, sometimes more often. Several days ago, I booted up and the system was no longer working correctly. All these symptoms occurred simultaneously:
    - Cannot run (exits on opening, every time): Update Manager, Software Center, Ubuntu One, LibreOffice.
    - Vinagre autostarts on boot, with no explanation; it is not set to start up with Ubuntu.
    - Using apt-get to fix the install results in the following:
    maura@pandora:~$ sudo apt-get -f install
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    Correcting dependencies... Done
    The following package was automatically installed and is no longer required:
      libtelepathy-farstream2
    Use 'apt-get autoremove' to remove them.
    The following extra packages will be installed:
      gwibber gwibber-service kde-runtime-data software-center
    Suggested packages:
      gwibber-service-flickr gwibber-service-digg gwibber-service-statusnet gwibber-service-foursquare gwibber-service-friendfeed gwibber-service-pingfm gwibber-service-qaiku unity-lens-gwibber
    The following packages will be upgraded:
      gwibber gwibber-service kde-runtime-data software-center
    4 upgraded, 0 newly installed, 0 to remove and 39 not upgraded.
    20 not fully installed or removed.
    Need to get 0 B/5,682 kB of archives.
    After this operation, 177 kB of additional disk space will be used.
    Do you want to continue [Y/n]?
    debconf: Perl may be unconfigured (Can't locate Scalar/Util.pm in @INC (@INC contains: /etc/perl /usr/local/lib/perl/5.14.2 /usr/local/share/perl/5.14.2 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.14 /usr/share/perl/5.14 /usr/local/lib/site_perl .) at /usr/lib/perl/5.14/Hash/Util.pm line 9. BEGIN failed--compilation aborted at /usr/lib/perl/5.14/Hash/Util.pm line 9. Compilation failed in require at /usr/share/perl/5.14/fields.pm line 122. Compilation failed in require at /usr/share/perl5/Debconf/Log.pm line 10. Compilation failed in require at (eval 1) line 4. BEGIN failed--compilation aborted at (eval 1) line 4. ) -- aborting
    (Reading database ... 242672 files and directories currently installed.)
    Preparing to replace gwibber 3.4.1-0ubuntu1 (using .../gwibber_3.4.2-0ubuntu1_i386.deb) ...
    Could not find platform dependent libraries <exec_prefix>
    Consider setting $PYTHONHOME to <prefix>[:<exec_prefix>]
    Traceback (most recent call last):
      File "/usr/bin/pyclean", line 25, in <module>
        import logging
    ImportError: No module named logging
    Error in sys.excepthook:
    Traceback (most recent call last):
      File "/usr/lib/python2.7/dist-packages/apport_python_hook.py", line 64, in apport_excepthook
        from apport.fileutils import likely_packaged, get_recent_crashes
      File "/usr/lib/python2.7/dist-packages/apport/__init__.py", line 1, in <module>
        from apport.report import Report
      File "/usr/lib/python2.7/dist-packages/apport/report.py", line 16, in <module>
        from xml.parsers.expat import ExpatError
      File "/usr/lib/python2.7/xml/parsers/expat.py", line 4, in <module>
        from pyexpat import *
    ImportError: No module named pyexpat
    Original exception was:
    Traceback (most recent call last):
      File "/usr/bin/pyclean", line 25, in <module>
        import logging
    ImportError: No module named logging
    dpkg: warning: subprocess old pre-removal script returned error exit status 1
    dpkg - trying script from the new package instead ...
    [identical pyclean/pycompile "ImportError: No module named logging" tracebacks repeat here for each affected package]
    dpkg: error processing /var/cache/apt/archives/gwibber_3.4.2-0ubuntu1_i386.deb (--unpack): subprocess new pre-removal script returned error exit status 1
    dpkg: error while cleaning up: subprocess installed post-installation script returned error exit status 1
    Preparing to replace kde-runtime-data 4:4.8.3-0ubuntu0.1 (using .../kde-runtime-data_4%3a4.8.4-0ubuntu0.1_all.deb) ...
    Unpacking replacement kde-runtime-data ...
    dpkg: error processing /var/cache/apt/archives/kde-runtime-data_4%3a4.8.4-0ubuntu0.1_all.deb (--unpack):
     trying to overwrite '/usr/share/sounds', which is also in package sound-theme-freedesktop 0.7.pristine-2
    dpkg-deb (subprocess): subprocess data was killed by signal (Broken pipe)
    dpkg-deb: error: subprocess <decompress> returned error exit status 2
    [the same pre-removal/post-installation failures repeat for gwibber-service, python-crypto, software-center and xdiagnose, along with "No apport report written because MaxReports is reached already"]
    Errors were encountered while processing:
     /var/cache/apt/archives/gwibber_3.4.2-0ubuntu1_i386.deb
     /var/cache/apt/archives/gwibber-service_3.4.2-0ubuntu1_all.deb
     /var/cache/apt/archives/kde-runtime-data_4%3a4.8.4-0ubuntu0.1_all.deb
     /var/cache/apt/archives/python-crypto_2.4.1-1_i386.deb
     /var/cache/apt/archives/software-center_5.2.4_all.deb
     /var/cache/apt/archives/xdiagnose_2.5_all.deb
    E: Sub-process /usr/bin/dpkg returned an error code (1)
    maura@pandora:~$

    Read the article

  • Claims-based Identity Terminology

    - by kaleidoscope
    There are several terms commonly used to describe claims-based identity, and it is important to clearly define these terms.
    - Identity: In terms of Access Control, the term identity will be used to refer to a set of claims made by a trusted issuer about the user.
    - Claim: You can think of a claim as a bit of identity information, such as name, email address, age, and so on. The more claims your service receives, the more you'll know about the user who is making the request.
    - Security Token: The user delivers a set of claims to your service piggybacked along with his or her request. In a REST Web service, these claims are carried in the Authorization header of the HTTP(S) request. Regardless of how they arrive, claims must somehow be serialized, and this is managed by security tokens. A security token is a serialized set of claims that is signed by the issuing authority.
    - Issuing Authority & Identity Provider: An issuing authority has two main features. The first and most obvious is that it issues security tokens. The second feature is the logic that determines which claims to issue. This is based on the user's identity, the resource to which the request applies, and possibly other contextual data such as time of day. This type of logic is often referred to as policy[1]. There are many issuing authorities, including Windows Live ID, ADFS, PingFederate from Ping Identity (a product that exposes user identities from the Java world), Facebook Connect, and more. Their job is to validate some credential from the user and issue a token with an identifier for the user's account and possibly other identity attributes. These types of authorities are called identity providers (sometimes shortened to IdP). It's ultimately their responsibility to answer the question, "who are you?" and ensure that the user knows his or her password, is in possession of a smart card, knows the PIN code, has a matching retinal scan, and so on.
    - Security Token Service (STS): A security token service (STS) is a technical term for the Web interface in an issuing authority that allows clients to request and receive a security token according to interoperable protocols that are discussed in the following section. This term comes from the WS-Trust standard, and is often used in the literature to refer to an issuing authority. From a developer's point of view, the STS indicates the URL to use to request a token from an issuer.
    For more details please refer to the link http://www.microsoft.com/windowsazure/developers/dotnetservices/
    Geeta, G
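    To make the terminology concrete, here is a small self-contained C# sketch of a relying party inspecting claims. It uses the System.Security.Claims API that ships with later versions of .NET (4.5 and up); in the WIF era this post describes, the equivalent types live in Microsoft.IdentityModel.Claims. All claim values and the custom claim type URI are invented.

    using System;
    using System.Security.Claims;

    class ClaimsDemo
    {
        static void Main()
        {
            // The set of claims an issuing authority might assert about a user.
            var claims = new[]
            {
                new Claim(ClaimTypes.Name, "alice"),
                new Claim(ClaimTypes.Email, "alice@example.com"),
                new Claim("http://schemas.example.org/claims/age", "30")   // custom claim type
            };

            var identity = new ClaimsIdentity(claims, "Federation");
            var principal = new ClaimsPrincipal(identity);

            // The service reads identity information from the claims instead of
            // looking the user up itself.
            foreach (Claim claim in principal.Claims)
                Console.WriteLine("{0} = {1}", claim.Type, claim.Value);
        }
    }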

    Read the article

  • Asp.net session on browser close

    - by budugu
    Note: Cross-posted from Vijay Kodali's blog.
    How to capture the logoff time when the user closes the browser? Or how to end the user's session when the browser is closed? These are some of the frequently asked questions in ASP.NET forums. In this post I'll show you how to do this when you're building an ASP.NET web application.
    Before we start, one fact: there is no foolproof technique to catch the browser close event 100% of the time. The trouble lies in the stateless nature of HTTP. The Web server is out of the picture as soon as it finishes sending the page content to the client. After that, all you can rely on is a client-side script. Unfortunately, there is no reliable client-side event for browser close.
    Solution: The first thing you need to do is create the web service. I've added a web service and named it AsynchronousSave.asmx. Make this web service accessible from script by qualifying the class with the ScriptServiceAttribute attribute. Add a method (SaveLogOffTime) marked with the [WebMethod] attribute. This method simply accepts a UserId as a string variable and writes that value and the logoff time to a text file, but you can pass as many variables as required. You can then use this information for many purposes. To end the user's session, you can just call Session.Abandon() in the above web method. To enable the web service to be called from the page's client-side code, add a script manager to the page. Here I am adding it to the SessionTest.aspx page. When the user closes the browser, the onbeforeunload event fires on the client side. Our final step is attaching a JavaScript function to that event, which makes the web service call. The code is simple but effective.
    My code: HTML (SessionTest.aspx) and C# (SessionTest.aspx.cs)
    That's it. Run the application and, after the browser closes, open the text file to see the logoff time. The above code works well in IE 7/8. If you have any questions, leave a comment.
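    The original code listings did not survive the cross-post, so here is a rough sketch of the web method described above; the log file path is invented, and a real version would need error handling and file locking.

    using System;

    [System.Web.Script.Services.ScriptService]
    public class AsynchronousSave : System.Web.Services.WebService
    {
        [System.Web.Services.WebMethod(EnableSession = true)]
        public void SaveLogOffTime(string userId)
        {
            // Write the user id and logoff time to a text file, as the post describes.
            string line = String.Format("{0} logged off at {1}", userId, DateTime.Now);
            System.IO.File.AppendAllText(@"C:\Logs\logoff.txt", line + Environment.NewLine);

            // End the user's session, as suggested above.
            Session.Abandon();
        }
    }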

    Read the article

  • Sesame update du jour: SL 4, OOB, Azure, and proxy support

    - by Fabrice Marguerie
    I've just published a new version of Sesame Data Browser. Here's what's new this time:
    - Upgraded to Silverlight 4
    - Can run out-of-browser (OOB), with elevated permissions. This gives you an icon on your desktop and enables new scenarios. Note: the application is unsigned for the moment.
    - Support for Windows Azure authentication
    - Support for SQL Azure authentication
    - If you are behind a proxy that requires authentication, just give Sesame a new try after clicking on "If you are behind a proxy that requires authentication, please click here"
    - An icon and a button for closing connections are now displayed on connection tabs
    - Some less visible improvements
    Here is the connection view with anonymous access:
    If you want to access Windows Azure tables as OData, all you have to do is use your table storage endpoint as the URL, and provide your access key. A Windows Azure table storage address looks like this: http://<your account>.table.core.windows.net/
    If you want to browse your SQL Azure databases with Sesame, you have to enable OData support for them at https://www.sqlazurelabs.com/ConfigOData.aspx. I won't show how it works because it's already been done in several places over the Web. Here are pointers:
    - OData.org: Got SQL Azure? Then you've got OData
    - OakLeaf Systems: Enabling and Using the OData Protocol with SQL Azure
    - Patrick Verbruggen: Creating an OData feed for your Azure databases
    - Shawn Wildermuth: SQL Azure's OData Support
    - Jack Greenfield: How to Use OData for SQL Azure with AppFabric Access Control
    You can choose to enable anonymous access or not. When you don't enable anonymous access, you have to provide an Issuer name and a Secret key, and optionally a Security Token Service (STS) endpoint.
    Excerpt from Jack Greenfield's blog: To enable OData access to the currently selected database, check the box labeled "Enable OData". When OData access is enabled, database user mapping information is displayed at the bottom of the form. Use the drop-down list labeled "Anonymous Access User" to select an anonymous access user. If an anonymous access user is selected, then all queries against the database presented without credentials will execute by impersonating that user. You can access the database as the anonymous user by clicking on the link provided at the bottom of the page. If no anonymous access user is selected, then the OData Service will not allow anonymous access to the database. Click the link labeled "Add User" to add a user for authenticated access. In the pop-up panel, select the user from the drop-down list. Leave the issuer name empty for simple authentication, or provide the name of a trusted Security Token Service (STS) for federated authentication. For example, to federate with another ACS-based STS, provide the base URI for the STS endpoint displayed by the Windows Azure AppFabric Portal for the STS. Click the "OK" button to complete the configuration process and dismiss the pop-up panel. When one or more authenticated access users are added, the OData Service will impersonate them when appropriate credentials are presented. You can designate as many authenticated access users as you like. The OData Service will decide which one to impersonate for each query by inspecting the credentials presented with the query.
    Next time I'll give an overview of how Sesame Data Browser is built. In the meantime, happy data browsing!
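    For readers who would rather poke at one of these OData feeds from code than from Sesame, here is a minimal C# sketch using the WCF Data Services client. The service URI and the Product entity shape are examples only; substitute the feed and entity types your endpoint actually exposes (this sketch also assumes the feed allows anonymous access).

    using System;
    using System.Data.Services.Client;

    class ODataPeek
    {
        // Invented entity shape; property names must match what the feed exposes.
        public class Product
        {
            public int ID { get; set; }
            public string Name { get; set; }
        }

        static void Main()
        {
            var context = new DataServiceContext(
                new Uri("https://odata.sqlazurelabs.com/OData.svc/v0.1/myserver/mydb"));   // example URI

            // Fetch the "Products" entity set and print it.
            foreach (var product in context.Execute<Product>(new Uri("Products", UriKind.Relative)))
                Console.WriteLine("{0}: {1}", product.ID, product.Name);
        }
    }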

    Read the article

  • Cloud Computing: publication of part 3 from Syntec Numérique

    - by Eric Bezille
    A client/provider vision brought together around a draft contractual framework
    At the Cloud Computing World Expo held at the CNIT last week, I attended the presentation of the new installment from Syntec Numérique on Cloud Computing and the "new models" it brings about: economic models, contracts, client-provider relationships, and the organization of the IT department (DSI). What makes this white paper original, compared to those already existing in the field, is that it set out to bring together all of the client-side players (through the CRIP) and the providers around a framework for contractual formalization, building on the e-SCM model.
    An accelerated shift to becoming a service provider, and the end of siloed IT?
    While Cloud Computing accelerates IT's transition into a service provider (following on from ITIL v3), it also highlights the challenge that such a disruptive model poses for IT departments: it requires cross-functional skills in order to guarantee the qualities expected of a Cloud Computing service:
    - on-demand, "self-service" deployment
    - standardized access over the network
    - management of pools of shared resources
    - an "elastic" service that can be grown or shrunk quickly according to measurable demand
    It should be clear here that Cloud Computing goes well beyond simple server virtualization. As Constantin Gonzales rightly describes in his blog ("Three Enterprise Principles for Building Clouds"), what matters is adherence to the standard for the service access interface. How the service is then implemented (in the cloud) is the burden and responsibility of the provider. It is up to the provider to optimize as best it can in order to be competitive, while guaranteeing the expected service levels. For the service provider, of course, this implementation must be mastered; it rests essentially on the integration and automation of the necessary layers and components... over the long term... including handling the evolution of each of those elements. For the client, the reversibility of the solution must always be ensured through adherence to standards... a point also addressed in the Syntec white paper, which recalls the points to watch and reviews the state of progress of standards around Cloud Computing.
    Happy reading...

    Read the article

  • Oracle SOA Security for OUAF Web Services

    - by Anthony Shorten
    With the ability to use Oracle SOA Suite 11g with the Oracle Utilities Application Framework based products, an additional consideration needs to be configured to ensure correct integration. That additional consideration is security. By default, SOA Suite propagates any credentials from the calling application through to the interfacing applications. In most cases, this behavior is not appropriate, as the calling application may use different credential stores, and some interfaces are "disconnected" from a calling application (for example, a file-based load using the File Adapter). These situations require that the Web Service calls to the Oracle Utilities Application Framework based products have their own valid credentials. To do this, the credentials must be attached at design time or at run time to provide the necessary credentials for the call. There are a number of techniques that can be used to do this:
    At design time, when integrating a Web Service from an Oracle Utilities Application Framework based product, you can attach the security policy "oracle/wss_username_token_client_policy" in the composite.xml view. In this view, select the Web Service you want to attach the policy to, right-click to display the context menu, select "Configure WS Policies", and choose the above policy from the list. If you are using SSL, you can use "oracle/wss_username_token_over_ssl_client_policy" instead.
    At design time, you can also specify the credential key (csf-key) associated with the above policy by selecting the policy and clicking "Edit Config Override Properties". Name the key appropriately. Every time the SOA components are deployed, the credential configuration is also sent.
    You can also do this after deployment, or what I call at "runtime", by specifying the policy and credential key in Fusion Middleware Control. Refer to the Fusion Middleware Control documentation on how to do this.
    To complete the configuration, you need to add a map and the key specified earlier to the credential store in the Oracle WebLogic instance used for Oracle SOA Suite. From Fusion Middleware Control, you do this by selecting the domain the SOA Suite is installed in and selecting "Credentials" from the context menu. You now need to add the credentials by adding the map "oracle.wsm.security" (the name is IMPORTANT) and creating a key with the necessary valid credentials. The example below added a key called "mdm.key". The name I used is for example only; you can name the key anything you like, as long as it corresponds to the key you specified at design time.
    Note: I used SYSUSER as an example credential; in real life you would use another credential, as SYSUSER is not appropriate for production use.
    This key can be reused for other Oracle Utilities Application Framework Web Service integrations, or you can use other keys for individual Web Service calls. Once the key is created and the SOA Suite components deployed, the transactions should be able to be called as necessary. If you need to change the password for the credentials, it can be done using the Fusion Middleware Control functionality.

    Read the article

  • Issues with timed-out downloads via Tomcat?

    - by Ira Baxter
    We get, in our opinion, a lot of failed download attempts and want to understand why. We offer downloads via an email link (typical):
    http://www.semanticdesigns.com/deliverEval/<productname>
    This is processed by Tomcat on Linux via a JSP file, with the following code (a finally block has been added here to close the input file even on error, avoiding a file-descriptor leak in the original):

    response.addHeader( "Content-Disposition", "attachment; filename=" + fileTail );
    response.addHeader( "Content-Type", "application/x-msdos-program" );
    byte[] buf = new byte[8192];
    int read;
    java.io.FileInputStream input = null;
    try {
        input = new java.io.FileInputStream( filename );
        java.io.OutputStream o = response.getOutputStream();
        while( ( read = input.read( buf, 0, 8192 ) ) != -1 ){
            o.write( buf, 0, read );
        }
        o.flush();
    } catch( Exception e ){
        util.fatalError( request.getRequestURI(), "Error sending file '" + filename + "' to client", e );
        throw e;
    } finally {
        if( input != null ) input.close();   // release the file handle whether or not the send succeeded
    }

    We get a lot of reported errors (about a 50% error rate):
    URI: /deliverEval/download.jsp
    Code Message: Error sending file '/home/sd/ShippingMasters/DMS/Domains/C/GCC3/Tools/TestCoverage/SD_C~GCC3_TestCoverage.1.6.12.exe' to client
    Stack Trace:
    null
    at org.apache.coyote.tomcat5.OutputBuffer.realWriteBytes(byte[], int, int) (Unknown Source)
    at org.apache.tomcat.util.buf.ByteChunk.append(byte[], int, int) (Unknown Source)
    at org.apache.coyote.tomcat5.OutputBuffer.writeBytes(byte[], int, int) (Unknown Source)
    at org.apache.coyote.tomcat5.OutputBuffer.write(byte[], int, int) (Unknown Source)
    at org.apache.coyote.tomcat5.CoyoteOutputStream.write(byte[], int, int) (Unknown Source)
    at org.apache.jsp.deliverEval.download_jsp._jspService(javax.servlet.http.HttpServletRequest, javax.servlet.http.HttpServletResponse) (Unknown Source)
    at org.apache.jasper.runtime.HttpJspBase.service(javax.servlet.http.HttpServletRequest, javax.servlet.http.HttpServletResponse) (Unknown Source)
    at javax.servlet.http.HttpServlet.service(javax.servlet.ServletRequest, javax.servlet.ServletResponse) (Unknown Source)
    at org.apache.jasper.servlet.JspServletWrapper.service(javax.servlet.http.HttpServletRequest, javax.servlet.http.HttpServletResponse, boolean) (Unknown Source)
    at org.apache.jasper.servlet.JspServlet.serviceJspFile(javax.servlet.http.HttpServletRequest, javax.servlet.http.HttpServletResponse, java.lang.String, java.lang.Throwable, boolean) (Unknown Source)
    at org.apache.jasper.servlet.JspServlet.service(javax.servlet.http.HttpServletRequest, javax.servlet.http.HttpServletResponse) (Unknown Source)
    at javax.servlet.http.HttpServlet.service(javax.servlet.ServletRequest, javax.servlet.ServletResponse) (Unknown Source)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(javax.servlet.ServletRequest, javax.servlet.ServletResponse) (Unknown Source)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(javax.servlet.ServletRequest, javax.servlet.ServletResponse) (Unknown Source)
    at org.apache.catalina.core.StandardWrapperValve.invoke(org.apache.catalina.Request, org.apache.catalina.Response, org.apache.catalina.ValveContext) (Unknown Source)

    We don't understand why this rate should be so high. Is there any way to get more information about the cause of the error? It is useful to know that these are pretty big documents, 3-50 megabytes. They reside on the Linux server, so reading them is just a local disk read and is unlikely to be a contributor to the problem. But sheer size might be an issue for the recipients' browsers? Is this kind of error rate typical for downloads? My personal experience downloading others' documents suggests no; our internal attempts show this to be very reliable, but we're operating on our internal network for such experiments, so we're missing the complexity of the intervening internet.

    Read the article

  • A Complete Customer Experience Solution (3 of 3 in 'No Customer Left Behind' Series)

    - by Kathryn Perry
    A guest post by David Vap, Group Vice President, Oracle Applications Product Development
    In my previous post, I talked about taking three concrete steps to improve your customers' overall experiences: 1) understand your customer, 2) empower your ecosystem, and 3) adapt your business. To do these effectively and efficiently, it's important to find the right technology that can bridge the gaps across your channels, interactions, departments, and repositories.
    Oracle has spent the past three years and more than six billion dollars acquiring and developing some of the world's best-of-breed applications. The result is the most comprehensive customer experience (CX) portfolio offering in the world, bar none:
    - ATG: best-in-class selling experiences
    - FatWire: best-in-class marketing experiences
    - InQuira: best-in-class support experiences
    - Endeca: best-in-class search experiences
    - RightNow: best-in-class service experiences
    - Vitrue & Involver: best-in-class social marketing
    - Collective Intellect: best-in-class social listening
    We don't expect organizations to eat the CX elephant in one bite, nor should they try to. There are key strategic initiatives within each of the four main pillars of our customer experience offering for which we deliver solutions:
    1. Customer Experience for Marketing: social listening and engagement; social marketing; marketing websites; demand generation and lead management; marketing and loyalty management
    2. Customer Experience for Commerce: search, navigation, and content delivery; cross-channel commerce; targeting and product recommendations; social commerce; order management and fulfillment; retail store operations
    3. Customer Experience for Sales: sales force automation; social selling; territory and quota management; revenue forecasting; partner relationship management; quote to cash; incentive compensation
    4. Customer Experience for Service: cross-channel customer service; knowledge management; social customer service; eligibility management; contracts, assets, and entitlements; industry-specific solutions; eBilling
    Oracle's customer experience portfolio is socially infused at each layer of our pillars rather than simply bolted on as a side process. This combines with the power of the Cloud to run the parts of the solution that need the access, efficiency, and agility of a managed infrastructure. You can get the compliance control from on-premise backbone infrastructure systems that run your business and don't change that often.
    Please take advantage of our teams of Oracle customer experience professionals and our key agency and technology partner ecosystem. They can help you develop strategic solution roadmaps that build and deliver customer experience and that are tailored to your business needs and objectives. No one has built a better customer service portfolio to manage the entire customer journey than Oracle. It is backed by CX thought leadership programs, a commitment from our executives, and a worldview that your technology decisions must be driven by your customer experiences to succeed.
    If you'd like to follow up on this conversation, please leave a comment or contact me at [email protected]. You can get more information on Oracle's complete customer experience solution here.

    Read the article

  • Do you care about your Oracle System Support experience?

    - by user12244613
    It has been a while since I blogged about Systems Support within Oracle. I want to take this opportunity to raise awareness of how Oracle is communicating with its systems customers. Previously, every item to be communicated was sent independently via email; however, not all messages appear to be getting the attention they require. In an effort to ensure Oracle is reaching all of our Sun and Oracle System customers, we have created the Oracle Systems Support Newsletter. This monthly newsletter will summarize customer-support information for you to use and will cover topics that impact your support experience. For example:

    1. Did you know that sending explorer content to email addresses with @sun.com is going away soon? For more information, review Document 1362484.1.
    2. Are you an Auto Service Request (ASR) user? If yes, here are the latest changes:
       · ASR Manager accepts My Oracle Support User Name (email address) and password. [Doc ID 1345484.1]
       · ASR IP Address for secure file transfer has changed. [Doc ID 1338575.1]
       · ASR No Heartbeat Status - find out how to resolve. [Doc ID 1346328.1]
    3. Did you notice we have changed the Service Request options for Hardware and introduced a new problem category called “Automated Diagnosis”? This service streamlines the data you send in and then automatically provides an update of known issues found in your My Oracle Support Service Request. This feature also fast-tracks hardware failures by sending parts as soon as the data is analyzed. Have you used this new feature? If yes, tell us about it - take the 5-minute survey.
    4. Are you being proactive, or are you still 'fire fighting' in reactive mode? If you are being proactive for your Oracle System products, you might have used Oracle Sun System Analysis. Did you find it helpful? Can we improve it? You tell us - take the 5-minute survey.
    5. Are you aware that if you attach files to your Service Request, it enables the support engineer to start work straight away? For a summary of products and files, review the Newsletter.
    6. Are you struggling to find patches, firmware, or product downloads? If yes, these types of issues are all addressed in the Newsletter.

    If this is the type of information you want to know about each month, then take time to read the Newsletter and bookmark the link in My Oracle Support so you can stay informed. Thanks for your time.

    Read the article

  • When following SRP, how should I deal with validating and saving entities?

    - by Kristof Claes
    I've been reading Clean Code and various online articles about SOLID lately, and the more I read about it, the more I feel like I don't know anything. Let's say I'm building a web application using ASP.NET MVC 3. Let's say I have a UsersController with a Create action like this:

        public class UsersController : Controller
        {
            public ActionResult Create(CreateUserViewModel viewModel)
            {
            }
        }

    In that action method I want to save a user to the database if the data that was entered is valid. Now, according to the Single Responsibility Principle, an object should have a single responsibility, and that responsibility should be entirely encapsulated by the class. All its services should be narrowly aligned with that responsibility. Since validation and saving to the database are two separate responsibilities, I guess I should create two separate classes to handle them, like this:

        public class UsersController : Controller
        {
            private ICreateUserValidator validator;
            private IUserService service;

            public UsersController(ICreateUserValidator validator, IUserService service)
            {
                this.validator = validator;
                this.service = service;
            }

            public ActionResult Create(CreateUserViewModel viewModel)
            {
                ValidationResult result = validator.IsValid(viewModel);
                if (result.IsValid)
                {
                    service.CreateUser(viewModel);
                    return RedirectToAction("Index");
                }
                else
                {
                    foreach (var errorMessage in result.ErrorMessages)
                    {
                        ModelState.AddModelError(String.Empty, errorMessage);
                    }
                    return View(viewModel);
                }
            }
        }

    That makes some sense to me, but I'm not at all sure that this is the right way to handle things like this. It is, for example, entirely possible to pass an invalid instance of CreateUserViewModel to the IUserService class. I know I could use the built-in DataAnnotations, but what about when they aren't enough? Imagine that my ICreateUserValidator checks the database to see if there is already another user with the same name... Another option is to let the IUserService take care of the validation, like this:

        public class UserService : IUserService
        {
            private ICreateUserValidator validator;

            public UserService(ICreateUserValidator validator)
            {
                this.validator = validator;
            }

            public ValidationResult CreateUser(CreateUserViewModel viewModel)
            {
                var result = validator.IsValid(viewModel);
                if (result.IsValid)
                {
                    // Save the user
                }
                return result;
            }
        }

    But I feel I'm violating the Single Responsibility Principle here. How should I deal with something like this?
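
    One possible compromise, sketched below, is to let the service own the validate-then-save sequence while still delegating each responsibility to a collaborator: the validator checks the model, a repository persists it, and the service merely coordinates. This is a minimal sketch, not from the original question; IValidator<T> and IUserRepository are hypothetical names introduced here for illustration.

        // Hypothetical abstractions for the sketch (not from the question).
        public interface IValidator<T>
        {
            ValidationResult Validate(T model);
        }

        public interface IUserRepository
        {
            void Save(CreateUserViewModel user);
        }

        public class UserService : IUserService
        {
            private readonly IValidator<CreateUserViewModel> validator;
            private readonly IUserRepository repository;

            public UserService(IValidator<CreateUserViewModel> validator, IUserRepository repository)
            {
                this.validator = validator;
                this.repository = repository;
            }

            public ValidationResult CreateUser(CreateUserViewModel viewModel)
            {
                // The service coordinates, but each responsibility lives elsewhere:
                // validation in the validator, persistence in the repository.
                ValidationResult result = validator.Validate(viewModel);
                if (result.IsValid)
                {
                    repository.Save(viewModel);
                }
                return result; // the controller decides how to present failures
            }
        }

    With this shape the controller stays thin, an invalid model can never reach the repository, and neither collaborator knows about the other.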

    Read the article

  • When to use SOAP over REST

    So, how do REST-based services differ from SOAP-based services, and when should you use SOAP?

    Representational State Transfer (REST) implements standard HTTP/HTTPS as an interface, allowing clients to obtain access to resources based on requested URIs. An example of a URI may look like this: http://mydomain.com/service/method?parameter=var1&parameter=var2. It is important to note that REST-based services are stateless because HTTP/HTTPS is natively stateless. One of the many benefits of implementing HTTP/HTTPS as an interface can be found in caching. Caching can be done on a web service much like caching is done on requested web pages: it allows for reduced web server processing and increased response times, because content is already processed and stored for immediate access. Typical actions performed by REST-based services include generic CRUD (Create, Read, Update, and Delete) operations and operations that do not require state.

    Simple Object Access Protocol (SOAP), on the other hand, uses a generic interface in order to transport messages. Unlike REST, SOAP can use HTTP/HTTPS, SMTP, JMS, or any other standard transport protocol. Furthermore, SOAP utilizes XML in the following ways:

    · Defines a message
    · Defines how a message is to be processed
    · Defines the encoding of a message
    · Lays out procedure calls and responses

    Where REST aligns more with a Resource View, SOAP aligns more with a Method View, in that business logic is exposed as methods, typically through a SOAP web service, because they can retain state. In addition, SOAP requests are not cached; therefore every request will be processed by the server. Because SOAP can retain state, it has a special advantage over REST for services that need to perform transactions in which multiple calls to a service are needed to complete a task. Additionally, SOAP is better suited for enterprise-level services that implement standard exchange formats in the form of contracts, because REST does not currently support this. A real-world example of where SOAP is preferred over REST can be seen in the banking industry, where money is transferred from one account to another. SOAP would allow a bank to perform a transaction on an account and, if the transaction failed, automatically retry it, ensuring that the request was completed. With REST, by contrast, failed service calls must be handled manually by the requesting application.

    References:
    Francia, S. (2010). SOAP vs. REST. Retrieved November 20, 2011, from spf13: http://spf13.com/post/soap-vs-rest
    Rozlog, M. (2010). REST and SOAP: When Should I Use Each (or Both)? Retrieved November 20, 2011, from Infoq.com: http://www.infoq.com/articles/rest-soap-when-to-use-each
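
    To make the contrast concrete, here is a minimal C# sketch. It is an illustration only: both endpoint URLs reuse the article's placeholder domain and are not real services. The REST call addresses a resource directly with a cacheable GET, while the SOAP call posts an XML envelope to a single generic endpoint.

        using System;
        using System.Net.Http;
        using System.Text;
        using System.Threading.Tasks;

        class RestVsSoapDemo
        {
            static async Task Main()
            {
                var client = new HttpClient();

                // REST: the URI itself names the resource; a GET like this can be
                // cached by intermediaries. (Placeholder URL, not a real service.)
                string restResponse = await client.GetStringAsync(
                    "http://mydomain.com/service/accounts/42");

                // SOAP: one generic endpoint; the operation lives inside an XML
                // envelope, so the request is POSTed and is not cached.
                string envelope =
                    "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
                    "<soap:Body><GetAccount><Id>42</Id></GetAccount></soap:Body>" +
                    "</soap:Envelope>";
                var content = new StringContent(envelope, Encoding.UTF8, "text/xml");
                HttpResponseMessage soapResponse = await client.PostAsync(
                    "http://mydomain.com/service/endpoint", content);

                Console.WriteLine(restResponse);
                Console.WriteLine(await soapResponse.Content.ReadAsStringAsync());
            }
        }

    The shape of the two requests is the practical difference: the REST request carries its meaning in the URI and verb, while the SOAP request carries it in the message body.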

    Read the article

  • Code review recommendations and Code Smells

    - by Michael Freidgeim
    Some time ago Twitter told me that I am similar to Boris Lipschitz. Indeed, he is also a .NET programmer from Russia living in Australia. I've read his list of Code Review points and found them quite comprehensive. A few points were not clear to me, and that prompted some further reading.

    In particular, the statement "Exception should not be used to return a status or an error code" wasn't fully clear to me, because sometimes we store an exception as an object with all error details, and I believe that's a valid approach. However, I agree that throwing exceptions should be avoided if you expect to return an error as part of a normal flow. Related link: http://codeutopia.net/blog/2010/03/11/should-a-failed-function-return-a-value-or-throw-an-exception/

    Another point slightly puzzled me: "If Thread.Sleep() is used, can it be replaced with something else, e.g. Timer, AutoResetEvent, etc." I believe there are very rare cases where anyone uses Thread.Sleep in production code; usually it is used in mocks and prototypes.

    I had to look further to clarify "Dependency injection is used instead of Service Location pattern". Even though most articles show some preference for Dependency Injection, there are also advantages to using Service Location. E.g., see http://geekswithblogs.net/KyleBurns/archive/2012/04/27/dependency-injection-vs.-service-locator.aspx. http://www.cookcomputing.com/blog/archives/000587.html refers to the Concluding Thoughts of Martin Fowler: the choice between Service Locator and Dependency Injection is less important than the principle of separating service configuration from the use of services within an application.

    The post had a link to Jeff Atwood's excellent article Code Smells, but the statement that "code should not pass a review if it violates any of the code smells" sounds too strict for my environment. In particular, I disagree with the "Dead Code" recommendation "Ruthlessly delete code that isn't being used. That's why we have source control systems!". If there is a chance that unused code will be required in the future, it is convenient to keep it as commented-out or #if/#endif blocks with an appropriate explanation of why it could be required. TFS is a good source control system, but a context search in the source code of the current solution is much easier than finding something in previous versions of the code.

    I also found a link to the good book "Clean Code: A Handbook of Agile Software Craftsmanship".
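
    As an illustration of the Thread.Sleep point above, here is a minimal C# sketch (hypothetical code, not from the reviewed project) replacing a sleep-and-poll loop with an AutoResetEvent that the producer signals:

        using System;
        using System.Threading;

        class WaitInsteadOfSleep
        {
            static readonly AutoResetEvent workReady = new AutoResetEvent(false);
            static string payload;

            static void Main()
            {
                var consumer = new Thread(() =>
                {
                    // Wakes as soon as the producer signals, instead of looping
                    // with Thread.Sleep(100) and re-checking a flag; gives up
                    // cleanly after a 5-second timeout.
                    if (workReady.WaitOne(TimeSpan.FromSeconds(5)))
                        Console.WriteLine("Got: " + payload);
                    else
                        Console.WriteLine("Timed out waiting for work.");
                });
                consumer.Start();

                payload = "some work item";
                workReady.Set(); // signal the waiting consumer
                consumer.Join();
            }
        }

    The wait handle removes both the wasted wake-ups and the worst-case latency of a polling interval, which is presumably why the review checklist singles out Thread.Sleep.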

    Read the article

  • Principles of an extensible data proxy

    - by Wesley
    There is a growing industry now, with more than 30 companies playing in the Backend-as-a-Service (BaaS) market. The principle is simple: give companies a secure way of publicly exposing data housed on premises and behind the firewall. This can include database data, as well as legacy PC data through established connectors; SAP, for example, provides a connector for transacting with its legacy systems. Early attempts were fixed providers for specific systems like SAP, IBM, or Oracle, but the new breed is extensible, allowing channel partners and consultants to build robust integration applications that can consume whatever data sources the client wants to expose.

    I just happen to be close to finishing a cloud-based HTML5 application platform that provides robust integration services, and I would like to break ground on an extensible data proxy to complete the system. From what I can gather, I need to provide either an installable web service of some kind, or a cloud service which the client can configure with a VPN for interactions. Then I can build in connectors, which can be activated with a service account, and expose those transactions via web services of some kind (JSON, SOAP, etc.). I can also provide a framework that allows people to build their own connectors and use some kind of schema to hook those connectors into the proxy (a rough sketch of such a contract follows below). The end result is some kind of public-facing web service that could securely be consumed by applications to show data through HTML5 on any device.

    My gut says this isn't as hard as it sounds. Almost all of the 30+ companies (with more popping up almost weekly) have come into existence in the last 18 months or so, which tells me that either the root technology or the skill set to create it is in abundance right now. Where should I start? Are there some open source projects I can leverage? A specific group of developers I can hire? I'm confident someone here can set me on the right path and save me some time. You don't see this many companies spring up this rapidly if they are all starting from scratch with proprietary technology.

    The Register: WTF is BaaS
    One Minute Video from Kony on their BaaS
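
    For a starting point, here is a hedged C# sketch of what the pluggable connector contract could look like. Every name below (IDataConnector, ConnectorRegistry, and so on) is hypothetical and is not taken from any existing BaaS product; it only illustrates the pattern of registering connectors and routing requests to them.

        using System.Collections.Generic;

        // Hypothetical contract: the proxy discovers implementations,
        // authenticates them with a service account, and exposes their
        // results through a public web service layer.
        public interface IDataConnector
        {
            string SystemName { get; }  // e.g. "SAP", "Oracle"

            void Connect(IDictionary<string, string> serviceAccountSettings);

            // Rows come back as loosely typed dictionaries so the proxy can
            // serialize them to JSON (or wrap them in SOAP) without knowing
            // each backend's schema in advance.
            IEnumerable<IDictionary<string, object>> Query(
                string resource, IDictionary<string, string> parameters);
        }

        public class ConnectorRegistry
        {
            private readonly Dictionary<string, IDataConnector> connectors =
                new Dictionary<string, IDataConnector>();

            // Channel partners register their connectors at startup; the
            // proxy then routes each incoming request by system name.
            public void Register(IDataConnector connector) =>
                connectors[connector.SystemName] = connector;

            public IDataConnector Resolve(string systemName) =>
                connectors[systemName];
        }

    The registry keeps the proxy itself ignorant of any particular backend, which is what makes the design extensible: adding SAP support is just one more IDataConnector implementation dropped into the registry.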

    Read the article

  • SOA Community Newsletter September 2012

    - by JuergenKress
    Dear SOA partner community member, Are you ready for Oracle OpenWorld 2012? If you are planning to attend, make sure that you prepare your trip to San Francisco. If you cannot make it, watch the keynotes on demand. You can also plan to visit the SOA, Cloud and Service Technology Symposium 2012 and meet Tim Hall and Demed L'Her from our product management team in London. As an Oracle partner you will get a 50% discount on the conference pass; please use the code DJMXZ370 to claim your discount. The BPM Solution Catalogue is now live; make sure you use the process examples and contribute your own processes. SOA Proactive Support is the best resource to support your SOA implementations. To administer your SOA systems, Enterprise Manager Cloud Control 12c is the best tool, and you can now attend the free on-demand training. EM12c Real User Experience Insight 12R1 gives you all the details; check out our new demo. The BPM 11g demo for Oracle E-Business Suite has become available. A wonderful SOA demo case is the Fusion Order Demo; Antony Reynolds posted an article on how to update it for SOA Suite PS5. If you use Coherence, e.g. for SOA Suite, check out the extension from our partner CloudTran.

    In this edition you will also find articles from: Automatically Disable Proxy Service to avoid overloading OSB by Jian Liang; Storing SCA Metadata in the Oracle Metadata Services Repository by Nicolás Fonnegra Martinez and Markus Lohn; Exploring MDS Explorer by Mark Nelson; Using Cloud OER to Find Fusion Applications On-Premise Service Concrete WSDL URL by Rajesh Raheja; Oracle Service Bus duplicate message check using Coherence by Jan van Zoggel; Installing Oracle SOA Suite 10g on Oracle Enterprise Linux by Lonneke Dikmans; and Generating an EJB SDO Service Interface for Oracle SOA Suite by Edwin Biemond.

    Jürgen Kress, Oracle SOA & BPM Partner Adoption EMEA

    To read the newsletter please visit http://tinyurl.com/soanewsSeptember2012 (OPN account required). To become a member of the SOA Partner Community please register at http://www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.

    Blog Twitter LinkedIn Mix Forum Technorati Tags: SOA Community newsletter,SOA Community,Oracle SOA,Oracle BPM,BPM Community,OPN,Jürgen Kress

    Read the article

  • Why is my system freezing when I switch users?

    - by ZeroDivide
    Hello, I've recently upgraded from 13.04 to 13.10 64-bit. I'm running AMD graphics with the proprietary drivers, and I have two user accounts: mine (administrator) and my girlfriend's (standard). My girlfriend clicks "switch user" from my lock screen and logs in fine. When I then click "switch user" from her lock screen, everything goes black, and then the monitor blinks on and off with just a single cursor. I have no way to access a terminal; the system is unresponsive and I have to hit the power button. Even ctrl + alt + f4 or ctrl + alt + t doesn't get me a terminal. When I press the power button, the system does start printing the shutdown sequence on the monitor.

    Here is my .xsession-errors:

    Script for ibus started at run_im.
    Script for auto started at run_im.
    Script for default started at run_im.

    Here is hers:

    init: at-spi2-registryd main process ended, respawning
    init: at-spi2-registryd main process ended, respawning
    init: at-spi2-registryd main process ended, respawning
    init: at-spi2-registryd main process ended, respawning
    init: at-spi2-registryd main process ended, respawning
    init: at-spi2-registryd main process ended, respawning
    init: at-spi2-registryd main process ended, respawning
    init: at-spi2-registryd main process ended, respawning
    init: at-spi2-registryd main process ended, respawning
    init: at-spi2-registryd main process ended, respawning
    init: at-spi2-registryd respawning too fast, stopped
    init: logrotate main process (4726) killed by TERM signal
    init: upstart-dbus-session-bridge main process (4865) terminated with status 1
    init: gnome-settings-daemon main process (4843) terminated with status 1
    init: gnome-session main process (4852) terminated with status 1
    init: unity-panel-service main process (4863) killed by KILL signal

    I found some advice in a forum to look for at-spi2-registryd in my system logs; perhaps it will be useful. Executing this:

    sudo grep -r at-spi2-registryd /var/log/*

    produces this:

    /var/log/lightdm/x-1-greeter.log:** (at-spi2-registryd:4384): WARNING **: Failed to register client: GDBus.Error:org.freedesktop.DBus.Error.ServiceUnknown: The name org.gnome.SessionManager was not provided by any .service files
    /var/log/lightdm/x-1-greeter.log:** (at-spi2-registryd:4384): WARNING **: Unable to register client with session manager
    /var/log/lightdm/x-2-greeter.log.old:** (at-spi2-registryd:7447): WARNING **: Failed to register client: GDBus.Error:org.freedesktop.DBus.Error.ServiceUnknown: The name org.gnome.SessionManager was not provided by any .service files
    /var/log/lightdm/x-2-greeter.log.old:** (at-spi2-registryd:7447): WARNING **: Unable to register client with session manager
    /var/log/lightdm/x-0-greeter.log:** (at-spi2-registryd:1378): WARNING **: Failed to register client: GDBus.Error:org.freedesktop.DBus.Error.ServiceUnknown: The name org.gnome.SessionManager was not provided by any .service files
    /var/log/lightdm/x-0-greeter.log:** (at-spi2-registryd:1378): WARNING **: Unable to register client with session manager
    /var/log/lightdm/x-0-greeter.log.old:** (at-spi2-registryd:1357): WARNING **: Failed to register client: GDBus.Error:org.freedesktop.DBus.Error.ServiceUnknown: The name org.gnome.SessionManager was not provided by any .service files
    /var/log/lightdm/x-0-greeter.log.old:** (at-spi2-registryd:1357): WARNING **: Unable to register client with session manager

    Any ideas what is going on?

    Read the article

  • Java JRE 1.6.0_35 Certified with Oracle E-Business Suite

    - by Steven Chan (Oracle Development)
    The latest Java Runtime Environment 1.6.0_35 (a.k.a. JRE 6u35-b10) is now certified with Oracle E-Business Suite Release 11i and 12 desktop clients.

    What's new in Java 1.6.0_35?
    See the 1.6.0_35 Update Release Notes for details about what has changed in this release. This release is available for download from the usual Sun channels and through the 'Java Automatic Update' mechanism.

    32-bit and 64-bit versions certified
    This certification includes both the 32-bit and 64-bit JRE versions.

    32-bit JREs are certified on:
    · Windows XP Service Pack 3 (SP3)
    · Windows Vista Service Pack 1 (SP1) and Service Pack 2 (SP2)
    · Windows 7 and Windows 7 Service Pack 1 (SP1)

    64-bit JREs are certified only on 64-bit versions of Windows 7 and Windows 7 Service Pack 1 (SP1).

    Worried about the 'mismanaged session cookie' issue?
    No need to worry -- it's fixed. To recap: JRE releases 1.6.0_18 through 1.6.0_22 had issues with mismanaging session cookies that affected some users in some circumstances. The fix for those issues was first included in JRE 1.6.0_23 and carries forward into all future JRE releases. In other words, if you wish to avoid the mismanaged session cookie issue, you should apply any release after JRE 1.6.0_22.

    All JRE 1.6 releases are certified with EBS upon release
    Our standard policy is that all E-Business Suite customers can apply all JRE updates to end-user desktops from JRE 1.6.0_03 and later updates on the 1.6 codeline. We test all new JRE 1.6 releases in parallel with the JRE development process, so all new JRE 1.6 releases are considered certified with the E-Business Suite on the same day that they're released by our Java team. You do not need to wait for a certification announcement before applying new JRE 1.6 releases to your EBS users' desktops.

    Important
    For important guidance about the impact of the JRE Auto Update feature on JRE 1.6 desktops, see: URGENT BULLETIN: All E-Business Suite End-Users Must Manually Apply JRE 6 Updates

    References
    · Recommended Browsers for Oracle Applications 11i (MetaLink Note 285218.1)
    · Upgrading Sun JRE (Native Plug-in) with Oracle Applications 11i for Windows Clients (MetaLink Note 290807.1)
    · Recommended Browsers for Oracle Applications 12 (MetaLink Note 389422.1)
    · Upgrading JRE Plugin with Oracle Applications R12 (MetaLink Note 393931.1)

    Related Articles
    · Mismanaged Session Cookie Issue Fixed for EBS in JRE 1.6.0_23
    · Roundup: Oracle JInitiator 1.3 Desupported for EBS Customers in July 2009

    Read the article

< Previous Page | 242 243 244 245 246 247 248 249 250 251 252 253  | Next Page >