Search Results

Search found 10627 results on 426 pages for 'smart status'.


  • Accessibility Update: The Oracle ETPM v2.3 VPAT is now available on oracle.com

    - by Rick Finley
    Oracle Tax is committed to building accessible applications. Oracle uses a VPAT (Voluntary Product Accessibility Template) to document the accessibility status of each product. The VPAT was created through a partnership of the Information Technology Industry Council (ITI) and the U.S. General Services Administration (GSA) to provide a simple document that US Federal contracting and procurement officials can use to evaluate a product with respect to the provisions contained in Section 508 of the Rehabilitation Act. The Oracle ETPM v2.3 VPAT is now available on oracle.com: http://www.oracle.com/us/corporate/accessibility/templates/t2455.html

    Read the article

  • While installing updates I get "Package operation failed" on Ubuntu 12.04

    - by user54395
    I get the following response in the details. Please help.
    installArchives() failed:
    perl: warning: Setting locale failed.
    perl: warning: Please check that your locale settings:
        LANGUAGE = (unset), LC_ALL = (unset), LANG = "en_IN.ISO8859-1"
        are supported and installed on your system.
    perl: warning: Falling back to the standard locale ("C").
    locale: Cannot set LC_CTYPE to default locale: No such file or directory
    locale: Cannot set LC_MESSAGES to default locale: No such file or directory
    locale: Cannot set LC_ALL to default locale: No such file or directory
    (the locale warning block above is repeated several times)
    (Reading database ... 427340 files and directories currently installed.)
    Preparing to replace thunderbird-trunk-globalmenu 14.0~a1~hg20120409r9862.91177-0ubuntu1~umd1 (using .../thunderbird-trunk-globalmenu_14.0~a1~hg20120409r9866.91235-0ubuntu1~umd1_i386.deb) ...
    Unpacking replacement thunderbird-trunk-globalmenu ...
    Preparing to replace thunderbird-trunk 14.0~a1~hg20120409r9862.91177-0ubuntu1~umd1 (using .../thunderbird-trunk_14.0~a1~hg20120409r9866.91235-0ubuntu1~umd1_i386.deb) ...
    Unpacking replacement thunderbird-trunk ...
    Processing triggers for man-db ...
    locale: Cannot set LC_CTYPE to default locale: No such file or directory
    locale: Cannot set LC_MESSAGES to default locale: No such file or directory
    locale: Cannot set LC_ALL to default locale: No such file or directory
    Processing triggers for bamfdaemon ...
    Rebuilding /usr/share/applications/bamf.index...
    Processing triggers for gnome-menus ...
    Processing triggers for desktop-file-utils ...
    Setting up crossplatformui (1.0.27) ...
    Rather than invoking init scripts through /etc/init.d, use the service(8) utility, e.g. service acpid restart
    Since the script you are attempting to invoke has been converted to an Upstart job, you may also use the stop(8) and then start(8) utilities, e.g. stop acpid ; start acpid. The restart(8) utility is also available.
    acpid stop/waiting
    acpid start/running, process 5286
    package libqtgui4 exist
    QT_VERSION = 4
    make -C /lib/modules/3.2.0-22-generic/build M=/usr/local/bin/ztemtApp/zteusbserial/below2.6.27 modules
    make[1]: Entering directory `/usr/src/linux-headers-3.2.0-22-generic'
    CC [M] /usr/local/bin/ztemtApp/zteusbserial/below2.6.27/usb-serial.o
    /usr/local/bin/ztemtApp/zteusbserial/below2.6.27/usb-serial.c:34:28: fatal error: linux/smp_lock.h: No such file or directory
    compilation terminated.
    make[2]: *** [/usr/local/bin/ztemtApp/zteusbserial/below2.6.27/usb-serial.o] Error 1
    make[1]: *** [_module_/usr/local/bin/ztemtApp/zteusbserial/below2.6.27] Error 2
    make[1]: Leaving directory `/usr/src/linux-headers-3.2.0-22-generic'
    make: *** [modules] Error 2
    dpkg: error processing crossplatformui (--configure):
     subprocess installed post-installation script returned error exit status 2
    No apport report written because MaxReports is reached already
    Setting up thunderbird-trunk (14.0~a1~hg20120409r9866.91235-0ubuntu1~umd1) ...
    Setting up thunderbird-trunk-globalmenu (14.0~a1~hg20120409r9866.91235-0ubuntu1~umd1) ...
    Errors were encountered while processing:
     crossplatformui
    Error in function: SystemError: E:Sub-process /usr/bin/dpkg returned an error code (1)
    (the "Setting up crossplatformui (1.0.27) ..." block then repeats, with the same smp_lock.h build failure and the same dpkg post-installation error, for process 5541)

    Read the article

  • Codestock: Apparently Powershell ain't got the power

    - by Theo Moore
    I checked on the status of voting on the Codestock (www.codestock.org) site this week. I was surprised to see that none of the Powershell sessions were among the leaders in voting. Now, I confess that I am somewhat biased (my session is on Powershell), but that said, I thought it odd. I was under the impression that Powershell had a strong following and that many people were using it. I suppose the voting reflects a stronger developer community that might not make use of Powershell to the degree some others might. I am a huge fan of Powershell and I am constantly impressed with the things it can do. In my case, I use it as a lightweight functional testing harness for web pages. I use it in this capacity at work, and for the work I do for the Carbonated Comics (www.carbonatedcomics.com) site as well. If anyone still hasn't registered, do us a favor and vote for a Powershell session, K?

    Read the article

  • Running a Mongo Replica Set on Azure VM Roles

    - by Elton Stoneman
    Originally posted on: http://geekswithblogs.net/EltonStoneman/archive/2013/10/15/running-a-mongo-replica-set-on-azure-vm-roles.aspx
    Setting up a MongoDB Replica Set with a bunch of Azure VMs is straightforward stuff. Here's a step-by-step which gets you from 0 to a fully-redundant 3-node document database in about 30 minutes (most of which will be spent waiting for VMs to fire up). First, create yourself 3 VM roles, which is the minimum number of nodes you need for high availability. You can use any OS that Mongo supports. This guide uses Windows, but the only difference will be the mechanism for starting the Mongo service when the VM starts (Windows Service, daemon, etc.). While the VMs are provisioning, download and install Mongo locally, so you can set up the replica set with the Mongo shell. We'll create our replica set from scratch, doing one machine at a time (if you have a single node you want to upgrade to a replica set, it's the same from step 3 onwards):
    1. Set up Mongo. Log into the first node, download Mongo and unzip it to C:. Rename the folder to remove the version – so you have c:\MongoDB\bin etc. – and create a new folder for the logs, c:\MongoDB\logs.
    2. Set up your data disk. When you initialize a node in a replica set, Mongo pre-allocates a whole chunk of storage to use for data replication. It will use up to 5% of your data disk, so if you use a Windows VM image with a default 120Gb disk and host your data on C:, then Mongo will allocate 6Gb for replication. And that takes a while. Instead, you can create yourself a new partition by shrinking down the C: drive in Computer Management, by say 10Gb, and then creating a new logical disk for your data from that spare 10Gb, which will be allocated as E:. Create a new folder, e:\data.
    3. Start Mongo. When that's done, start a command line, point to the mongo binaries folder, install Mongo as a Windows Service running in replica set mode, and start the service:
    cd c:\mongodb\bin
    mongod -logpath c:\mongodb\logs\mongod.log -dbpath e:\data -replSet TheReplicaSet --install
    net start mongodb
    4. Open the ports. Mongo uses port 27017 by default, so you need to allow access in the machine and in Azure. In the VM, open Windows Firewall and create a new inbound rule to allow access via port 27017. Then in the Azure Management Console for the VM role, under the Configure tab, add a new rule, again to allow port 27017.
    5. Initialise the replica set. Start up your local mongo shell, connecting to your Azure VM, and initiate the replica set:
    c:\mongodb\bin\mongo sc-xyz-db1.cloudapp.net
    rs.initiate()
    This is the bit where the new node (at this point the only node) allocates its replication files, so if your data disk is large, this can take a long time (if you're using the default C: drive with 120Gb, it may take so long that rs.initiate() never responds; if you're sat waiting more than 20 minutes, start another instance of the mongo shell pointing to the same machine to check on it). Run rs.conf() and you should see one node configured.
    6. Fix the host name for the primary – *don't miss this one*. For the first node in the replica set, Mongo on Windows doesn't populate the full machine name. Run rs.conf() and the name of the primary is sc-xyz-db1, which isn't accessible to the outside world. The replica set configuration needs the full DNS name of every node, so you need to manually rename it in your shell, which you can do like this:
    cfg = rs.conf()
    cfg.members[0].host = 'sc-xyz-db1.cloudapp.net:27017'
    rs.reconfig(cfg)
    When that returns, rs.conf() will have your full DNS name for the primary, and the other nodes will be able to connect. At this point you have a working database, so you can start adding documents, but there's no replication yet.
    7. Add more nodes. For the next two VMs, follow steps 1 through 4, which will give you a working Mongo database on each node, which you can add to the replica set from the shell with rs.add(), using the full DNS name of the new node and the port you're using:
    rs.add('sc-xyz-db2.cloudapp.net:27017')
    Run rs.status() and you'll see your new node in STARTUP2 state, which means it's initializing and replicating from the PRIMARY. Repeat for your third node:
    rs.add('sc-xyz-db3.cloudapp.net:27017')
    When all nodes are finished initializing, you will have a PRIMARY and two SECONDARY nodes showing in rs.status(). Now you have high availability, so you can happily stop db1, and one of the other nodes will become the PRIMARY with no loss of data or service.
    Note – the process for AWS EC2 is exactly the same, but with one important difference. On the Azure Windows Server 2012 base image, the MongoDB release for 64-bit 2008R2+ works fine, but on the base 2012 AMI that release keeps failing with a UAC permission error. The standard 64-bit release is fine, but it lacks some optimizations that are in the 2008R2+ version.
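    Once the set is up, a client application only needs a connection string that lists the members and names the replica set. A hypothetical sketch using the MongoDB C# driver (exact client classes vary by driver version; hostnames are the ones used above):
        using MongoDB.Driver;

        // Connection string listing all three members and the replica set name;
        // readPreference=secondaryPreferred lets reads fall back to secondaries.
        var connectionString =
            "mongodb://sc-xyz-db1.cloudapp.net:27017," +
            "sc-xyz-db2.cloudapp.net:27017," +
            "sc-xyz-db3.cloudapp.net:27017" +
            "/?replicaSet=TheReplicaSet&readPreference=secondaryPreferred";

        var client = new MongoClient(connectionString);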

    Read the article

  • Is WCF suitable for writing an application which is shared among applications?

    - by RPK
    I have developed and deployed a few ASP.NET applications. Sometimes I want to stop users from inserting or updating a record when: maintenance is going on, or operations are stopped due to a payment delay. In one of my recent applications I implemented this feature by first checking the database operations for a locked status. If either of the above conditions is fulfilled, database operations like insert and update are not carried out. I now need this feature in all the old applications and in future applications I build. I want to know whether WCF is suitable for this scenario, as I want to share methods, or an independent locking application, among various other applications. Is WCF appropriate for this type of scenario?
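    WCF can expose such a shared check as a service contract that every application calls before writing. A minimal hypothetical sketch of what the shared contract could look like (the interface and method names are illustrative, not from the question):
        using System.ServiceModel;

        // Hypothetical contract for a centralized "are writes locked?" service
        // that each application could query before performing inserts or updates.
        [ServiceContract]
        public interface ILockStatusService
        {
            // True when writes should be blocked, e.g. during maintenance
            // or while a payment is overdue.
            [OperationContract]
            bool AreWritesLocked(string applicationName);

            [OperationContract]
            void SetLock(string applicationName, bool locked, string reason);
        }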

    Read the article

  • How to setup NX server and client?

    - by javanoob
    I installed NX server on my desktop and was able to run it successfully following this tutorial: http://michigantelephone.wordpress.com/2007/10/15/how-to-install-nx-server-and-client-under-ubuntukubuntu-linux/
    When I run the command sudo /usr/NX/bin/nxserver --status I see the following output:
    NX 900 Connecting to server ..
    NX 110 NX Server is running.
    NX 999 Bye.
    This means I have set up the NX server correctly. But on the other machine, when I open the NX client it asks for a hostname. What name should I give there? Every tutorial explains how to install and start the NX server, but not how to connect to the server from the client. Could you guys please help me? Thanks in advance, Deter

    Read the article

  • Command line method to find disk usage of camera mounted using gvfs

    - by Hamish Downer
    When my camera was mounted on /media I could use the standard tools (df) to see the disk usage of the card in my camera. However, now the camera is mounted using gvfs, and df seems to ignore it. I've also tried pydf and discus, to no avail. The camera is definitely available through Nautilus, and when I select the camera in Nautilus, the status bar tells me the amount of disk space free. I can also open the ~/.gvfs/ folder in Nautilus, right-click on the camera folder, and get the disk usage in a graphical way. But that is no use for a script. Are there command line tools that are the equivalent of df for gvfs filesystems? Or, even better, a way to make df report on gvfs filesystems?

    Read the article

  • Extreme Optimization – Numerical Algorithm Support

    - by JoshReuben
    Function Delegates
    Many calculations involve the repeated evaluation of one or more user-supplied functions, e.g. numerical integration. The EO MathLib provides delegate types for common function signatures, and the FunctionFactory class can generate new delegates from existing ones.
    RealFunction delegate - takes one Double parameter; can encapsulate most of the static methods of the System.Math class, as well as the classes in the Extreme.Mathematics.SpecialFunctions namespace:
    var sin = new RealFunction(Math.Sin);
    var result = sin(1);
    BivariateRealFunction delegate - takes two Double parameters:
    var atan2 = new BivariateRealFunction(Math.Atan2);
    var result = atan2(1, 2);
    TrivariateRealFunction delegate - represents a function that takes three Double arguments.
    ParameterizedRealFunction delegate - represents a function taking one Integer and one Double argument that returns a real number. The Pow method implements such a function, but the arguments need order re-arrangement:
    static double Power(int exponent, double x) { return ElementaryFunctions.Pow(x, exponent); }
    ...
    var power = new ParameterizedRealFunction(Power);
    var result = power(6, 3.2);
    ComplexFunction delegate - represents a function that takes an Extreme.Mathematics.DoubleComplex argument and also returns a complex number.
    MultivariateRealFunction delegate - represents a function that takes an Extreme.Mathematics.LinearAlgebra.Vector argument and returns a real number.
    MultivariateVectorFunction delegate - represents a function that takes a Vector argument and returns a Vector.
    FastMultivariateVectorFunction delegate - represents a function that takes an input Vector argument and an output Matrix argument, avoiding object construction.
    The FunctionFactory class
    RealFromBivariateRealFunction and RealFromParameterizedRealFunction helper methods - transform a BivariateRealFunction or a ParameterizedRealFunction into a RealFunction delegate by fixing one of the arguments and treating the result as a new function of a single argument:
    var tenthPower = FunctionFactory.RealFromParameterizedRealFunction(power, 10);
    var result = tenthPower(x);
    Note: there is no direct way to do this programmatically in C#. In F# you have partial value functions, where you supply a subset of the arguments (as a travelling closure) that the function expects. When you omit arguments, F# generates a new function that holds onto/remembers the arguments you passed in and "waits" for the other parameters to be supplied:
    let sumVals x y = x + y
    let sumX = sumVals 10
    // Note: no 2nd param supplied.
    // sumX is a new function generated from partially applied sumVals,
    // ie "sumX is a partial application of sumVals."
    let sum = sumX 20
    // Invokes sumX, passing in the expected int (parameter y from the original).
    val sumVals : int -> int -> int
    val sumX : (int -> int)
    val sum : int = 30
    RealFunctionsToVectorFunction and RealFunctionsToFastVectorFunction helper methods - combine an array of delegates returning a real number or a vector into vector or matrix functions. The resulting vector function returns a vector whose components are the function values of the delegates in the array:
    var funcVector = FunctionFactory.RealFunctionsToVectorFunction(
        new MultivariateRealFunction(myFunc1),
        new MultivariateRealFunction(myFunc2));
    The IterativeAlgorithm<T> abstract base class
    Iterative algorithms are common in numerical computing: a method is executed repeatedly until a certain condition is reached, approximating the result of a calculation with increasing accuracy until a certain threshold is reached. If the desired accuracy is achieved, the algorithm is said to converge. Many classes in the Extreme.Mathematics.EquationSolvers and Extreme.Mathematics.Optimization namespaces derive from this base class, as does the ManagedIterativeAlgorithm class, which contains a driver method that manages the iteration process.
    The ConvergenceTest abstract base class
    This class is used to specify algorithm termination, convergence and results. It calculates an estimate for the error, and signals termination of the algorithm when the error is below a specified tolerance.
    Termination criteria - specify the success condition as the difference between some quantity and its actual value being within a certain tolerance, in one of two ways: absolute error is the difference between the result and the actual value; relative error is the difference between the result and the actual value relative to the size of the result.
    Tolerance property - specifies the trade-off between accuracy and execution time. The lower the tolerance, the longer it will take for the algorithm to obtain a result within that tolerance. Most algorithms in the EO NumLib have a default value of MachineConstants.SqrtEpsilon, which gives slightly less than 8 digits of accuracy.
    ConvergenceCriterion property - specifies under what condition the algorithm is assumed to converge, using the ConvergenceCriterion enum: WithinAbsoluteTolerance / WithinRelativeTolerance / WithinAnyTolerance / NumberOfIterations.
    Active property - selectively ignore certain convergence tests.
    Error property - returns the estimated error after a run.
    MaxIterations / MaxEvaluations properties - other termination criteria: if the algorithm cannot achieve the desired accuracy, it still has to end, according to an absolute boundary.
    Status property - indicates how the algorithm terminated, using the AlgorithmStatus enum values: NoResult / Busy / Converged (ended normally; the desired accuracy has been achieved) / IterationLimitExceeded / EvaluationLimitExceeded / RoundOffError / BadFunction / Divergent / ConvergedToFalseSolution. After the iteration terminates, the Status should be inspected to verify that the algorithm terminated normally. Alternatively, you can set ThrowExceptionOnFailure to true.
    Result property - returns the result of the algorithm. This property contains the best available estimate, even if the desired accuracy was not obtained.
    IterationsNeeded / EvaluationsNeeded properties - return the number of iterations required to obtain the result and the number of function evaluations.
    Concrete Types of Convergence Test classes
    SimpleConvergenceTest class - tests if a value is close to zero or very small compared to another value.
    VectorConvergenceTest class - tests convergence of vectors. This class has two additional properties. The Norm property specifies which norm is to be used when calculating the size of the vector, using the VectorConvergenceNorm enum values: EuclidianNorm / Maximum / SumOfAbsoluteValues. The ErrorMeasure property specifies how the error is to be measured, using the VectorConvergenceErrorMeasure enum values: Norm / Componentwise.
    ConvergenceTestCollection class - represents a combination of tests. The Quantifier property is a ConvergenceTestQuantifier enum that specifies how the tests in the collection are to be combined: Any / All.
    The AlgorithmHelper Class
    Inherits from IterativeAlgorithm<T> and exposes two methods for convergence testing. IsValueWithinTolerance<T> method - determines whether a value is close to another value to within an algorithm's requested tolerance. IsIntervalWithinTolerance<T> method - determines whether an interval is within an algorithm's requested tolerance.
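    To tie the delegate types together, here is a small consolidated sketch assembled from the fragments above; it assumes the Extreme.Mathematics namespace described in the post, and the exact signatures in the library may differ slightly:
        using System;
        using Extreme.Mathematics;   // namespace per the post; required for RealFunction etc.

        class DelegateDemo
        {
            // A ParameterizedRealFunction: one int and one double argument.
            static double Power(int exponent, double x)
            {
                return Math.Pow(x, exponent);   // the post uses ElementaryFunctions.Pow
            }

            static void Main()
            {
                // Wrap a framework method in a RealFunction delegate.
                var sin = new RealFunction(Math.Sin);
                Console.WriteLine(sin(1.0));

                // Fix the integer argument to get a plain RealFunction,
                // as described for FunctionFactory.RealFromParameterizedRealFunction.
                var power = new ParameterizedRealFunction(Power);
                var tenthPower = FunctionFactory.RealFromParameterizedRealFunction(power, 10);
                Console.WriteLine(tenthPower(1.5));
            }
        }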

    Read the article

  • Is there any reason to allow Yahoo! Slurp to crawl my site?

    - by James Skemp
    I thought a year or more ago Yahoo! switched to another search engine for its results and no longer used its own Slurp bot. However, on a couple of the sites I manage, Yahoo! Slurp continues to crawl pages, and it seems to ignore the Gone (410) status code when returned (as it keeps coming back). Is there any reason why I wouldn't want to block Yahoo! Slurp via robots.txt or by IP (since it tends to ignore robots.txt in some cases anyway)? I've confirmed that when the bot does hit, it is coming from Yahoo! IPs, so I believe this is a legit instance of the bot. Is Yahoo Search the same as Bing Search now? is a related question, but I don't think it completely answers whether one should add a new block for the bot.
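    For reference, if you do decide to block it, a minimal robots.txt rule for Yahoo!'s crawler (user-agent token "Slurp") would look something like this:
        User-agent: Slurp
        Disallow: /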

    Read the article

  • Can't install fglrx on Kernel 3.11.0-12

    - by byf-ferdy
    I'm running Ubuntu GNOME 13.10 on a Sony Vaio laptop. I tried to install the newest fglrx driver from the cchtml.com guide, but no matter which version I use (13.4 or even 13.9), the installation of the generated .debs fails with this message:
    Error! Bad return status for module build on kernel: 3.11.0-12-generic (x86_64)
    This seems to be a confirmed bug on Launchpad. Currently I'm running the Radeon SI driver, but I really need some 3D acceleration. What can I do to install any version of fglrx correctly, or to speed up the bug-fixing process? How long does fixing bugs like these usually take?

    Read the article

  • Five C# Code Snippets

    A snippet is a small section of text or source code that can be inserted into the code of a program. Snippets provide an easy way to implement commonly used code or functions into a larger section of code. Instead of rewriting the same code over and over again, a programmer can save the code [...]
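    As a hypothetical illustration of the kind of small, reusable helper a snippet typically contains (this example is not taken from the article):
        using System.Collections.Generic;

        public static class DictionaryExtensions
        {
            // A tiny, drop-in helper: a null-safe dictionary lookup with a fallback value.
            public static TValue GetValueOrDefault<TKey, TValue>(
                this IDictionary<TKey, TValue> dict, TKey key, TValue fallback = default(TValue))
            {
                TValue value;
                return dict != null && dict.TryGetValue(key, out value) ? value : fallback;
            }
        }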

    Read the article

  • When should we use weak entities when modelling a database?

    - by Songo
    This is basically a question about: what are weak entities? When should we use them? How should they be modeled? What is the main difference between normal entities and weak entities? Do weak entities correspond to value objects when doing Domain Driven Design? To help keep the question on topic, here is an example taken from Wikipedia that people can use to answer these questions: in this example OrderItem was modeled as a weak entity, but I can't understand why it can't be modeled as a normal entity. Another question is: if I want to track the order history (i.e. the changes in its status), would that be a normal or a weak entity?
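    For what it's worth, one common way a weak entity like OrderItem shows up in code is as a child whose primary key includes the parent's key. A hypothetical Entity Framework Code First sketch (property names are illustrative, not from the Wikipedia example):
        using System.Collections.Generic;
        using System.ComponentModel.DataAnnotations;
        using System.ComponentModel.DataAnnotations.Schema;

        public class Order
        {
            public int OrderId { get; set; }
            public virtual ICollection<OrderItem> Items { get; set; }
        }

        // OrderItem is "weak": it cannot be identified without its Order,
        // so its key is composite and includes the parent key (OrderId + LineNumber).
        public class OrderItem
        {
            [Key, Column(Order = 0)]
            public int OrderId { get; set; }

            [Key, Column(Order = 1)]
            public int LineNumber { get; set; }

            public string Product { get; set; }
            public int Quantity { get; set; }

            public virtual Order Order { get; set; }
        }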

    Read the article

  • How to know which fields of a record are updated when saving the edit? [closed]

    - by Luiz Maffort
    I'm recording in a log table everything that is changed in a given table, so I need to know, for example, when the user changes the status from active to inactive. With this information I will write to my log table which record was changed, by whom, and what the old and new values were. If I instantiate an object before setting db.Entry(chamados).State = EntityState.Modified; I can even compare, but this error appears at runtime:
    An object with the same key already exists in the ObjectStateManager. The ObjectStateManager cannot track multiple objects with the same key.
    Can you please help me?
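    One way around the duplicate-key tracking error is to load the copy Entity Framework is already tracking, apply the posted values to it, and then diff OriginalValues against CurrentValues. A hypothetical sketch (the context, set and key names are illustrative), assuming the DbContext API:
        using (var db = new MyContext())   // hypothetical DbContext
        {
            // Load the tracked copy, then copy the posted values onto it.
            // This avoids attaching a second object with the same key.
            var tracked = db.Chamados.Find(chamados.Id);
            var entry = db.Entry(tracked);
            entry.CurrentValues.SetValues(chamados);

            foreach (var name in entry.OriginalValues.PropertyNames)
            {
                var oldValue = entry.OriginalValues[name];
                var newValue = entry.CurrentValues[name];
                if (!Equals(oldValue, newValue))
                {
                    // write the record key, property name, old value, new value,
                    // user and timestamp to the log table here
                }
            }

            db.SaveChanges();
        }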

    Read the article

  • How to organize timeline in a Flash project?

    - by miguelSantirso
    Hi, I am starting a new Flash game and I was wondering if there is a better way to organize the timeline of the project. In my previous games I define a keyframe for each possible status of the game (loading, sponsor, intro, menu, gameplay, etc...). This method works but has some problems... For instance, it is not easy to implement transitions between the different screens in the game. How do you do this? Do you know of some better way?

    Read the article

  • Ubuntu Software Center does not proceed from applying changes

    - by aneal
    I have a problem with Ubuntu Software Center. It sits at "Searching" and "Applying changes" for long periods of time. I tried to cancel by clicking the cross (X) mark; however, it is now stuck at "Cancelling". It won't let me download any new application, even from the terminal, I guess:
    neal@neal-G50VT:~$ sudo apt-get install gnome-tweak-tool
    E: Could not get lock /var/lib/dpkg/lock - open (11: Resource temporarily unavailable)
    E: Unable to lock the administration directory (/var/lib/dpkg/), is another process using it?
    neal@neal-G50VT:~$ sudo dpkg --configure -a
    dpkg: error: dpkg status database is locked by another process
    There are similar questions here, but with no answers: "Software Center stuck for Dropbox", "Software Center freezes during applying changes".

    Read the article

  • Script to email a file's contents

    - by Tarun
    I have created a shell script that takes backups every day and emails whether its execution was successful or unsuccessful. Now I want it to also send the contents of the log file it creates in the mail. I have seen how to send a file as an attachment, but I want to send the contents of the file as the email message, not as a file. Please help. Its code is like:
    #Email Settings
    Message_Success="Database Backup generated successfully"
    Message_Failure="Problem occurred while generating Database Backup please verify"
    Subject="Database Backup Status Mail"
    Recipients="[email protected]"
    #Verify Backup Created
    if [ -f "$Path_Mysql_Dump" ]; then
        echo "Database Backup Created" >> $Path_Log_File
        echo "$Message_Success" | mail -s "$Subject" "$Recipients"
    else
        echo "Database Backup not created please verify the process will terminate" >> $Path_Log_File
        echo "$Message_Failure" | mail -s "$Subject" "$Recipients"
        exit -1
    fi

    Read the article

  • Ubuntu desktop installation problem, a lot of error messages

    - by veerendar
    Hi, I am getting error messages while trying to install Ubuntu 12.04 on my desktop (Intel Pentium D). The error messages are:
    * checking battery state...
    [412.633532] end_request: I/O error, dev sr0, sector 1291684
    [435.997503] SQUASHFS error: squashfs_read_data failed to read block 0x275fcda4
    [435.9975xx] SQUASHFS error: unable to read fragment cache entry [275fcda4]
    ...
    [524.000055] ata5.00: exception Emask 0x0 SAct 0x0 SErr 0x0 action 0x6 frozen
    ata5.00: cmd a0/00:00:00:08:00/00:00:00/00 tag 0 pio 16392 in
    res 58/00:02:00:08:00/00/a0 Emask 0xf (timeout)
    [524.000292] ata5.00: status: { DRDY DRQ }
    I was able to install on another PC (Intel Atom) but not on this PC (Intel Pentium D). Can anyone help me get a successful installation? Thanks!

    Read the article

  • Error installing MySQL Ubuntu 12.04, dependency?

    - by user86736
    I'm trying to install MySQL 5.5 on Ubuntu 12.04, but I'm stuck with this error:
    Setting up mysql-server-5.5 (5.5.24-0ubuntu0.12.04.1) ...
    start: Job failed to start
    invoke-rc.d: initscript mysql, action "start" failed.
    dpkg: error processing mysql-server-5.5 (--configure):
     subprocess installed post-installation script returned error exit status 1
    dpkg: dependency problems prevent configuration of mysql-server:
     mysql-server depends on mysql-server-5.5; however:
      Package mysql-server-5.5 is not configured yet.
    dpkg: error processing mysql-server (--configure):
     dependency problems - leaving unconfigured
    No apport report written because the error message indicates its a followup error from a previous failure.
    Errors were encountered while processing:
     mysql-server-5.5
     mysql-server
    E: Sub-process /usr/bin/dpkg returned an error code (1)
    Thanks for help!

    Read the article

  • Sequence for authentication on a decoupled client?

    - by A T
    Using a sequence diagram and example code, could you explain to me how authentication works when the client is completely separated from the server? I.e.: you haven't generated any of the client using a server-side template engine; rather, you are communicating using REST (SOAP xor HTTP) xor RPC (XML xor JSON) with JavaScript on the client side. Specifically I would like to know the sequence of: Authenticating using basic auth (user+pass) with "my" server; Authenticating using OAuth2, e.g. with Facebook, with Facebook's server, then whatever extra steps are needed for "my" server. And how it could be implemented. (Feel free to use pseudo-code [like below] or [preferably] prototype it simply using BackboneJS, AngularJS, EmberJS, BatmanJS, AgilityJS, SammyJS xor ActiveJS.)
    if cookie.status in [Expired, Tampered, Wrong IP, Invalid, Not Found]:
        try auth(user, pass):
            if user is in my db:
                try authenticate(user, pass)
                if successful:
                    login user  # give session-cookie here?
                else:
                    present user with "auth failed" msg
            else if user not in db:
                redirect to "edit-profile" page
    PS: I have written an example (editable) auth sequence diagram, based on Facebook's documentation.

    Read the article

  • Synaptic returns error

    - by donvoldy666
    I get the following error message when I run Synaptic, and I can't install any programs:
    SystemError: W:Ignoring file 'getdeb.list.bck' in directory '/etc/apt/sources.list.d/' as it has an invalid filename extension,
    W:Duplicate sources.list entry 'http://extras.ubuntu.com/ubuntu/' precise/main i386 Packages (/var/lib/apt/lists/extras.ubuntu.com_ubuntu_dists_precise_main_binary-i386_Packages),
    (the duplicate sources.list warning above is repeated several times)
    W:Duplicate sources.list entry 'http://ppa.launchpad.net/ubuntu-mozilla-security/ppa/ubuntu/' precise/main i386 Packages (/var/lib/apt/lists/ppa.launchpad.net_ubuntu-mozilla-security_ppa_ubuntu_dists_precise_main_binary-i386_Packages),
    E:Encountered a section with no Package: header,
    E:Problem with MergeList /var/lib/apt/lists/packages.rssowl.org_ubuntu_dists_precise_main_i18n_Translation-en,
    E:The package lists or status file could not be parsed or opened.
    I'm new to Linux, so please help me out.

    Read the article

  • Flightradar24 Maps Global Air Traffic in Real Time

    - by Jason Fitzpatrick
    Flightradar24 is a real time flight tracking service that shows you where thousands of planes are at any given time. Whether you're an aviation buff or just want to show a worried kid that mom's flight is almost home, they have you covered. Flightradar24 is a free service that tracks flights using data from the FAA and ADS-B to display the status of flights across the globe. You can filter the information to see only certain planes, planes originating from certain airports, planes at various altitudes, and more. The interface is accessible via their web site as well as via iOS and Android devices. Hit up the link below to take it for a spin. Flightradar24

    Read the article

  • Unable to mount external hard drive

    - by arranjamesroche
    Basically, my 12.10 upgrade crashed halfway through, so I've had to start again and put all my data onto an external HDD. It was all going fine until this came up when I tried to restore my data from the HDD:
    Error mounting /dev/sdb1 at /media/amy/CA47-8339: Command-line `mount -t "vfat" -o "uhelper=udisks2,nodev,nosuid,uid=1000,gid=1000,shortname=mixed,dmask=0077,utf8=1,showexec,flush" "/dev/sdb1" "/media/amy/CA47-8339"' exited with non-zero exit status 32:
    mount: wrong fs type, bad option, bad superblock on /dev/sdb1, missing codepage or helper program, or other error
    In some cases useful info is found in syslog - try dmesg | tail or so
    Now I'm stuck, completely clueless as to how I'll get anything off the hard drive.

    Read the article

  • Is it safe to block redirected (but still linked) URLs with robots.txt?

    - by Edgar Quintero
    I have a website that has all URLs optimized and 301 redirected from nasty URLs to clean ones. However, the unclean URLs are still linked everywhere throughout the site: in menus, content, products, etc. Google currently has all the clean URLs indexed, along with a few unclean URLs too. So the old URLs are still linked everywhere (ideally this wouldn't be the case, but this is how it is ATM). I would like to block the unclean URLs with robots.txt. The question: if I block these unclean URLs with robots.txt, when the entire website is linked with them (but they all redirect to the clean version), will this affect the indexing status at all?

    Read the article

  • Reference Data Management and Master Data: Are They Related?

    - by Mala Narasimharajan
    Submitted By: Rahul Kamath
    Oracle Data Relationship Management (DRM) has always been extremely powerful as an Enterprise Master Data Management (MDM) solution that can help manage changes to master data in a way that influences enterprise structure, whether it be mastering the chart of accounts to enable financial transformation, or revamping organization structures to drive business transformation and operational efficiencies, or restructuring sales territories to enable equitable distribution of leads to sales teams following the acquisition of new products, or adding additional cost centers to enable fine-grained control over expenses. Increasingly, DRM is also being utilized by Oracle customers for reference data management, an emerging solution space that deserves some explanation.
    What is reference data? How does it relate to master data?
    Reference data is a close cousin of master data. While master data is challenged with problems of unique identification, may be more rapidly changing, requires consensus building across stakeholders and lends structure to business transactions, reference data is simpler, more slowly changing, but has semantic content that is used to categorize or group other information assets – including master data – and gives them contextual value. In fact, the creation of a new master data element may require new reference data to be created. For example, when a European company acquires a US business, chances are that they will now need to adapt their product line taxonomy to include a new category to describe the newly acquired US product line. Further, the cross-border transaction will also result in a revised geo hierarchy. The addition of new products represents changes to master data, while changes to product categories and geo hierarchy are examples of reference data changes.1
    The following list contains illustrative examples of reference data by type. Reference data types may include types and codes, business taxonomies, complex relationships and cross-domain mappings, or standards.
    Types & Codes: Transaction Codes; Lookup Tables (e.g., Gender, Marital Status, etc.); Status Codes; Role Codes; Domain Values
    Taxonomies: Industry Classification Categories and Codes, e.g., North America Industry Classification System (NAICS); Product Categories; Sales Territories (e.g., Geo, Industry Verticals, Named Accounts, Federal/State/Local/Defense); Market Segments; Universal Standard Products and Services Classification (UNSPSC), eCl@ss
    Relationships / Mappings: Product / Segment; Product / Geo; City to State to Postal Codes; Customer / Market Segment; Business Unit / Channel; Country Codes / Currency Codes / Financial Accounts; International Classification of Diseases (ICD), e.g., ICD-9 to ICD-10 mappings
    Standards: Calendars (e.g., Gregorian, Fiscal, Manufacturing, Retail, ISO 8601); Currency Codes (e.g., ISO); Country Codes (e.g., ISO 3166, UN); Date/Time, Time Zones (e.g., ISO 8601); Tax Rates
    Why manage reference data?
    Reference data carries contextual value and meaning, and therefore its use can drive business logic that helps execute a business process, create a desired application behavior, or provide meaningful segmentation to analyze transaction data. Further, mapping reference data often requires human judgment.
    Sample Use Cases of Reference Data Management
    Healthcare: Diagnostic Codes. The reference data challenges in the healthcare industry offer a case in point. Part of being HIPAA compliant requires medical practitioners to transition diagnosis codes from ICD-9 to ICD-10, a medical coding scheme used to classify diseases, signs and symptoms, causes, etc. The transition to ICD-10 has a significant impact on business processes, procedures, contracts, and IT systems. Since the two code sets, ICD-9 and ICD-10, offer diagnosis codes of very different levels of granularity, human judgment is required to map ICD-9 codes to ICD-10. The process requires collaboration and consensus building among stakeholders, much in the same way as does master data management. Moreover, to build reports to understand utilization, frequency and quality of diagnoses, medical practitioners may need to "cross-walk" mappings -- either forward to ICD-10 or backwards to ICD-9, depending upon the reporting time horizon.
    Spend Management: Product, Service & Supplier Codes. Similarly, as an enterprise looks to rationalize suppliers and leverage their spend, conforming supplier codes, as well as product and service codes, requires supporting multiple classification schemes that may include industry standards (e.g., UNSPSC, eCl@ss) or enterprise taxonomies. Aberdeen Group estimates that 90% of companies rely on spreadsheets and manual reviews to aggregate, classify and analyze spend data, and that data management activities account for 12-15% of the sourcing cycle and consume 30-50% of a commodity manager's time. Creating a common map across the extended enterprise to rationalize codes across procurement, accounts payable, general ledger, credit card, procurement card (P-card) as well as ACH and bank systems can cut sourcing costs, improve compliance, lower inventory stock, and free up talent to focus on value added tasks.
    Change Management: Point of Sale Transaction Codes and Product Codes. In the specialty finance industry, enterprises are confronted with usury laws – governed at the state and local level – that regulate financial product innovation as it relates to consumer loans, check cashing and pawn lending. To comply, it is important to demonstrate that transactions booked at the point of sale are posted against valid product codes that were on offer at the time of booking the sale. Since new products are being released in a steady stream, it is important to ensure timely and accurate mapping of point-of-sale transaction codes to the appropriate product and GL codes to comply with the changing regulations.
    Multi-National Companies: Industry Classification Schemes. As companies grow and expand across geographies, a typical challenge they encounter with reference data is reconciling the various versions of industry classification schemes in use across nations. While the United States, Mexico and Canada conform to the North American Industry Classification System (NAICS) standard, European Union countries choose different variants of the NACE industry classification scheme. Multi-national companies must manage the individual national NACE schemes and reconcile the differences across countries. Enterprises must invest in a reference data change management application to address the challenge of distributing reference data changes to downstream applications and assessing which applications were impacted by a given change.
    References
    1. Master Data versus Reference Data, Malcolm Chisholm, April 1, 2006.

    Read the article

  • Is LightDM broken? Autologin works

    - by Ben
    I recently upgraded my Mythbuntu box to 11.10. LightDM worked for a while, and then I configured autologin for my user account. Some time later, I removed some packages to slim down the system. Now when I log out (from either Unity or GNOME Shell) I lose the X session, and LightDM doesn't start back up. If I disable autologin through /etc/lightdm/lightdm.conf I get no login screen and no X. As I said, autologin works and gives me a Ubuntu Classic desktop. This is the only reference to lightdm in dmesg:
    [ 17.351023] type=1400 audit(1329530135.420:19): apparmor="STATUS" operation="profile_load" name="/usr/lib/lightdm/lightdm-guest-session-wrapper" pid=1097 comm="apparmor_parser"
    Can anyone help?

    Read the article
