Search Results

Search found 94348 results on 3774 pages for 'system data oracleclient'.

  • How do you encode Algebraic Data Types in a C#- or Java-like language?

    - by Jörg W Mittag
    There are some problems which are easily solved by Algebraic Data Types, for example, a List type can be very succinctly expressed as: data ConsList a = Empty | ConsCell a (ConsList a) consmap f Empty = Empty consmap f (ConsCell a b) = ConsCell (f a) (consmap f b) l = ConsCell 1 (ConsCell 2 (ConsCell 3 Empty)) consmap (+1) l This particular example is in Haskell, but it would be similar in other languages with native support for Algebraic Data Types. It turns out that there is an obvious mapping to OO-style subtyping: the datatype becomes an abstract base class and every data constructor becomes a concrete subclass. Here's an example in Scala: sealed abstract class ConsList[+T] { def map[U](f: T => U): ConsList[U] } object Empty extends ConsList[Nothing] { override def map[U](f: Nothing => U) = this } final class ConsCell[T](first: T, rest: ConsList[T]) extends ConsList[T] { override def map[U](f: T => U) = new ConsCell(f(first), rest.map(f)) } val l = new ConsCell(1, new ConsCell(2, new ConsCell(3, Empty))) l.map(1+) The only thing needed beyond naive subclassing is a way to seal classes, i.e. a way to make it impossible to add subclasses to a hierarchy. How would you approach this problem in a language like C# or Java? The two stumbling blocks I found when trying to use Algebraic Data Types in C# were: I couldn't figure out what the bottom type is called in C# (i.e. I couldn't figure out what to put into class Empty : ConsList< ??? >), and I couldn't figure out a way to seal ConsList so that no subclasses can be added to the hierarchy. What would be the most idiomatic way to implement Algebraic Data Types in C# and/or Java? Or, if it isn't possible, what would be the idiomatic replacement?
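
    A hedged C# sketch of one common encoding (illustrative, not from the original question; only the ConsList/Map/Empty names are carried over): give the abstract base class a private constructor so the only possible subclasses are the nested ones, which effectively seals the hierarchy, and since C# has no bottom type, expose Empty as a static member of each ConsList<T> rather than as a single ConsList<Nothing>.

        using System;

        public abstract class ConsList<T>
        {
            // Private constructor: only the nested classes below can derive from
            // ConsList<T>, which effectively seals the hierarchy.
            private ConsList() { }

            public abstract ConsList<U> Map<U>(Func<T, U> f);

            // No bottom type in C#, so each closed ConsList<T> carries its own Empty.
            public static readonly ConsList<T> Empty = new EmptyList();

            public static ConsList<T> Cons(T first, ConsList<T> rest)
            {
                return new ConsCell(first, rest);
            }

            private sealed class EmptyList : ConsList<T>
            {
                public override ConsList<U> Map<U>(Func<T, U> f) { return ConsList<U>.Empty; }
            }

            private sealed class ConsCell : ConsList<T>
            {
                private readonly T first;
                private readonly ConsList<T> rest;

                public ConsCell(T first, ConsList<T> rest) { this.first = first; this.rest = rest; }

                public override ConsList<U> Map<U>(Func<T, U> f)
                {
                    return ConsList<U>.Cons(f(first), rest.Map(f));
                }
            }
        }

        // Example usage:
        // var l = ConsList<int>.Cons(1, ConsList<int>.Cons(2, ConsList<int>.Cons(3, ConsList<int>.Empty)));
        // var incremented = l.Map(x => x + 1);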

    Read the article

  • How to Convert multiple sets of Data going from left to right to top to bottom the Pythonic way?

    - by ThinkCode
    Following is a sample of the sets of contacts for each company, going from left to right. ID Company ContactFirst1 ContactLast1 Title1 Email1 ContactFirst2 ContactLast2 Title2 Email2 1 ABC John Doe CEO [email protected] Steve Bern CIO [email protected] How do I get them to go top to bottom as shown? ID Company ContactFirst ContactLast Title Email 1 ABC John Doe CEO [email protected] 1 ABC Steve Bern CIO [email protected] I am hoping there is a Pythonic way of solving this task. Any pointers or samples are really appreciated! P.S.: In the actual file, there are 10 sets of contacts going from left to right and a few thousand such records. It is a CSV file, and I loaded it into MySQL to manipulate the data.
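
    A hedged Python sketch of one way to unpivot the file before loading it (the file names and the assumption of four columns per contact block come from the sample above, not from the real data):

        import csv

        FIELDS_PER_CONTACT = 4  # ContactFirstN, ContactLastN, TitleN, EmailN

        with open("contacts_wide.csv", newline="") as src, \
             open("contacts_long.csv", "w", newline="") as dst:
            reader = csv.reader(src)
            writer = csv.writer(dst)
            next(reader)  # skip the wide header row
            writer.writerow(["ID", "Company", "ContactFirst", "ContactLast", "Title", "Email"])
            for row in reader:
                record_id, company, contacts = row[0], row[1], row[2:]
                # Walk the trailing columns in blocks of four, one block per contact.
                for i in range(0, len(contacts), FIELDS_PER_CONTACT):
                    block = contacts[i:i + FIELDS_PER_CONTACT]
                    if any(field.strip() for field in block):  # skip empty contact slots
                        writer.writerow([record_id, company] + block)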

    Read the article

  • How to add data manually to a Core Data entity

    - by pankaj
    Hi, I am working with Core Data for the first time. I have just created an entity and attributes for that entity. I want to add some data inside the entity (you could say I want to add data to a table). Earlier, when I was using SQLite, I would add data using the terminal, but with Core Data I am not able to find a place where I can manually add data. I just want to add data to the entity and display it in a UITableView. I have gone through the Core Data documentation, but it does not explain how to add data manually; it explains how to add it programmatically, but I don't need to do it programmatically. I want to do it manually. Thanks in advance

    Read the article

  • How do you verify the correct data is in a data mart?

    - by blockcipher
    I'm working on a data warehouse and I'm trying to figure out how best to verify that data from our data cleansing (normalized) database makes it into our data marts correctly. I've done some searches, but the results so far talk more about ensuring things like constraints are in place and that you need to do data validation during the ETL process (e.g., dates are valid). The dimensions were pretty easy, as I could either leverage the primary key or write a very simple and verifiable query to get the data. The fact tables are more complex. Any thoughts? We're trying to make this very easy for a subject matter expert to run a couple of queries, see some data from both the data cleansing database and the data marts, and visually compare the two to ensure they are correct.
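
    A hedged SQL sketch of one way to spot-check a fact table (table and column names here are invented for illustration): aggregate the same measure in both the cleansing database and the data mart, then list only the grains where the two sides disagree, so the subject matter expert only has to look at the exceptions.

        -- Compare daily row counts and amount totals between the cleansed source
        -- and the mart; the FULL OUTER JOIN also surfaces days present on one side only.
        SELECT COALESCE(s.sale_date, m.sale_date) AS sale_date,
               s.row_count    AS source_rows,   m.row_count    AS mart_rows,
               s.total_amount AS source_amount, m.total_amount AS mart_amount
        FROM (SELECT sale_date, COUNT(*) AS row_count, SUM(amount) AS total_amount
              FROM cleansing.sales
              GROUP BY sale_date) s
        FULL OUTER JOIN
             (SELECT sale_date, COUNT(*) AS row_count, SUM(amount) AS total_amount
              FROM mart.fact_sales
              GROUP BY sale_date) m
          ON s.sale_date = m.sale_date
        WHERE COALESCE(s.row_count, -1)    <> COALESCE(m.row_count, -1)
           OR COALESCE(s.total_amount, -1) <> COALESCE(m.total_amount, -1);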

    Read the article

  • How do I set default values on new properties for existing entities after lightweight Core Data migration?

    - by Moritz
    I've successfully completed lightweight migration on my Core Data model. My custom entity Vehicle received a new property 'tirePressure', which is an optional property of type double with the default value 0.00. When 'old' Vehicles are fetched from the store (Vehicles that were created before the migration took place) the value for their 'tirePressure' property is nil. (Is that expected behavior?) So I thought: "No problem, I'll just do this in the Vehicle class:" - (void)awakeFromFetch { [super awakeFromFetch]; if (nil == self.tirePressure) { [self willChangeValueForKey:@"tirePressure"]; self.tirePressure = [NSNumber numberWithDouble:0.0]; [self didChangeValueForKey:@"tirePressure"]; } } Since "change processing is explicitly disabled around" awakeFromFetch, I thought the calls to willChangeValueForKey and didChangeValueForKey would mark 'tirePressure' as dirty. But they don't. Every time these Vehicles are fetched from the store, 'tirePressure' continues to be nil despite having saved the context.
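
    A hedged Objective-C sketch of one workaround (the method name and the one-off fetch-and-save pass are illustrative, not from the original question): instead of patching values inside awakeFromFetch, where change processing is disabled, run a single pass after migration that assigns the default through the normal accessor, so the objects are marked dirty and the value survives the next save.

        - (void)fillInDefaultTirePressure:(NSManagedObjectContext *)context
        {
            // Fetch only the migrated Vehicles that still have no tirePressure value.
            NSFetchRequest *request = [[NSFetchRequest alloc] init];
            [request setEntity:[NSEntityDescription entityForName:@"Vehicle"
                                           inManagedObjectContext:context]];
            [request setPredicate:[NSPredicate predicateWithFormat:@"tirePressure == nil"]];

            NSError *error = nil;
            NSArray *vehicles = [context executeFetchRequest:request error:&error];
            [request release]; // assuming manual reference counting, as in the question

            for (NSManagedObject *vehicle in vehicles) {
                // Setting the value outside awakeFromFetch marks the object as changed,
                // so the default is actually written out when the context is saved.
                [vehicle setValue:[NSNumber numberWithDouble:0.0] forKey:@"tirePressure"];
            }
            [context save:&error];
        }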

    Read the article

  • Select data from three different tables with null data

    - by user3678972
    I am new to SQL. My question is how to get data from three different tables even when some of it is null. I have tried a query as below: SELECT * FROM [USER] JOIN [Location] ON ([Location].UserId = [USER].Id) JOIN [ParentChild] ON ([ParentChild].UserId = [USER].Id) WHERE ParentId=7 which I found from this link. It works, but it does not fetch all of the data associated with the ParentId: it only returns rows that have matching data in every table, and it omits users that have no row in the Location table even though they fall under the given ParentId. For example: UserId ParentId 1 7 8 7 For userId 8 there is data in the Location table, so all of its data is fetched. But there is no data for userId 1 in the Location table, so the query returns nothing for it. I want every row; if there is no Location data for a userId, the query can return null for those columns. Is that possible? I hope everyone can understand my problem.
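
    A hedged sketch of one way to get that behavior (column lists abbreviated; anything not shown in the question is an assumption): keep the inner join to ParentChild, but make the Location join a LEFT JOIN so users without a Location row are still returned, with NULL in the Location columns.

        -- Users under ParentId 7 are kept even when no Location row exists;
        -- the Location columns simply come back as NULL for those users.
        SELECT [USER].*, [Location].*
        FROM [ParentChild]
        JOIN [USER] ON [USER].Id = [ParentChild].UserId
        LEFT JOIN [Location] ON [Location].UserId = [USER].Id
        WHERE [ParentChild].ParentId = 7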

    Read the article

  • How does Core Data determine if an NSObject's data can be dropped?

    - by Kevin
    In the app I am working on now, I was storing about 500 images in Core Data. I have since pulled those images out and store them in the file system, but in the process I found that the app would crash on the device if I had an array of 500 objects with image data in them. An array of 500 object IDs, with the image data in those objects, worked fine. The 500 objects without the image data also worked fine. I found that I got the best performance with both an array of object IDs and image data stored on the filesystem instead of in Core Data. The conclusion I came to was that holding an object in an array tells Core Data I am "using" that object, so Core Data holds on to its data. Is this correct?

    Read the article

  • How to handle encryption key conflicts when synchronizing data?

    - by Rafael
    Assume that there is data that gets synchronized between several devices. The data is protected with a symmetric encryption algorithm and a key. The key is stored on each device and encrypted with a password. When a user changes the password, only the key gets re-encrypted. Under normal circumstances, when there is a good network connection to other peers, the current key gets synchronized and all data on the new device gets encrypted with the same key. But how should situations be handled where a new device doesn’t have a network connection and, e.g., creates its own new but incompatible key? How can usability be kept as high as possible under such circumstances? The application could detect that there is no network and hence refuse to start. That’s very bad usability in my opinion, because the application isn’t functional at all in this case. I don’t consider this a solution. The application could ignore the missing network connection and create a new key. But what to do when the application gains a network connection? There will be several incompatible keys, and some parts of the underlying data could only be encrypted with one key and other parts with another key. The situation would get worse if there were more than two keys and the application had to ask for a password every time an object needed to be decrypted with a different key. It is very messy and time consuming to try to re-encrypt all data that is encrypted with another key with a main key. What should the main key even be in this case? The oldest key? The key with the most encrypted objects? What if the key got synchronized but not all objects that were encrypted with this particular key? How should the user know which particular password the application is asking for and why it probably takes very long to re-encrypt the data? It’s very hard to describe encryption “issues” to users. So far I haven’t found an acceptable solution, nor some kind of generic strategy. Do you have some hints about a concrete strategy or some books / papers that describe synchronization of symmetrically encrypted data with keys that could cause conflicts?

    Read the article

  • Exception using SQLiteDataReader

    - by galford13x
    I'm making a custom SQLite wrapper. This is meant to allow a persistent connection to a database. However, I receive an exception when calling this function twice. public Boolean DatabaseConnected(string databasePath) { bool exists = false; if (ConnectionOpen()) { this.Command.CommandText = string.Format(DATABASE_QUERY); using (reader = this.Command.ExecuteReader()) { while (reader.Read()) { if (string.Compare(reader[FILE_NAME_COL_HEADER].ToString(), databasePath, true) == 0) { exists = true; break; } } reader.Close(); } } return exists; } I use the above function to check if the database is currently open before executing a command or trying to open a database. The first time I execute the function, it executes with no issue. After that, the reader = this.Command.ExecuteReader() call throws an exception: "Object reference not set to an instance of an object." StackTrace: at System.Data.SQLite.SQLiteStatement.Dispose() at System.Data.SQLite.SQLite3.Reset(SQLiteStatement stmt) at System.Data.SQLite.SQLite3.Step(SQLiteStatement stmt) at System.Data.SQLite.SQLiteDataReader.NextResult() at System.Data.SQLite.SQLiteDataReader..ctor(SQLiteCommand cmd, CommandBehavior behave) at System.Data.SQLite.SQLiteCommand.ExecuteReader(CommandBehavior behavior) at System.Data.SQLite.SQLiteCommand.ExecuteReader() at EveTraderApi.Database.SQLDatabase.DatabaseConnected(String databasePath) in C:\Documents and Settings\galford13x\My Documents\Visual Studio 2008\Projects\EveTrader\EveTraderApi\Database\Database.cs:line 579 at EveTraderApi.Database.SQLDatabase.OpenSQLiteDB(String filename) in C:\Documents and Settings\galford13x\My Documents\Visual Studio 2008\Projects\EveTrader\EveTraderApi\Database\Database.cs:line 119 at EveTraderApiExample.Form1.CreateTableDataTypes() in C:\Documents and Settings\galford13x\My Documents\Visual Studio 2008\Projects\EveTrader\EveTraderApiExample\Form1.cs:line 89 at EveTraderApiExample.Form1.Button1_ExecuteCommand(Object sender, EventArgs e) in C:\Documents and Settings\galford13x\My Documents\Visual Studio 2008\Projects\EveTrader\EveTraderApiExample\Form1.cs:line 35 at System.Windows.Forms.Control.OnClick(EventArgs e) at System.Windows.Forms.Button.OnClick(EventArgs e) at System.Windows.Forms.Button.OnMouseUp(MouseEventArgs mevent) at System.Windows.Forms.Control.WmMouseUp(Message& m, MouseButtons button, Int32 clicks) at System.Windows.Forms.Control.WndProc(Message& m) at System.Windows.Forms.ButtonBase.WndProc(Message& m) at System.Windows.Forms.Button.WndProc(Message& m) at System.Windows.Forms.Control.ControlNativeWindow.OnMessage(Message& m) at System.Windows.Forms.Control.ControlNativeWindow.WndProc(Message& m) at System.Windows.Forms.NativeWindow.DebuggableCallback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam) at System.Windows.Forms.UnsafeNativeMethods.DispatchMessageW(MSG& msg) at System.Windows.Forms.Application.ComponentManager.System.Windows.Forms.UnsafeNativeMethods.IMsoComponentManager.FPushMessageLoop(Int32 dwComponentID, Int32 reason, Int32 pvLoopData) at System.Windows.Forms.Application.ThreadContext.RunMessageLoopInner(Int32 reason, ApplicationContext context) at System.Windows.Forms.Application.ThreadContext.RunMessageLoop(Int32 reason, ApplicationContext context) at System.Windows.Forms.Application.Run(Form mainForm) at EveTraderApiExample.Program.Main() in C:\Documents and Settings\galford13x\My Documents\Visual Studio 2008\Projects\EveTrader\EveTraderApiExample\Program.cs:line 18 at System.AppDomain._nExecuteAssembly(Assembly assembly, String[] args) at
System.AppDomain.ExecuteAssembly(String assemblyFile, Evidence assemblySecurity, String[] args) at Microsoft.VisualStudio.HostingProcess.HostProc.RunUsersAssembly() at System.Threading.ThreadHelper.ThreadStart_Context(Object state) at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state) at System.Threading.ThreadHelper.ThreadStart()

    Read the article

  • Data Protection Manager System Protection Backups Failing

    - by TrueDuality
    I'm just starting to set up DPM 2010 in a test environment with a Domain Controller and a File Server. Everything seems to be working fairly well and I can get all of my backup jobs to succeed except for the "Computer\System Protection" backups. Both servers are running fully up-to-date 64-bit Windows Server 2008 R2 Enterprise with Service Pack 1. The error that is being provided is: DPM cannot create a backup because Windows Server Backup (WSB) on the protected computer encountered an error (WSB Event ID: 517, WSB Error Code: 0x8078001D). (ID 30229 Details: Internal error code: 0x809909FB) This Microsoft Knowledge Base article describes the issue perfectly and provides a hotfix. I downloaded the hotfix, moved it onto the affected server, attempted to run it, and received the following error: The update is not applicable to your computer. I've verified that I have indeed downloaded the 64-bit version. According to this thread the hotfix got rolled into Service Pack 1, yet I'm still experiencing the issue. Both machines do have the Windows Server Backup feature installed. Can anybody point me in the right direction? What am I missing?

    Read the article

  • Moving data files failing

    - by Miles Hayler
    Trying to migrate data from C: to D: via the SBS console is failing. The wizard starts running but drops out in the first few seconds. I'll post the full logs, but the important lines appear to be as follows: An exception of type 'Type: System.IO.FileNotFoundException, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' has occurred. Message: The system cannot find the file specified. (Exception from HRESULT: 0x80070002) Stack: at TaskScheduler.TaskSchedulerClass.GetFolder(String Path) at Microsoft.WindowsServerSolutions.Common.WindowsTaskScheduler..ctor(String taskPath, String taskName) BaseException: Microsoft.WindowsServerSolutions.Storage.Common.StorageException: GetServerBackupTaskStatus: fail to find the task --- ErrorCode:0 I've been googling for days with no luck. I have found that mscorlib is a component of .net, and I've discovered multiple instances of the file in %windir%, %windir%\winsxs, %windir%\Microsoft.net Anyone come across and fixed this one before? --------------------------------------------------------- [1516] 110315.190856.1105: Storage: Initializing...C:\Program Files\Windows Small Business Server\Bin\MoveData.exe [1516] 110315.190856.2875: Storage: Data Store to be moved: Exchange [1516] 110315.190856.5305: TaskScheduler: Exception System.IO.FileNotFoundException: [1516] 110315.190856.5605: Exception: --------------------------------------- An exception of type 'Type: System.IO.FileNotFoundException, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' has occurred. Timestamp: 03/15/2011 19:08:56 Message: The system cannot find the file specified. (Exception from HRESULT: 0x80070002) Stack: at TaskScheduler.TaskSchedulerClass.GetFolder(String Path) at Microsoft.WindowsServerSolutions.Common.WindowsTaskScheduler..ctor(String taskPath, String taskName) [1516] 110315.190856.5625: Storage: Exception Microsoft.WindowsServerSolutions.Common.WindowsTaskSchedulerException: [1516] 110315.190856.5635: Exception: --------------------------------------- [b]An exception of type 'Type: Microsoft.WindowsServerSolutions.Common.WindowsTaskSchedulerException, Common, Version=6.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' has occurred.[/b] Timestamp: 03/15/2011 19:08:56 Message: Failed to find the task path Stack: at Microsoft.WindowsServerSolutions.Common.WindowsTaskScheduler..ctor(String taskPath, String taskName) at Microsoft.WindowsServerSolutions.Storage.Common.ServerBackupUtility.GetServerBackupTaskStatus() --------------------------------------- An exception of type 'Type: System.IO.FileNotFoundException, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' has occurred. Timestamp: 03/15/2011 19:08:56 Message: The system cannot find the file specified. (Exception from HRESULT: 0x80070002) Stack: at TaskScheduler.TaskSchedulerClass.GetFolder(String Path) at Microsoft.WindowsServerSolutions.Common.WindowsTaskScheduler..ctor(String taskPath, String taskName) [1516] 110315.190856.5665: Storage: Error Retrieving Server Backup Task Status: ErrorCode:0 BaseException: Microsoft.WindowsServerSolutions.Storage.Common.StorageException: GetServerBackupTaskStatus: fail to find the task ---> ErrorCode:0 BaseException: Microsoft.WindowsServerSolutions.Common.WindowsTaskSchedulerException: Failed to find the task path ---> System.IO.FileNotFoundException: The system cannot find the file specified. 
(Exception from HRESULT: 0x80070002) at TaskScheduler.TaskSchedulerClass.GetFolder(String Path) at Microsoft.WindowsServerSolutions.Common.WindowsTaskScheduler..ctor(String taskPath, String taskName) --- End of inner exception stack trace --- at Microsoft.WindowsServerSolutions.Common.WindowsTaskScheduler..ctor(String taskPath, String taskName) at Microsoft.WindowsServerSolutions.Storage.Common.ServerBackupUtility.GetServerBackupTaskStatus() --- End of inner exception stack trace --- at Microsoft.WindowsServerSolutions.Storage.Common.ServerBackupUtility.GetServerBackupTaskStatus() at Microsoft.WindowsServerSolutions.Storage.MoveData.Helper.get_ServerBackupTaskState() [1516] 110315.190857.6216: Storage: Backup Task State: Unknown [1516] 110315.190857.9347: Storage: Launching the Move Data Wizard! [1516] 110315.190857.9397: Wizard: Admin:QueryNextPage(null) = Storage.MoveDataWizard.GettingStartedPage [1516] 110315.190857.9417: Wizard: TOC Storage.MoveDataWizard.GettingStartedPage is on ExpectedPath [1516] 110315.190857.9577: Wizard: Storage.MoveDataWizard.GettingStartedPage entered [1516] 110315.190857.9657: Wizard: Admin:QueryNextPage(Storage.MoveDataWizard.GettingStartedPage) = Storage.MoveDataWizard.DiagnoseDataStorePage [1516] 110315.190857.9657: Wizard: TOC Storage.MoveDataWizard.DiagnoseDataStorePage is on ExpectedPath [1516] 110315.190857.9657: Wizard: Admin:QueryNextPage(Storage.MoveDataWizard.DiagnoseDataStorePage) = Storage.MoveDataWizard.NewDataStoreLocationPage [1516] 110315.190857.9657: Wizard: TOC Storage.MoveDataWizard.NewDataStoreLocationPage is on ExpectedPath [1516] 110315.190857.9657: Wizard: Admin:QueryNextPage(Storage.MoveDataWizard.NewDataStoreLocationPage) = null [1516] 110315.190857.9697: Wizard: ---------------------------------- [1516] 110315.190857.9697: Wizard: The pages visted: [1516] 110315.190857.9697: Wizard: Current Page := [TOC Storage.MoveDataWizard.GettingStartedPage] [1516] 110315.190857.9697: Wizard: [TOC] : TOC Storage.MoveDataWizard.DiagnoseDataStorePage [1516] 110315.190857.9697: Wizard: [TOC] : TOC Storage.MoveDataWizard.NewDataStoreLocationPage [1516] 110315.190857.9697: Wizard: Step 1 of 3 [1516] 110315.190907.0406: Wizard: Admin:QueryNextPage(Storage.MoveDataWizard.GettingStartedPage) = Storage.MoveDataWizard.DiagnoseDataStorePage [1516] 110315.190907.0416: Wizard: Storage.MoveDataWizard.GettingStartedPage exited with the button: Next [1516] 110315.190907.0416: WizardChainEngine Next Clicked: Going to page {0}.: Storage.MoveDataWizard.DiagnoseDataStorePage [1516] 110315.190907.0496: Wizard: Storage.MoveDataWizard.DiagnoseDataStorePage entered [1516] 110315.190907.0606: Wizard: Admin:QueryNextPage(Storage.MoveDataWizard.DiagnoseDataStorePage) = Storage.MoveDataWizard.NewDataStoreLocationPage [1516] 110315.190907.0606: Wizard: Admin:QueryNextPage(Storage.MoveDataWizard.NewDataStoreLocationPage) = null [1516] 110315.190907.0606: Wizard: ---------------------------------- [1516] 110315.190907.0606: Wizard: The pages visted: [1516] 110315.190907.0606: Wizard: [TOC] visited: TOC Storage.MoveDataWizard.GettingStartedPage [1516] 110315.190907.0606: Wizard: Current Page := [TOC Storage.MoveDataWizard.DiagnoseDataStorePage] [1516] 110315.190907.0616: Wizard: [TOC] : TOC Storage.MoveDataWizard.NewDataStoreLocationPage [1516] 110315.190907.0616: Wizard: Step 2 of 3 [19772] 110315.190907.0656: Storage: Starting System Diagnosis [19772] 110315.190907.0656: Storage: Getting Data Store Information [19772] 110315.190907.1086: Storage: Create the list of storage 
and DB directory path [19772] 110315.190907.1246: Messaging: Begin Microsoft.WindowsServerSolutions.Messaging.Management.MessagingTasks..ctor [19772] 110315.190907.1546: Messaging: Begin Microsoft.WindowsServerSolutions.Messaging.Management.MessagingTasks.Initialize [19772] 110315.190907.1596: Messaging: Begin Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.Initialize [19772] 110315.190907.1606: Messaging: Exchange install path: C:\Program Files\Microsoft\Exchange Server\bin [19772] 110315.190908.4157: Messaging: E12 Monad runspace created ID: Microsoft.PowerShell [19772] 110315.190908.4237: Messaging: Begin Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.StaticExecute [19772] 110315.190908.4287: Messaging: Executed management shell command: get-exchangeserver [19772] 110315.190910.2369: Messaging: End Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.StaticExecute [19772] 110315.190910.2369: Messaging: End Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.Initialize [19772] 110315.190910.5699: Messaging: Begin Microsoft.WindowsServerSolutions.Messaging.Management.MessagingTasks.GatherAdminInfo [19772] 110315.190910.5699: Messaging: Begin Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.StaticExecute [19772] 110315.190910.5719: Messaging: Executed management shell command: get-user -Identity "dmagroup.local\Administrator" [19772] 110315.190911.0870: Messaging: End Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.StaticExecute [19772] 110315.190911.0880: Messaging: Begin Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.StaticExecute [19772] 110315.190911.0880: Messaging: Executed management shell command: get-mailbox -Identity "d2ae2bf0-48a7-4ce9-9e72-bb3c765454ac" [19772] 110315.190911.1300: Messaging: End Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.StaticExecute [19772] 110315.190911.1310: Messaging: User Administrator is mail enabled and can use MessagingManagement to send mail. [19772] 110315.190911.1310: Messaging: Email address used for user: [email protected] [19772] 110315.190911.1440: Messaging: Begin Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.StaticExecute [19772] 110315.190911.1440: Messaging: Executed management shell command: get-group -Identity "Domain Admins" [19772] 110315.190911.1630: Messaging: End Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.StaticExecute [19772] 110315.190911.1640: Messaging: User Administrator is a member of Domain Admins and can use MessagingManagement to manage Exchange. 
[19772] 110315.190911.1640: Messaging: End Microsoft.WindowsServerSolutions.Messaging.Management.MessagingTasks.GatherAdminInfo [19772] 110315.190911.1640: Messaging: MessagingManagement enabled for Exchange management: True [19772] 110315.190911.1640: Messaging: MessagingManagement enabled for mail submission: True [19772] 110315.190911.1640: Messaging: End Microsoft.WindowsServerSolutions.Messaging.Management.MessagingTasks.Initialize [19772] 110315.190911.1640: Messaging: End Microsoft.WindowsServerSolutions.Messaging.Tasks.TaskMoveExchangeData.CreateDataStoreDriveList [19772] 110315.190911.1670: Messaging: Begin Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.Initialize [19772] 110315.190911.1670: Messaging: Begin Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.StaticExecute [19772] 110315.190911.1670: Messaging: Executed management shell command: get-storagegroup -Server "SERVER01" [19772] 110315.190911.2990: Messaging: End Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.StaticExecute [19772] 110315.190911.3070: Messaging: Begin Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.Initialize [19772] 110315.190911.3070: Messaging: Begin Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.StaticExecute [19772] 110315.190911.3070: Messaging: Executed management shell command: get-mailboxdatabase -Server "SERVER01" [19772] 110315.190911.4440: Messaging: End Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.StaticExecute [19772] 110315.190911.4520: Messaging: Begin Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.Initialize [19772] 110315.190911.4520: Messaging: Begin Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.StaticExecute [19772] 110315.190911.4520: Messaging: Executed management shell command: get-publicfolderdatabase -Server "SERVER01" [19772] 110315.190911.5240: Messaging: End Microsoft.WindowsServerSolutions.Messaging.Management.MessagingRunspace.StaticExecute [19772] 110315.190911.5510: Storage: Data Store Drive/s Details:Name=C:\,Size=12675712420 [19772] 110315.190911.5510: Storage: Data Store Size Details: Current Total Size=12675712420 Required Size=12675712420 [19772] 110315.190911.5510: Storage: MoveData Task can move the Data Store=True [19772] 110315.190911.8401: Storage: An error was encountered when performing system diagnosis : ErrorCode:0 BaseException: Microsoft.WindowsServerSolutions.Storage.Common.StorageException: WMI error occurred while accessing drive ---> System.Management.ManagementException: Not found at System.Management.ManagementException.ThrowWithExtendedInfo(ManagementStatus errorCode) at System.Management.ManagementObjectCollection.ManagementObjectEnumerator.MoveNext() at Microsoft.WindowsServerSolutions.Storage.Common.DriveUtil.IsDriveRemovable(String drive) --- End of inner exception stack trace --- at Microsoft.WindowsServerSolutions.Storage.Common.DriveUtil.IsDriveRemovable(String drive) at Microsoft.WindowsServerSolutions.Storage.Common.DataStoreInfo.LoadAvailableDrives() at Microsoft.WindowsServerSolutions.Storage.Common.MoveDataUtil.CanMoveData(DataStoreInfo storeInfo, MoveDataError& error) at Microsoft.WindowsServerSolutions.Storage.MoveData.DiagnoseDataStorePagePresenter.DiagnoseDataStore(Object sender, DoWorkEventArgs args) [1516] 110315.190912.0331: Storage: An error occured during the execution: System.Reflection.TargetInvocationException: Exception has been 
thrown by the target of an invocation. ---> ErrorCode:0 BaseException: Microsoft.WindowsServerSolutions.Storage.Common.StorageException: Diagnosing the Data Store failed (see the inner exception) ---> ErrorCode:0 BaseException: Microsoft.WindowsServerSolutions.Storage.Common.StorageException: WMI error occurred while accessing drive ---> System.Management.ManagementException: Not found at System.Management.ManagementException.ThrowWithExtendedInfo(ManagementStatus errorCode) at System.Management.ManagementObjectCollection.ManagementObjectEnumerator.MoveNext() at Microsoft.WindowsServerSolutions.Storage.Common.DriveUtil.IsDriveRemovable(String drive) --- End of inner exception stack trace --- at Microsoft.WindowsServerSolutions.Storage.Common.DriveUtil.IsDriveRemovable(String drive) at Microsoft.WindowsServerSolutions.Storage.Common.DataStoreInfo.LoadAvailableDrives() at Microsoft.WindowsServerSolutions.Storage.Common.MoveDataUtil.CanMoveData(DataStoreInfo storeInfo, MoveDataError& error) at Microsoft.WindowsServerSolutions.Storage.MoveData.DiagnoseDataStorePagePresenter.DiagnoseDataStore(Object sender, DoWorkEventArgs args) at System.ComponentModel.BackgroundWorker.WorkerThreadStart(Object argument) --- End of inner exception stack trace --- at Microsoft.WindowsServerSolutions.Storage.MoveData.DiagnoseDataStorePagePresenter.backgroundWorker_RunWorkerCompleted(Object sender, RunWorkerCompletedEventArgs e) --- End of inner exception stack trace --- at System.RuntimeMethodHandle._InvokeMethodFast(Object target, Object[] arguments, SignatureStruct& sig, MethodAttributes methodAttributes, RuntimeTypeHandle typeOwner) at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture, Boolean skipVisibilityChecks) at System.Delegate.DynamicInvokeImpl(Object[] args) at System.Windows.Forms.Control.InvokeMarshaledCallbackDo(ThreadMethodEntry tme) at System.Windows.Forms.Control.InvokeMarshaledCallbackHelper(Object obj) at System.Threading.ExecutionContext.runTryCode(Object userData) at System.Runtime.CompilerServices.RuntimeHelpers.ExecuteCodeWithGuaranteedCleanup(TryCode code, CleanupCode backoutCode, Object userData) at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state) at System.Windows.Forms.Control.InvokeMarshaledCallback(ThreadMethodEntry tme) at System.Windows.Forms.Control.InvokeMarshaledCallbacks() at System.Windows.Forms.Control.WndProc(Message& m) at System.Windows.Forms.Control.ControlNativeWindow.WndProc(Message& m) at System.Windows.Forms.NativeWindow.DebuggableCallback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam) at System.Windows.Forms.UnsafeNativeMethods.DispatchMessageW(MSG& msg) at System.Windows.Forms.Application.ComponentManager.System.Windows.Forms.UnsafeNativeMethods.IMsoComponentManager.FPushMessageLoop(Int32 dwComponentID, Int32 reason, Int32 pvLoopData) at System.Windows.Forms.Application.ThreadContext.RunMessageLoopInner(Int32 reason, ApplicationContext context) at System.Windows.Forms.Application.ThreadContext.RunMessageLoop(Int32 reason, ApplicationContext context) at Microsoft.WindowsServerSolutions.Common.Wizards.Framework.WizardFrameView.Create() at Microsoft.WindowsServerSolutions.Common.Wizards.Framework.WizardChainEngine.Launch() at Microsoft.WindowsServerSolutions.Storage.MoveData.MainClass.LaunchMoveDataWizard() at Microsoft.WindowsServerSolutions.Storage.MoveData.MainClass.Main(String[] args)

    Read the article

  • Lenovo System Update Breaks Windows Live

    - by wolfvilleian
    Hey everyone, I've been racking my brain (and fingers, from typing) trying to solve this issue, to no avail. I have a Lenovo computer and I installed their system update tool to install all my missing drivers. However, after this tool is installed, Windows Live 2011 breaks: it will no longer sign in, giving error number 8e5e0247, and none of the solutions online have helped. It appears that a language setting somewhere gets set to en_ms, and I'm en_ca. My computer is running Windows 7 x64. When I try to sign on to Messenger it gives an error that (with some research) means your locale or language is not supported. I've searched my computer for any reference to en_ms but find none. A few other things seem to have broken as well: when a UAC box comes up it is no longer able to identify the publisher of anything, and the indexing service does not work (I'm not sure if the indexing issue is related, but the UAC issue happened right after installation). I had this issue before but I don't remember how I fixed it; I believe it had something to do with environment variables. When it goes to sign in, it gets as far as "Loading contacts", then stops and goes back to the sign-in screen. Has anyone seen this before? Thanks

    Read the article

  • How to switch from Core Data automatic lightweight migration to manual?

    - by Jaanus
    My situation is similar to this question. I am using lightweight migration with the following code, fairly vanilla from Apple docs and other SO threads. It runs upon app startup when initializing the Core Data stack. NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithBool:YES], NSMigratePersistentStoresAutomaticallyOption, [NSNumber numberWithBool:YES], NSInferMappingModelAutomaticallyOption, nil]; NSError *error = nil; NSString *storeType = nil; if (USE_SQLITE) { // app configuration storeType = NSSQLiteStoreType; } else { storeType = NSBinaryStoreType; } persistentStoreCoordinator = [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:[self managedObjectModel]]; // the following line sometimes crashes on app startup if (![persistentStoreCoordinator addPersistentStoreWithType:storeType configuration:nil URL:[self persistentStoreURL] options:options error:&error]) { // handle the error } For some users, especially with slower devices, I have crashes confirmed by logs at the indicated line. I understand that a fix is to switch this to manual mapping and migration. What is the recipe to do that? The long way for me would be to go through all Apple docs, but I don't recall there being good examples and tutorials specifically for schema migration.
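
    A hedged Objective-C sketch of the manual route (the temporary destination URL and the error handling are placeholders): read the existing store's metadata, check whether it is compatible with the current model, and if not drive the migration yourself with NSMigrationManager. Because this runs outside addPersistentStoreWithType:, it can be moved off the launch path, given a progress UI, or retried before the migrated store is finally added to the coordinator.

        NSDictionary *meta =
            [NSPersistentStoreCoordinator metadataForPersistentStoreOfType:storeType
                                                                        URL:[self persistentStoreURL]
                                                                      error:&error];
        NSManagedObjectModel *destinationModel = [self managedObjectModel];
        if (![destinationModel isConfiguration:nil compatibleWithStoreMetadata:meta]) {
            NSManagedObjectModel *sourceModel =
                [NSManagedObjectModel mergedModelFromBundles:nil forStoreMetadata:meta];
            NSMappingModel *mapping =
                [NSMappingModel inferredMappingModelForSourceModel:sourceModel
                                                  destinationModel:destinationModel
                                                             error:&error];
            NSMigrationManager *manager =
                [[NSMigrationManager alloc] initWithSourceModel:sourceModel
                                               destinationModel:destinationModel];
            // Migrate into a temporary file, then swap it into place on success.
            NSURL *tempURL = [[self persistentStoreURL] URLByAppendingPathExtension:@"migrating"];
            [manager migrateStoreFromURL:[self persistentStoreURL]
                                    type:storeType
                                 options:nil
                        withMappingModel:mapping
                        toDestinationURL:tempURL
                         destinationType:storeType
                      destinationOptions:nil
                                   error:&error];
        }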

    Read the article

  • Core Data migration failing with "Can't find model for source store" but managedObjectModel for source is present

    - by Ira Cooke
    I have a Cocoa application using Core Data, which is now at the 4th version of its managed object model. My managed object model contains abstract entities, but so far I have managed to get migration working by creating appropriate mapping models and creating my persistent store using addPersistentStoreWithType:configuration:options:error and with the NSMigratePersistentStoresAutomaticallyOption set to YES. NSDictionary *optionsDictionary = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:NSMigratePersistentStoresAutomaticallyOption]; NSURL *url = [NSURL fileURLWithPath: [applicationSupportFolder stringByAppendingPathComponent: @"MyApp.xml"]]; NSError *error=nil; [theCoordinator addPersistentStoreWithType:NSXMLStoreType configuration:nil URL:url options:optionsDictionary error:&error] This works fine when I migrate from model version 3 to 4, which is a migration that involves adding attributes to several entities. Now when I try to add a new model version (version 5), the call to addPersistentStoreWithType returns nil and the error remains empty. The migration from 4 to 5 involves adding a single attribute. I am struggling to debug the problem and have checked all of the following: The source database is in fact at version 4 and the persistentStoreCoordinator's managed object model is at version 5. The 4-to-5 mapping model as well as managed object models for versions 4 and 5 are present in the resources folder of my built application. I've tried various model upgrade paths. Strangely, I find that upgrading from an early version 3 to 5 works, but upgrading from 4 to 5 fails. I've tried adding a custom entity migration policy for migration of the entity whose attributes are changing; in this case I overrode the method beginEntityMapping:manager:error:. Interestingly, this method does get called when migration works (i.e. when I migrate from 3 to 4, or from 3 to 5), but it does not get called in the case that fails (4 to 5). I'm pretty much at a loss as to where to proceed. Any ideas to help debug this problem would be much appreciated.

    Read the article

  • Bridging the Gap in Cloud, Big Data, and Real-time

    - by Dain C. Hansen
    With all the buzz around big data and cloud computing, it is easy to overlook one of your most precious commodities: your data. Today’s businesses cannot stand still when it comes to data. Market success now depends on speed, volume, complexity, and keeping pace with the latest data integration breakthroughs. Are you up to speed with big data, cloud integration, and real-time analytics? Join us in this three-part blog series where we’ll look at each component in more detail. Meet us online on October 24th, where we’ll take your questions about the issues you are facing in this brave new world of integration. Let’s start first with Cloud. What happens with your data when you decide to implement a private cloud architecture? Or a public cloud? Data integration solutions play a vital role in migrating data simply, efficiently, and reliably to the cloud; they are a necessary ingredient of any platform-as-a-service strategy because they support cloud deployments with data-layer application integration between on-premise and cloud environments of all kinds. For private cloud architectures, consolidation of your databases and data stores is an important step toward receiving the full benefits of cloud computing. Private cloud integration requires bidirectional replication between heterogeneous systems to allow you to perform data consolidation without interrupting your business operations. In addition, bulk loading and transforming data into and out of your private cloud is a crucial step for companies moving to a private cloud. There is also a need to manage data services as part of SOA/BPM solutions that enable agile application delivery and help build shared data services for organizations. But what about the public cloud? If you have moved your data to a public cloud application, you may also need to connect your on-premise enterprise systems and the cloud environment by moving data in bulk or as real-time transactions across geographies. For both public and private cloud architectures, Oracle offers a complete and extensible set of integration options that span not only data integration but also service and process integration, security, and management. For those companies investing in Oracle Cloud, you can move your data through Oracle SOA Suite using REST APIs to Oracle Messaging Cloud Service, a new service that lets applications deployed in Oracle Cloud securely and reliably communicate over Java Messaging Service. As an example of loading and transforming data into other public clouds, Oracle Data Integrator supports a knowledge module for Salesforce.com, now available on AppExchange. Other third-party knowledge modules are being developed by customers and partners every day. To learn more about how to leverage Oracle’s Data Integration products for the cloud, join us live for the Data Integration Breakthroughs Webcast on October 24th at 10 AM PST.

    Read the article

  • How to Secure a Data Role by Multiple Business Units

    - by Elie Wazen
    In this post we will see how a Role can be data-secured by multiple Business Units (BUs). Separate Data Roles are generally created for each BU if a corresponding data template generates roles on the basis of the BU dimension. The advantage of creating a policy with a rule that includes multiple BUs is that while mapping these roles in HCM Role Provisioning Rules, fewer entries need to be made. This can facilitate maintenance for enterprises with a large number of Business Units. Note: The example below applies as well if the securing entity is Inventory Organization. Let us take for example the case of a user provisioned with the "Accounts Payable Manager - Vision Operations" Data Role in Fusion Applications. This user will be able to access Invoices in Vision Operations but will not be able to see Invoices in Vision Germany. Figure 1. A User with a Data Role restricting them to Data from BU: Vision Operations. With the role granted above, this is what the user will see when they attempt to select Business Units while searching for AP Invoices. Figure 2. The List of Values of Business Units is limited to a single one. This is the effect of the Data Role granted to that user, as can be seen in Figure 1. In order to create a data role that secures by multiple BUs, we need to start by creating a condition that groups the Business Units we want to include in that data role. This is accomplished by creating a new condition against the BU View. That condition will later be used to create a data policy for our newly created Role. The BU View is a Database resource and is accessed from APM as seen in the search below. Figure 3. Viewing a Database Resource in APM. The next step is to create a new condition, in which we define a SQL predicate that includes two BUs (the IDs below refer to Vision Operations and Vision Germany). At this point we have simply created a standalone condition. We have not used this condition yet, and security is therefore not affected. Figure 4. Custom Role that inherits the Purchase Order Overview Duty. We are now ready to create our Data Policy. In APM, we search for our newly created Role and navigate to “Find Global Policies”. We query the Role we want to secure and navigate to view its global policies. Figure 5. The Job Role we plan on securing. We can see that the role was not defined with a Data Policy, so we will create one that uses the condition we created earlier. Figure 6. Creating a New Data Policy. In the General Information tab, we have to specify the DB Resource that the Security Policy applies to: in our case this is the BU View. Figure 7. Data Policy Definition - Selection of the DB Resource we will secure by. In the Rules tab, we make the rule applicable to multiple values of the DB Resource we selected in the previous tab. This is where we associate the condition we created against the BU view with this data policy, by entering the Condition name in the Condition field. Figure 8. Data Policy Rule. The last step of defining the Data Policy consists of explicitly selecting the Actions that are governed by this Data Policy. In this case, for example, we select the Actions displayed below in the right pane. Once the record is saved, we are ready to use our newly secured Data Role. Figure 9. Data Policy Actions. We can now see a new Data Policy associated with our Role. Figure 10. Role is now secured by a Data Policy. We now assign that new Role to the User.
Of course this does not have to be done in OIM and can be done using a Provisioning Rule in HCM. Figure 11. Role assigned to the User who previously was granted the Vision Ops secured role. Once that user accesses the Invoices Workarea this is what they see: In the image below the LOV of Business Unit returns the two values defined in our data policy namely: Vision Operations and Vision Germany Figure 12. The List Of Values of Business Units now includes the two we included in our data policy. This is the effect of the data role granted to that user as can be seen in Figure 11

    Read the article

  • First Day of Data Integration Track at Oracle OpenWorld 2012

    - by Irem Radzik
    OpenWorld started at full speed for us today with a great set of sessions in the Data Integration track. After the exciting keynote session on Oracle Database 12c in the morning, Brad Adelberg, VP of Development for Data Integration products, presented Oracle’s data integration product strategy. His session highlighted the new requirements for data integration to achieve pervasive and continuous access to trusted data. The new requirements and product focus areas presented in this session are: provide access to any data at any source, on premise or in the cloud; enable zero-downtime operations and maximum performance; leverage real-time data for accurate business insights; and ensure high-quality data is used across the enterprise. During the session Brad walked through how Oracle’s data integration products, Oracle Data Integrator, Oracle GoldenGate, Oracle Enterprise Data Quality, and Oracle Data Service Integrator, deliver on these requirements and how recent product releases build on this strategy. Soon after Brad’s session we heard from a panel of Oracle GoldenGate customers (St. Jude Medical, Equifax, and Bank of America) about how they achieved zero-downtime operations using Oracle GoldenGate. The panel presented different use cases of GoldenGate, from active-active replication to offloading reporting. St. Jude Medical’s implementation in particular, which involves the alert management system for patients who use its pacemakers, reminded me that in some cases downtime of mission-critical systems can be a matter of life or death. It is very comforting to hear that GoldenGate delivers highly reliable continuous availability for life-saving medical systems. In the afternoon, Nick Wagner from the Product Management team and I followed the customer panel with a review of Oracle GoldenGate 11gR2’s new features. Many questions we received from the audience were about GoldenGate’s new Integrated Capture for Oracle Database and the enhanced Conflict Management features, as well as how GoldenGate compares to Oracle Streams. In addition to giving details on GoldenGate’s unique capability to capture changed data with a direct integration to the Oracle DBMS engine, we reminded the audience that enhancements to Oracle GoldenGate will continue, while Streams will be primarily maintained. Last but not least, Tim Garrod and Ryan Fonnett from Raymond James presented a unified real-time data integration solution using Oracle Data Integrator and GoldenGate for their operational data store (ODS). The ODS supports application services across the enterprise, and providing timely data is a critical requirement. In this solution, Oracle GoldenGate does the log-based change data capture for Oracle Data Integrator’s near real-time data integration between heterogeneous systems. As Raymond James’ ODS supports mission-critical services for their advisors, the project team had to set up this integration environment to be highly available. During the session, Ryan and Tim explained how they use ODI to enable automated process execution and “always-on” integration processes. Their presentation included two demonstrations that focused on CDC patterns deployed with ODI and on automated multi-instance execution and monitoring. We are very grateful to Tim and Ryan for their very well-prepared presentation at OpenWorld this year. Day 2 (Tuesday) will also be a busy day in our track.
    In addition to the Fusion Middleware Innovation Awards ceremony at 11:45am at Moscone West 3001, we have the following DI sessions: Real-World Operational Reporting Customer Panel, 11:45am, Moscone West 3005; Oracle Data Integrator Product Update and Future Strategy, 1:15pm, Moscone West 3005; High-volume OLTP with Oracle GoldenGate: Best Practices from Comcast, 1:15pm, Moscone West 3005; and Everything You Need to Know about Monitoring Oracle GoldenGate, 5pm, Moscone West 3005. If you are at OpenWorld please join us in these sessions. For a full review of the data integration track at OpenWorld please see our Focus-On document.

    Read the article

  • Documentation in Oracle Retail Merchandising System (RMS) and Oracle Retail Fiscal Management System (ORFM), Release 13.2.4

    - by Oracle Retail Documentation Team
    The Patch Release 13.2.4 of the Oracle Retail Merchandising System (RMS) and its module, Oracle Retail Fiscal Management (ORFM), is now available from My Oracle Support. End User Documentation Enhancements: The following summarizes the highlights of changes made to the documentation in conjunction with the new Brazil-related functionality: Foundation chapter in the Oracle Retail Merchandising System (RMS)/Sales Audit (ReSA) Brazil Localization User Guide: This chapter was updated with a non-base Localization Flexible Attribution Solution (LFAS) section that addresses the addition of several new custom attributes to Items and Suppliers through non-base LFAS for Brazil; it also addresses the extension of the Retail Tax Integration Layer (RTIL) through the Oracle Retail Merchandising System (RMS) and Oracle Retail Fiscal Management System (ORFM). ORFM User Guide: The Purchase Order chapter was updated to include schedule-related updates for a Nota Fiscal. The Fiscal Documents chapter was updated to include information on creating a new NF and searching for details using Vendor Product Number. Oracle Retail Fiscal Management/RMS Brazil Localization Implementation Guide: The Implementation Checklist chapter was updated with a note on multi-currency functionality. The Batch Processes chapter was updated with information on the NF EDI batch. The following summarizes the highlights of changes made to the documentation in conjunction with the new technical certifications (see the RMS 13.2.4 Release Notes for more information): Installation Guides for RMS and for ORFM/RMS Brazil: These installation guides were updated extensively to account for the multiple technical certification enhancements in 13.2.4. White Paper: How to Upgrade from WebLogic 11g 10.3.3 to WebLogic 11g 10.3.4 (Doc ID: 1432575.1): See the previous blog entry regarding this new White Paper. New Documents on My Oracle Support for Brazil Localization: Overview and Interfaces Tax Vendor Integration (Doc ID: 1424048.1): Oracle chooses to integrate with a third-party tax expert to deliver the Brazilian solution. Oracle has built the Retail Tax Integration Layer (RTIL) as the key integration component to support the integration of the Oracle suite of products with external tax vendors. This paper addresses the RTIL integration interfaces with TaxWeb, providing guidance on the typical integration interfaces and operations that must be supported by other tax solutions in the Brazilian market. Oracle Retail Fiscal Management/RMS Brazil Localization: Localization Flexible Attribute Solution (LFAS) (Doc ID: 1418509.1): This white paper covers the definition of custom attributes in the Localization Flexible Attribute Solution (LFAS) and enables retailers to perform data conversion changes. Retailers can add several new custom attributes to Items and Suppliers through non-base LFAS for Brazil and extend the Retail Tax Integration Layer (RTIL) through the Oracle Retail Merchandising System (RMS) and Oracle Retail Fiscal Management System (RFM).
    Documents Published in RMS and ORFM Release 13.2.4: Oracle Retail Merchandising System Release Notes; Oracle Retail Merchandising System Installation Guide; Oracle Retail Merchandising System User Guide and Online Help; Oracle Retail Sales Audit (ReSA) User Guide and Online Help; Oracle Retail Merchandising System Operations Guide; Oracle Retail Merchandising System Data Model; Oracle Retail Merchandising Batch Schedule; Oracle Retail Merchandising Implementation Guide; Oracle Retail POS Suite 13.4.1 / Merchandising Operations Management 13.2.4 Implementation Guide; Oracle Retail Fiscal Management Data Model; Oracle Retail Fiscal Management/RMS Brazil Localization Installation Guide; Oracle Retail Fiscal Management/RMS Brazil Localization Implementation Guide; Oracle Retail Fiscal Management User Guide and Online Help.

    Read the article

  • Could not load type System.Configuration.NameValueSectionHandler

    If you upgrade older .NET sites from 1.x to 2.x or greater, you may encounter this error when you have configuration settings that look like this: <section name="CacheSettings" type="System.Configuration.NameValueFileSectionHandler, System"/> Once you try to run this on an upgraded appdomain, you may encounter this error: An error occurred creating the configuration section handler for CacheSettings: Could not load type 'System.Configuration.NameValueSectionHandler' from assembly 'System.Configuration, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'. Microsoft moved a bunch of the Configuration-related classes into a separate assembly, System.Configuration, and created a new class, ConfigurationManager. This presents its own challenges, which I've blogged about in the past if you are wondering where ConfigurationManager is located. However, the above error is separate. The issue in this case is that NameValueSectionHandler is still in the System assembly, but is in the System.Configuration namespace. This causes confusion which can be alleviated by using the following section definition: <section name="CacheSettings" type="System.Configuration.NameValueSectionHandler, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" /> (you can remove the extra line breaks within the type=) With this in place, your web application should once more be able to load up the NameValueSectionHandler. I do recommend using your own custom configuration section handlers instead of appSettings, and I would further suggest that you not use NameValueSectionHandler if you can avoid it, but instead prefer a strongly typed configuration section handler.
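
    A small hedged C# sketch of reading such a section once the type reference resolves (the "CacheTimeout" key is a made-up example): NameValueSectionHandler sections come back from ConfigurationManager.GetSection as a NameValueCollection, which is also part of why a strongly typed ConfigurationSection tends to be the nicer long-term option.

        using System.Collections.Specialized;
        using System.Configuration; // requires a reference to System.Configuration.dll

        static class CacheSettingsReader
        {
            public static string GetCacheTimeout()
            {
                // NameValueSectionHandler materializes the section as a NameValueCollection.
                var cacheSettings =
                    (NameValueCollection)ConfigurationManager.GetSection("CacheSettings");

                // Key name is illustrative; use whatever keys the section actually defines.
                return cacheSettings["CacheTimeout"];
            }
        }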

    Read the article

  • New Study Guide: "Oracle Solaris 11 System Administration"

    - by Harold Green
    A helpful new resource for Solaris 11 exam preparation has just been released. "Oracle Solaris 11 System Administration" by author and educator Bill Calkins covers effective installation and administration of an Oracle Solaris 11 system. In addition to being a valuable, comprehensive study guide, the book also serves as a complete reference guide for the everyday tasks of an Oracle Solaris System Administrator. This book can be a valuable addition to your preparation for the Oracle Solaris 11 Advanced System Administration (1Z1-822) certification exam. This exam, combined with the Oracle Certified Associate, Oracle Solaris 11 System Administrator (OCA) certification and a training requirement, will earn you the Oracle Certified Professional, Oracle Solaris 11 System Administrator (OCP) certification. This valuable credential is designed for Oracle Solaris System Administrators with a strong foundation in the Oracle Solaris 11 Operating System as well as a fundamental understanding of the UNIX operating system, commands, and utilities. This certification covers core elements such as configuring network interfaces, managing swap configurations, crash dumps, and core files. The 822 exam is currently in beta at the greatly discounted rate of $50 USD, but the beta period will soon be closing (likely the end of this month, June 2013), so take advantage of the opportunity to be one of the first to hold this new certification. Bill Calkins also recently posted some tips for taking Oracle Solaris 11 certification exams.

    Read the article

  • xorg-edgers PPA with llvmpipe breaks AMD APU system

    - by linux_RRT
    I've read before where this had happened to another user with the same system... I was running Ubuntu 12.04 LTS with Kernel 3.2.0.29 and decided to give xorg-edgers a try. I purged fglrx* and xorg* before beginning. I upgraded with sudo apt-get upgrade -d The system downloaded and installed 109MB worth of data to the system, including llvmpipe which I am very unfamiliar with, and Kernel 3.5.0.11. The system was then rebooted to finalize the upgrade. The system boots to a black screen and then tells me "The system is running in low-graphics mode". Did I miss a step in the install? Or do the newest open-source drivers just not work with my hardware? I realize this hardware (APU) is some of the newer development. I dropped to command via the fallback menu and attempted to boot lightdm as root, but the system hangs in 'Configuring kernel parameters' at Starting initializes zram swaping. ...and then it just sits there. The other thing that concerns me is the output at the top of the screen that says: could not write bytes: Broken pipe Does llvmpipe work for this type of system? To be clear the system is: MSI x370-206us Laptop Radeon HD 6320 AMD e450 APU 1.67ghz dual-core Any help would be much appreciated. Like I said, I'm pretty sure I followed the right order of operations for the install procedure, but I was curious if anyone with similar hardware had experienced anything similar.

    Read the article

  • Areas of support needed when attempting to roll out a new software system

    In general, I think most people tend to be resistant to new systems, or to change in general, because they fear the unknown. Change means that their normal routine will be interrupted until the new routine becomes as familiar as the old one. In addition, the fear of failure also generates resistance to change. Why would workers want to move away from a process that has worked successfully for them in the past? Their fears overshadow any benefits a new system or business process will bring to their work life. Areas of support needed when attempting to roll out a new software system:
    Executive/Upper Management Support: If there is no support from the top of the organization, how can employees be expected to support the new system?
    Proper Training: Employees need to train on a new system prior to its rollout. The more training employees receive on a new system, the more comfortable they will be with it and the more accepting of the change, because they can see how the changes will benefit them.
    Employee Incentives: One way to reinforce the need for employees to use a new system is to offer incentives that ensure the system will actually be used.
    Employee Discipline/Termination: If employees adamantly refuse to use the new system after several warnings, they need to be formally reprimanded. If that does not work, the employer is forced to replace them.

    Read the article

  • How would I define "GetDataFromNumber" so that my class contains a definition?

    - by JB
    My code gets an error saying: 'Eagle_Eye_Class_Finder.GetSchedule' does not contain a definition for 'GetDataFromNumber' and no extension method 'GetDataFromNumber'. using System; using System.IO; using System.Data; using System.Text; using System.Drawing; using System.Data.OleDb; using System.Collections; using System.Windows.Forms; using System.ComponentModel; using System.Drawing.Printing; using System.Collections.Generic; namespace Eagle_Eye_Class_Finder { /// This form is the entry form, it is the first form the user will see when the app is run. /// public class Form1 : System.Windows.Forms.Form { private System.Windows.Forms.TextBox textBox1; private System.Windows.Forms.ProgressBar progressBar1; private System.Windows.Forms.PictureBox pictureBox1; private System.Windows.Forms.Button button2; private System.Windows.Forms.DateTimePicker dateTimePicker1; private IContainer components; private Timer timer1; private BindingSource form1BindingSource; public static Form Mainform = null; // creates new instance of second form YOURCLASSSCHEDULE SecondForm = new YOURCLASSSCHEDULE(); public Form1() { InitializeComponent(); // TODO: Add any constructor code after InitializeComponent call } /// Clean up any resources being used. protected override void Dispose(bool disposing) { if (disposing) { if (components != null) { components.Dispose(); } } base.Dispose(disposing); } #region Windows Form Designer generated code /// <summary> /// Required method for Designer support - do not modify /// the contents of this method with the code editor. /// </summary> private void InitializeComponent() { this.components = new System.ComponentModel.Container(); System.ComponentModel.ComponentResourceManager resources = new System.ComponentModel.ComponentResourceManager(typeof(Form1)); this.textBox1 = new System.Windows.Forms.TextBox(); this.progressBar1 = new System.Windows.Forms.ProgressBar(); this.pictureBox1 = new System.Windows.Forms.PictureBox(); this.button2 = new System.Windows.Forms.Button(); this.dateTimePicker1 = new System.Windows.Forms.DateTimePicker(); this.timer1 = new System.Windows.Forms.Timer(this.components); this.form1BindingSource = new System.Windows.Forms.BindingSource(this.components); ((System.ComponentModel.ISupportInitialize)(this.pictureBox1)).BeginInit(); ((System.ComponentModel.ISupportInitialize)(this.form1BindingSource)).BeginInit(); this.SuspendLayout(); // // textBox1 // this.textBox1.BackColor = System.Drawing.SystemColors.ActiveCaption; this.textBox1.DataBindings.Add(new System.Windows.Forms.Binding("Text", this.form1BindingSource, "Text", true, System.Windows.Forms.DataSourceUpdateMode.OnValidation, null, "900456317")); this.textBox1.Location = new System.Drawing.Point(328, 280); this.textBox1.Name = "textBox1"; this.textBox1.Size = new System.Drawing.Size(208, 20); this.textBox1.TabIndex = 2; this.textBox1.TextChanged += new System.EventHandler(this.textBox1_TextChanged); // // progressBar1 // this.progressBar1.Location = new System.Drawing.Point(258, 410); this.progressBar1.MarqueeAnimationSpeed = 10; this.progressBar1.Name = "progressBar1"; this.progressBar1.Size = new System.Drawing.Size(344, 8); this.progressBar1.TabIndex = 3; this.progressBar1.Click += new System.EventHandler(this.progressBar1_Click); // // pictureBox1 // this.pictureBox1.BackColor = System.Drawing.SystemColors.ControlLightLight; this.pictureBox1.BorderStyle = System.Windows.Forms.BorderStyle.Fixed3D; this.pictureBox1.Image = ((System.Drawing.Image)(resources.GetObject("pictureBox1.Image"))); 
this.pictureBox1.Location = new System.Drawing.Point(680, 400); this.pictureBox1.Name = "pictureBox1"; this.pictureBox1.Size = new System.Drawing.Size(120, 112); this.pictureBox1.TabIndex = 4; this.pictureBox1.TabStop = false; this.pictureBox1.Click += new System.EventHandler(this.pictureBox1_Click); // // button2 // this.button2.Font = new System.Drawing.Font("Mistral", 15.75F, System.Drawing.FontStyle.Regular, System.Drawing.GraphicsUnit.Point, ((byte)(0))); this.button2.Image = ((System.Drawing.Image)(resources.GetObject("button2.Image"))); this.button2.Location = new System.Drawing.Point(699, 442); this.button2.Name = "button2"; this.button2.Size = new System.Drawing.Size(78, 28); this.button2.TabIndex = 5; this.button2.Text = "OK"; this.button2.Click += new System.EventHandler(this.button2_Click); // // dateTimePicker1 // this.dateTimePicker1.Location = new System.Drawing.Point(336, 104); this.dateTimePicker1.Name = "dateTimePicker1"; this.dateTimePicker1.Size = new System.Drawing.Size(200, 20); this.dateTimePicker1.TabIndex = 6; this.dateTimePicker1.ValueChanged += new System.EventHandler(this.dateTimePicker1_ValueChanged); // // timer1 // this.timer1.Tick += new System.EventHandler(this.timer1_Tick); // // form1BindingSource // this.form1BindingSource.DataSource = typeof(Eagle_Eye_Class_Finder.Form1); // // Form1 // this.AcceptButton = this.button2; this.AutoScaleBaseSize = new System.Drawing.Size(5, 13); this.BackgroundImage = ((System.Drawing.Image)(resources.GetObject("$this.BackgroundImage"))); this.BackgroundImageLayout = System.Windows.Forms.ImageLayout.Stretch; this.ClientSize = new System.Drawing.Size(856, 556); this.Controls.Add(this.dateTimePicker1); this.Controls.Add(this.button2); this.Controls.Add(this.pictureBox1); this.Controls.Add(this.progressBar1); this.Controls.Add(this.textBox1); this.Name = "Form1"; this.Text = "Eagle Eye Class Finder"; this.Load += new System.EventHandler(this.Form1_Load); ((System.ComponentModel.ISupportInitialize)(this.pictureBox1)).EndInit(); ((System.ComponentModel.ISupportInitialize)(this.form1BindingSource)).EndInit(); this.ResumeLayout(false); this.PerformLayout(); } #endregion /// The main entry point for the application. [STAThread] static void Main() { Application.Run(new Form1()); } public void Form1_Load(object sender, System.EventArgs e) { } public void textBox1_TextChanged(object sender, System.EventArgs e) { //allows only numbers to be entered in textbox string Str = textBox1.Text.Trim(); double Num; bool isNum = double.TryParse(Str, out Num); if (isNum) Console.ReadLine(); else MessageBox.Show("Enter A Valid ID Number!"); } public void button2_Click(object sender, System.EventArgs e) { string text = textBox1.Text; Mainform = this; this.Hide(); GetSchedule myScheduleFinder = new GetSchedule(); string result = myScheduleFinder.GetDataFromNumber(text);<<<-----MY PROBLEM if (!string.IsNullOrEmpty(result)) { MessageBox.Show(result); } else { MessageBox.Show("Enter A Valid ID Number!"); } } public void dateTimePicker1_ValueChanged(object sender, System.EventArgs e) { } public void pictureBox1_Click(object sender, System.EventArgs e) { } public void progressBar1_Click(object sender, EventArgs e) { //this.progressBar1 = new System.progressBar1(); //progressBar1.Maximum = 200; //progressBar1.Minimum = 0; //progressBar1.Step = 20; } private void timer1_Tick(object sender, EventArgs e) { //if (progressBar1.Value >= 200 ) //{ //progressBar1.Value = 0; //} //return; //} //progressBar1.Value != 20; } } }
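    The compiler error simply means that the GetSchedule class has no member named GetDataFromNumber, so one has to be defined before button2_Click can call it. Below is a minimal sketch of what such a method might look like; the class and method names come from the question, but the backing data and return format are assumptions made purely for illustration:

        using System.Collections.Generic;

        namespace Eagle_Eye_Class_Finder
        {
            public class GetSchedule
            {
                // Hypothetical lookup table standing in for whatever store the real app uses.
                private static readonly Dictionary<string, string> schedulesById =
                    new Dictionary<string, string>
                    {
                        { "900456317", "MWF 9:00 - Intro to Programming, Room 204" }
                    };

                // Returns the schedule text for a student ID, or null when the ID is unknown,
                // which lines up with the string.IsNullOrEmpty check in button2_Click.
                public string GetDataFromNumber(string idNumber)
                {
                    string schedule;
                    return schedulesById.TryGetValue(idNumber, out schedule) ? schedule : null;
                }
            }
        }

    With a definition along those lines in place, the call string result = myScheduleFinder.GetDataFromNumber(text); in button2_Click will compile; a real implementation would presumably query the actual schedule data source rather than an in-memory dictionary.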

    Read the article

< Previous Page | 27 28 29 30 31 32 33 34 35 36 37 38  | Next Page >