Search Results

Search found 42367 results on 1695 pages for 'resource files'.

  • OIM 11g - Multi-valued attribute reconciliation of a child form

    - by user604275
    This topic gives a brief description of how to reconcile a multi-valued child form attribute from a flat file. The format of the flat file is (an example):

      ManagementDomain1|Entitlement1|DIRECTORY SERVER,EMAIL
      ManagementDomain2|Entitlement2|EMAIL PROVIDER INSTANCE - UMS,EMAIL VERIFICATION

    In OIM there will be a parent form for the fields Management Domain and Entitlement. Reconciliation will assign Servers (which are multi-valued) to the corresponding Management Domain and Entitlement. In the flat file, the multi-valued fields are separated by a comma (,). In the Design Console, create a form with 'Server Name' as a field and make it a child form. Open the corresponding Resource Object and add this field for reconciliation; while adding it, select the 'Multivalued' check box (please find the attached screen shot on how to add it, Child Table.docx). Open the process definition and add the child form fields for reconciliation, then click the 'Create Reconciliation Profile' button on the Resource Object tab. The API methods used for child form reconciliation are:

    1. reconEventKey = reconOpsIntf.createReconciliationEvent(resObjName, reconData, false);
       Passing 'false' here indicates that the event is not yet finished, because the child table data is still to be added.

    2. reconOpsIntf.providingAllMultiAttributeData(reconEventKey, RECON_FIELD_IN_RO, true);
       RECON_FIELD_IN_RO is the field that we added in the Resource Object for reconciliation (please refer to the screen shot).

    3. reconOpsIntf.addDirectBulkMultiAttributeData(reconEventKey, RECON_FIELD_IN_RO, bulkChildDataMapList);
       bulkChildDataMapList is built as below:

         List<Map> bulkChildDataMapList = new ArrayList<Map>();
         for (int i = 0; i < stokens.length; i++) {
             Map<String, String> attributeMap = new HashMap<String, String>();
             String serverName = stokens[i].toUpperCase();
             attributeMap.put("Server Name", serverName);
             bulkChildDataMapList.add(attributeMap);
         }

    4. reconOpsIntf.finishReconciliationEvent(reconEventKey);

    5. reconOpsIntf.processReconciliationEvent(reconEventKey);

    Now we have to register the plug-in, import the metadata into MDS, and then create a scheduled job to run the reconciliation.

    Read the article

  • Legitimate use of the Windows "Documents" folder in programs.

    - by romkyns
    Anyone who likes their Documents folder to contain only things they place there knows that the standard Documents folder is completely unsuitable for this task. Every program seems to want to put its settings, data, or something equally irrelevant into the Documents folder, despite the fact that there are folders specifically for this job (see the summary at the end of this post). So that this doesn't sound empty, take my personal "Documents" folder as an example. I don't ever use it, in that I never, under any circumstances, save anything into this folder myself. And yet, it contains 46 folders and 3 files at the top level, for a total of 800 files in 500 folders. That's 190 MB of "documents" I didn't create. Obviously any actual documents would immediately get lost in this mess. My question is: can anything be done to improve the situation sufficiently to make "Documents" useful again, say over the next 5 years? Can programmers be somehow educated en masse not to use it as a dumping ground? Could the OS start reporting some "fake" location hidden under AppData through the existing APIs, while only allowing Explorer and the various Open/Save dialogs to know where the "real" Documents folder resides? Or are any attempts completely futile or even unnecessary?

    For the record, here's a quick summary of the standard directories that should be used instead of "Documents":
    - RoamingAppData for user-specific data and settings. This is the directory to use for user-specific non-temporary data. Anything placed here will be available on any machine that a given user logs on to in networks where this is configured. Do not place large files here though, because they slow down login/logout in such environments.
    - LocalAppData for user-and-machine-specific data and settings. This data differs for every user and every machine. This is also where very large user-specific data should be placed.
    - ProgramData for machine-specific data and settings. These are the same regardless of which user is logged on, and will not roam to other machines in a network.
    - GetTempPath for all files that may be wiped without loss of data when not in use. This is also the place for things like caches, because like temporary data, a cache does not need to be backed up. Place your huge cache here and you'll save your user some backup trouble.
    "Documents" itself should only ever be used if the user specified it manually by entering a path or selecting it in a Save dialog. That is the only time it is ever appropriate to save stuff in "Documents".
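    To make the summary concrete, here is a minimal sketch in Python of how a program can resolve each of these locations on Windows (the environment-variable names are the documented way to find them; on non-Windows systems they will simply be unset):

      import os
      import tempfile

      # User-specific data that should roam with the profile (RoamingAppData).
      print("RoamingAppData:", os.environ.get("APPDATA"))
      # User-and-machine-specific data, including large files (LocalAppData).
      print("LocalAppData:", os.environ.get("LOCALAPPDATA"))
      # Machine-specific data shared by all users (ProgramData).
      print("ProgramData:", os.environ.get("PROGRAMDATA"))
      # Disposable files and caches (the GetTempPath equivalent).
      print("Temp:", tempfile.gettempdir())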

    Read the article

  • Unable to connect to Android phone using Bluetooth

    - by archerry
    How do I connect my laptop to my HTC Wildfire through Bluetooth? The laptop is running Xubuntu 11.10, and I have updated Blueman to version 1.23 through their Launchpad PPA. The phone is running Android version 2.2.1. I can 'pair' and 'trust' the phone, but I can neither 'send files to device...' nor 'browse files on device...'. I am able to connect to and browse the phone using my netbook running Xubuntu 10.04 and Blueman version 1.21.

    Read the article

  • Migrate Rhythmbox from one computer to another with a different username

    - by deshmukh
    I want to migrate Rhythmbox from one computer to another. I have different usernames on the two computers. I will need to carry over music files, covers, play counts, ratings, playlists, etc. Merely copying the music files and .local/share/rhythmbox does not work (I guess because the music locations are different on the two computers). What is the best way to achieve this? At a minimum I would like to carry over ratings and playlists.
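    One approach that may work (a rough sketch, assuming the library database lives at ~/.local/share/rhythmbox/rhythmdb.xml and that song locations are stored in it as file:// URIs; the user names below are hypothetical): copy the Rhythmbox data across as before, then rewrite the path prefix in the database to match the new username before starting Rhythmbox.

      from pathlib import Path

      old = "file:///home/olduser/Music"  # hypothetical old library root
      new = "file:///home/newuser/Music"  # hypothetical new library root

      db = Path.home() / ".local/share/rhythmbox/rhythmdb.xml"
      text = db.read_text(encoding="utf-8")
      db.with_suffix(".bak").write_text(text, encoding="utf-8")  # keep a backup
      db.write_text(text.replace(old, new), encoding="utf-8")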

    Read the article

  • Setting MTU on Exalogic

    - by csoto
    For many reasons, a system administrator may want to change the MTU settings of a server. But in a system like Exalogic, which contains lots of interconnected nodes and other various components, it's important to understand how this applies to the different networks. For example, when bringing up bonding of InfiniBand, an error like the following may be thrown:

      Bringing up interface bond1: SIOCSIFMTU: Invalid argument

    Both scripts ifcfg-ib0 and ifcfg-ib1 (in the /etc/sysconfig/network-scripts/ directory) have MTU set to 65500, which is a valid MTU value only if all IPoIB slaves operate in connected mode and are configured with the same value, so the line below must be added to both network scripts and the network restarted:

      CONNECTED_MODE=yes

    By the way, an error of the form "SIOCSIFMTU: Invalid argument" indicates that the requested MTU was rejected by the kernel. Typically this is due to it exceeding the maximum value supported by the interface hardware, in which case you must either reduce the MTU to a supported value or obtain more capable hardware. This problem has been seen when trying to modify the MTU using the ifconfig command, as in the example below:

      [root@elxxcnxx ~]# ifconfig ib1 mtu 65520
      SIOCSIFMTU: Invalid argument

    It's important to insist that in most cases the nodes must be rebooted after the MTU size has been changed. Although in some circumstances it may work without a reboot, that is not how it is typically documented.

    Now, in order to reduce memory consumption and improve performance for network traffic received on IPoIB-related interfaces, it is recommended to reduce the MTU value in the interface configuration files for IPoIB-related bonds from 65520 to 64000. The change needs to be made to the interface configuration files under the /etc/sysconfig/network-scripts directory and applies to the configuration files for bonds over IPoIB-related slave devices, for example /etc/sysconfig/network-scripts/ifcfg-bond1. However, keep in mind that the numeric portion of the interface filenames corresponding to IPoIB interfaces is expected to vary across compute nodes and vServers, and so cannot be relied upon to identify which interface files are for bonds over IPoIB rather than EoIB-related slave interfaces. To fix these MTU values to the recommended settings, there are very useful instructions and a script in MOS Note 1624434.1, applicable to both physical and virtual configurations of Exalogic.

    Regarding the recommended MTU value for EoIB-related interfaces, the maximum appropriate value is 1500. If for some reason a vServer has been created with a higher value (set in the /etc/sysconfig/network-scripts/ifcfg-bond0 file), then it must be fixed. An error like the following may be thrown under this circumstance:

      [root@vServer ~]# service network restart
      ...
      Bringing up interface bond0:  SIOCSIFMTU: Invalid argument

    Also, an error like the one below can be seen in the /var/log/messages file of the vServer:

      kernel: T5074835532 [mlx4_vnic] eth1:vnic_change_mtu:360: failed: new_mtu 64000 2026

    MOS Note 1611657.1 is very useful for this purpose.
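    The supported way to apply these values is the script from the MOS note; purely to illustrate the shape of the change, a minimal Python sketch, under the assumption that ifcfg-bond1 is already known to be an IPoIB bond (as noted above, the file name alone does not prove that):

      from pathlib import Path

      cfg = Path("/etc/sysconfig/network-scripts/ifcfg-bond1")
      lines = cfg.read_text().splitlines()
      # Lower the IPoIB bond MTU from 65520 to the recommended 64000.
      lines = ["MTU=64000" if line.startswith("MTU=") else line for line in lines]
      # Connected mode is required for large IPoIB MTU values.
      if not any(line.startswith("CONNECTED_MODE=") for line in lines):
          lines.append("CONNECTED_MODE=yes")
      cfg.write_text("\n".join(lines) + "\n")

    After a change like this, restart the network and reboot the node as described above.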

    Read the article

  • Microsoft SQL Server 2008 R2 System Databases

    Most software developers spend little time understanding the inner workings of the database management systems (DBMS) they use to store data for their applications. I personally place myself in this group. In my case, I have used various versions of Microsoft SQL Server (2000, 2005, and 2008 R2) and only recently learned how valuable the system databases really are, while preparing to deliver a lecture on "SQL Server 2008 R2 System Databases".

    So what are system databases in MS SQL Server, and why should I know them? Microsoft uses system databases to support the SQL Server DBMS, much like a developer uses config files or database tables to support an application. Each of these system databases provides specific functionality that allows MS SQL Server to function:

      Name          Database File             Log File
      Master        master.mdf                mastlog.ldf
      Resource      mssqlsystemresource.mdf   mssqlsystemresource.ldf
      Model         model.mdf                 modellog.ldf
      MSDB          msdbdata.mdf              msdblog.ldf
      Distribution  distmdl.mdf               distmdl.ldf
      TempDB        tempdb.mdf                templog.ldf

    Master Database
    If you have used MS SQL Server then you should recognize the Master database, especially if you have used SQL Server Management Studio (SSMS) to connect to a user-created database. MS SQL Server requires the Master database in order for the DBMS to start, due to the information it stores. Examples of data stored in the Master database: user logins, linked servers, configuration information, and information on user databases.

    Resource Database
    Honestly, I never knew this database even existed until I started to research SQL Server system databases, largely because the Resource database is hidden from users. In fact, its database files are stored within the Binn folder instead of the standard MS SQL Server database folder path. This database contains all the system objects that can be accessed by all other databases. In short, it contains all the system views and stored procedures that appear in every user database as system information. One of the many benefits of storing system views and stored procedures in a single hidden database is that it simplifies upgrading a SQL Server database; maintenance is also decreased, since only one code base has to be maintained for all of the system views and procedures.

    Model Database
    The Model database, as the name implies, is the model for all new databases created by users. This allows default database objects to be predefined for all new databases within a MS SQL Server instance. For example, if every database created by a user needs to have an "Audit" table, then defining the "Audit" table in the model guarantees that the table will be present in every new database created after the model is altered.

    MSDB Database
    The MSDB database is used by SQL Server Agent, SQL Server Database Mail, and SQL Server Service Broker, along with SQL Server itself. The SQL Server Agent uses this database to store job configurations and SQL job schedules, along with SQL alerts and operators. In addition, this database also stores all SQL job parameters along with each job's execution history. Finally, it stores database backup and maintenance plans, as well as details pertaining to SQL log shipping if it is being used.

    Distribution Database
    The Distribution database is only used during replication, and stores metadata and history information pertaining to replication. Furthermore, when transactional replication is used, this database also stores information regarding each transaction. It is important to note that replication is not turned on by default in MS SQL Server and that the Distribution database is hidden from SSMS.

    TempDB Database
    TempDB, as the name implies, is used to store temporary data and data objects, for example temp tables and temp stored procedures. It is important to note that all data and data objects in this database are cleared when SQL Server restarts. This database is also used by SQL Server when it is performing internal operations; typically, SQL Server uses TempDB for large sort and index operations. Finally, this database is used to store row versions if row versioning or snapshot-isolation transactions are being used by SQL Server.

    I would love to hear from others about their experiences using system databases, tables, and objects in real-world environments.
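    As a footnote to the above, the system databases and their files can be inspected on a live instance by querying the catalog views. A minimal sketch in Python (assuming the pyodbc package, a local default instance, and Windows authentication; note that the hidden Resource database does not appear in sys.databases):

      import pyodbc

      conn = pyodbc.connect(
          "DRIVER={SQL Server};SERVER=localhost;Trusted_Connection=yes"
      )
      cursor = conn.execute(
          "SELECT d.name, mf.physical_name "
          "FROM sys.databases d "
          "JOIN sys.master_files mf ON d.database_id = mf.database_id "
          "WHERE d.database_id <= 4 "  # master, tempdb, model, msdb
          "ORDER BY d.database_id"
      )
      for name, path in cursor.fetchall():
          print(name, "->", path)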

    Read the article

  • Convert a PPT File into an Image or HTML File in .NET

    Get the .NET code for programmatically converting PowerPoint presentation files into images or HTML files.

    Read the article

  • PHP - Auto Code Formatter?

    - by user1179459
    I am just wondering: is there a tool/software (ideally free) to auto-format PHP code for a batch of files (not one by one, which I can use the IDE for)? Ideally something like http://beta.phpformatter.com/, where I can set the settings and it will format all the files inside a folder, etc. That site is very useful, but the issue is that I have to process files one by one, copy-pasting; that's why I am looking for another tool.
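    Until a dedicated batch tool turns up, a small driver script can do the fan-out: walk the folder tree and hand every .php file to whatever command-line formatter is installed. A sketch in Python (the "phpfmt" command name is a placeholder; substitute your real formatter and its flags):

      import subprocess
      from pathlib import Path

      FORMATTER = ["phpfmt"]  # hypothetical CLI formatter command

      for php_file in Path("myproject").rglob("*.php"):
          # Assumes the formatter accepts a path and rewrites the file in place.
          subprocess.run(FORMATTER + [str(php_file)], check=True)
          print("formatted:", php_file)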

    Read the article

  • Visual Studio: faster startup by disabling the splash screen

    - by anirudha
    Visual Studio starts faster when you run it without the splash screen. To do that you need to change a shortcut setting: go to the Visual Studio shortcut icon, open Properties, and look at the target executable. On an x64-based computer the executable location is something like:

      "C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe"

    Add the /nosplash switch, so the target becomes:

      "C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe" /nosplash

    Read the article

  • Making a file with terminal commands?

    - by iSeth
    In Windows I can write a file containing commands for cmd (usually .cmd or .bat files). When I click on one of those files it opens cmd.exe and runs the commands the file contains. How would I do this in Ubuntu? I'm sure this is a duplicate, but I can't find my answer. It's similar to these questions, but they don't answer it: Store frequently used terminal commands in a file; CMD.exe Emulator in Ubuntu to run .cmd/.bat file

    Read the article

  • Organizing single page code well with Notepad++

    - by Hudson
    I have a C# file that will contain, most likely, 10,000+ lines of code. I'd like to break this file into tabbed segments, so I can organize each method into a certain tab and label the tabs something like: initial setup, helper functions, execution logic, list structures, global variables, etc. If I were using PHP I would have separate files and use include(); to include the separate files. What can be done, following the same style, with C#?

    Read the article

  • Cannot install Cinnamon 2.x

    - by user207587
    Fetched 841 kB in 28s (29.3 kB/s)
    W: GPG error: http://packages.mate-desktop.org raring Release: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY 68980A0EA10B4DE8
    W: Failed to fetch http://ppa.launchpad.net/pmcenery/ppa/ubuntu/dists/raring/main/binary-i386/Packages 404 Not Found
    E: Some index files failed to download. They have been ignored, or old ones used instead.

    How do I fix this? How do I add a PUBKEY so I can get the files needed to complete the Cinnamon install?

    Read the article

  • Is there a way to schedule the downloads in Transmission BitTorrent Client?

    - by Ankit
    I have added a couple of torrent files to the Transmission BitTorrent client to be downloaded. I have a limited internet speed, so I can't start all of them in one go; instead I have to check in every 30 minutes or so to start downloading the next file added to the client. I don't want to keep checking every 30 minutes; I'm looking for a way to schedule the downloads (i.e. when one download completes, the next file starts automatically) without manual intervention.
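    Newer Transmission releases have a built-in download queue in the preferences, which may already cover this; failing that, the client can be driven externally. A sketch in Python around transmission-remote (assumes remote access is enabled in Transmission; the torrent IDs come from 'transmission-remote -l', and the parsing of the info output is deliberately crude):

      import subprocess
      import time

      def is_done(tid):
          # 'transmission-remote -t <id> -i' includes a "Percent Done" line.
          info = subprocess.run(["transmission-remote", "-t", str(tid), "-i"],
                                capture_output=True, text=True).stdout
          return "Percent Done: 100" in info

      queue = [1, 2, 3]  # torrent IDs, in the order they should download
      for tid in queue:
          subprocess.run(["transmission-remote", "-t", str(tid), "-s"])  # start
          while not is_done(tid):
              time.sleep(300)  # check every five minutes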

    Read the article

  • How to Create Network File Shares with No Passwords in Windows 8

    - by Taylor Gibb
    We have all had to connect to a network share at some point only to have the authentication dialog pop up. There are many ways around it, for example mapping a network drive, but if you have a lot of users connecting to copy some files you may want to disable the password dialog instead of distributing your password.

    Read the article

  • Why doesn't Inkscape recognize the Ubuntu color palettes?

    - by Octavian Damiean
    Inkscape 0.48.2 refuses to show my newly added Ubuntu color palettes in the palette selection menu. I downloaded the Ubuntu color palettes for GIMP/Inkscape from design.canonical.com, extracted the files, and copied them to /usr/share/inkscape/palettes/ where all the other color palettes are. I've even made sure that all the files have the same permissions, just in case. What am I missing?

    Read the article

  • Term for a single C++ endpoint/object file

    - by Qix
    I have heard several terms for what I've most often heard called a C++ "codepoint": a .cpp file that is compiled into an object file. For instance, .cpp files can include other .cpp files (or any other file, really, so long as it compiles), but during compilation there is really only one 'main' code file that is used to generate the object. I know there is a widely accepted term; I just can't recall what it is. What is the accepted term for the final .c/.cpp file used to generate an object file?

    Read the article

  • Mounting a Wubi partition from another Linux installation

    - by FrankSus
    Hope someone can help. I need to access Ubuntu 11.04 files residing on a second hard drive (dual-boot XP/Ubuntu) from a 10.04 hard-drive installation. The 11.04 system co-exists with XP on the second drive, which is HPFS/NTFS. My 10.04 installation mounts and displays the contents of the XP installation and all its files, but the 11.04 system is nowhere to be seen. How can I see and access the Ubuntu partition, please?

    Read the article

  • LAG function – practical use and comparison to old syntax

    - by Michael Zilberstein
    Recently I had to analyze a huge trace – 46 GB of .trc files. Looping over the files, I loaded them into a trace table using the fn_trace_gettable function, with filters to exclude irrelevant data. I ended up with a 6.5-million-row table, 7.4 GB in total size. It contained a RowNum column defined as identity, primary key, clustered. One of the first things I noticed was that although the time difference between the first and last events in the trace was 10 hours, the total duration of all sql...(read more)

    Read the article

  • Using XA Transactions in Coherence-based Applications

    - by jpurdy
    While the costs of XA transactions are well known (e.g. increased data contention, higher latency, significant disk I/O for logging, availability challenges, etc.), in many cases they are the most attractive option for coordinating logical transactions across multiple resources. There are a few common approaches when integrating Coherence into applications via the use of an application server's transaction manager:
    - Use of Coherence as a read-only cache, applying transactions to the underlying database (or any system of record) instead of the cache.
    - Use of the TransactionMap interface via the included resource adapter.
    - Use of the new ACID transaction framework, introduced in Coherence 3.6.
    Each of these may have significant drawbacks for certain workloads.

    Using Coherence as a read-only cache is the simplest option. In this approach, the application is responsible for managing both the database and the cache (either within the business logic or via application server hooks). This approach also tends to provide limited benefit for many workloads, particularly those workloads that either have queries (given the complexity of maintaining a fully cached data set in Coherence) or are not read-heavy (where the cost of managing the cache may outweigh the benefits of reading from it). All updates are made synchronously to the database, leaving it as both a source of latency as well as a potential bottleneck. This approach also prevents addressing "hot data" problems (when certain objects are updated by many concurrent transactions) since most database servers offer no facilities for explicitly controlling concurrent updates. Finally, this option tends to be a better fit for key-based access (rather than filter-based access such as queries) since this makes it easier to aggressively invalidate cache entries without worrying about when they will be reloaded. The advantage of this approach is that it allows strong data consistency as long as optimistic concurrency control is used to ensure that database updates are applied correctly regardless of whether the cache contains stale (or even dirty) data. Another benefit of this approach is that it avoids the limitations of Coherence's write-through caching implementation.

    TransactionMap is generally used when Coherence acts as the system of record. TransactionMap is not generally compatible with write-through caching, so it will usually be used either to manage a standalone cache or when the cache is backed by a database via write-behind caching. TransactionMap has some restrictions that may limit its utility, the most significant being:
    - The lock-based concurrency model is relatively inefficient and may introduce significant latency and contention. As an example, in a typical configuration, a transaction that updates 20 cache entries will require roughly 40ms just for lock management (assuming all locks are granted immediately, and excluding validation and writing, which will require a similar amount of time). This may be partially mitigated by denormalizing (e.g. combining a parent object and its set of child objects into a single cache entry), at the cost of increasing false contention (e.g. transactions will conflict even when updating different child objects).
    - If the client (application server JVM) fails during the commit phase, locks will be released immediately, and the transaction may be partially committed. In practice, this is usually not as bad as it may sound, since the commit phase is usually very short (all locks having been previously acquired). Note that this vulnerability does not exist when a single NamedCache is used and all updates are confined to a single partition (generally implying the use of partition affinity).
    - The unconventional TransactionMap API is cumbersome but manageable. Only a few methods are transactional, primarily get(), put() and remove().

    The ACID transactions framework (accessed via the Connection class) provides atomicity guarantees by implementing the NamedCache interface, maintaining its own cache data and transaction logs inside a set of private partitioned caches. This feature may be used either as a local transactional resource or as a logging XA resource. However, a lack of database integration precludes the use of this functionality for most applications. A side effect is that this feature has not seen significant adoption, meaning that any use of it is subject to the usual headaches associated with being an early adopter (greater chance of bugs and greater risk of hitting an unoptimized code path). As a result, for the moment, we generally recommend against using this feature.

    In summary, it is possible to use Coherence in XA-oriented applications, and several customers are doing this successfully, but it is not a core usage model for the product, so care should be taken before committing to this path. For most applications, the most robust solution is normally to use Coherence as a read-only cache of the underlying data resources, even if this prevents taking advantage of certain product features.

    Read the article

  • Get selected object from TreeView

    - by GoGoDo
    I've been working on a minor (first time) app with quickly and hit a hurdle - how do I get the selected row (the data) from a TreeView? The data in the TreeView comes from a list of files in a directory, and I need to know which rows were selected (and thus which files were). What is the best way to do that? Here's the current code:

      self.treeview = self.builder.get_object("treeview")
      select = self.treeview.get_selection()
      select.connect("changed", self.on_tree_selection_changed)

      def on_tree_selection_changed(self, selection):
          model, treeiter = selection.get_selected()
          if treeiter is not None:
              print "You selected", model[treeiter][0]
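    Since several files may be selected at once, a multi-select variant may be closer to what is needed. A self-contained sketch using the PyGObject (Gtk 3) bindings; with SELECTION_MULTIPLE the handler asks for get_selected_rows() instead of get_selected():

      import gi
      gi.require_version("Gtk", "3.0")
      from gi.repository import Gtk

      # A list of file names standing in for the directory listing.
      store = Gtk.ListStore(str)
      for name in ["a.txt", "b.txt", "c.txt"]:
          store.append([name])

      treeview = Gtk.TreeView(model=store)
      treeview.append_column(
          Gtk.TreeViewColumn("File", Gtk.CellRendererText(), text=0))
      selection = treeview.get_selection()
      selection.set_mode(Gtk.SelectionMode.MULTIPLE)

      def on_tree_selection_changed(selection):
          model, paths = selection.get_selected_rows()
          for path in paths:
              print("You selected", model[path][0])

      selection.connect("changed", on_tree_selection_changed)

      window = Gtk.Window()
      window.add(treeview)
      window.connect("destroy", Gtk.main_quit)
      window.show_all()
      Gtk.main()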

    Read the article

  • WSS - Server Error in "/" Application. Compilation Error Message: CS1006: Could not write to output

    - by ptahiliani
    I got the above error when I tried to run the WSS default site after installing and running Advanced System Optimizer 3.0. I resolved it by going to the following locations and adding permissions for the accounts (ASP.NET & IIS_WPG) I had set up for SharePoint:

      C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files
      C:\WINDOWS\system32\LogFiles
      C:\WINDOWS\Temp

    After the correct permissions had been added, SharePoint worked as normal.

    Read the article

  • Mass audio encoder

    - by bessman
    I have a few thousand FLAC files that I would like to transcode to Ogg Vorbis, but I can't find a suitable tool for the job. To name a few I have tried so far, and why they are unsuitable: oggenc is single-threaded and would require me to automate it myself, mencoder requires the input to also contain video, and abcde assumes the input is a CD. The ideal tool would be multi-threaded and support input of multiple files located in different directories simultaneously. CLI or GUI, it makes no difference. Does such a tool exist?
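    Failing a ready-made tool, the job parallelizes cleanly around oggenc itself. A sketch in Python (assumes vorbis-tools' oggenc is on the PATH and accepts FLAC input, which it has done since vorbis-tools 1.1; the source directories are placeholders):

      import subprocess
      from concurrent.futures import ThreadPoolExecutor
      from pathlib import Path

      ROOTS = [Path("/music/flac"), Path("/more/flac")]  # hypothetical dirs

      def transcode(flac):
          # oggenc reads the FLAC directly; -q 6 picks the quality target.
          ogg = flac.with_suffix(".ogg")
          subprocess.run(["oggenc", "-q", "6", str(flac), "-o", str(ogg)],
                         check=True)

      files = [f for root in ROOTS for f in root.rglob("*.flac")]
      with ThreadPoolExecutor(max_workers=4) as pool:
          list(pool.map(transcode, files))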

    Read the article

  • How to Get Information About Your Backups

    When you need to restore but aren't 100% sure about the contents of your backup files, what do you do? Head to the headers. Grant Fritchey explains how to find the useful bits in these huge stores of information and make sure you restore the right files.

    Read the article

  • Avira Software Update Mistakenly Disabled Windows PCs

    While Avira currently holds the number two ranking in terms of usage amongst antivirus manufacturers worldwide, its latest slipup will likely put a dent in its reputation. The problem with the latest service pack can be pinpointed to ProActiv, a program that monitors for any suspicious events that could lead to infection or attack. Users who applied the updates noticed that ProActiv was preventing their systems from booting, as critical Windows files could not run. Others also reported that ProActiv was blocking all .exe, or executable files, in Windows, making it impossible to launch appl...

    Read the article
