Search Results

Search found 11996 results on 480 pages for 'dependency properties'.

Page 23/480

  • SQL SERVER – Core Concepts – Elasticity, Scalability and ACID Properties – Exploring NuoDB an Elastically Scalable Database System

    - by pinaldave
    I have recently been exploring the Elasticity and Scalability attributes of databases. You can see that in my earlier blog posts about NuoDB, where I wanted to look at the Elasticity and Scalability concepts. The concepts are very interesting, and intriguing as well. I have discussed these concepts with my friend Joyti M and together we have come up with this interesting read. The goal of this article is to answer the following simple questions: What is Elasticity? What is Scalability? How do ACID properties vary from NoSQL concepts? What are the prevailing problems in the current database system architectures? Why is NuoDB an innovative and welcome change in the database paradigm?

    Elasticity. This word's original form is used in many different ways, and honestly it does do a decent job in holding things together over the years as a person grows and contracts. Within the tech world, and specifically related to software systems (databases, application servers), it has come to mean allowing resources to stretch on demand without reaching the breaking point. What are resources in this context? Resources are the usual suspects – RAM/CPU/IO/bandwidth in the form of a container (a process, or a bunch of processes combined as modules). When it is about increasing resources, the simplest idea which comes to mind is the addition of another container. Another container means adding a brand new physical node. When it is about adding a new node, two questions come to mind: 1) Can we add another node to our software system? 2) If yes, does adding a new node cause downtime for the system?

    Let us assume we have added a new node, and see what the new needs of the system are when a new node is added: balancing incoming requests to multiple nodes; synchronization of shared state across multiple nodes; identification of a "down" state and resolution action to bring it back to an "up" state. Adding a new node has its advantages as well. Here are a few of the positive points: throughput can increase nearly horizontally across the nodes of the system; response times of the application will improve as interactions between layers are improved.

    Now, let us put the above concepts in the perspective of a database. When we mention the term "running out of resources" or "application is bound to resources", the resources can be CPU, memory or bandwidth. The regular approach to gain scalability in the database is to look around for bottlenecks and increase the bottlenecked resource. When we have memory as a bottleneck we look at the data buffers, locks, query plans or indexes. After a point even this is not enough, as there needs to be an efficient way of managing such a large workload on a "single machine" across memory- and CPU-bound workloads (the right kind of scheduling). We next move on to either read/write separation of the workload or functionality-based sharding, so that we still have control of the individual parts. But this requires lots of planning and changes in client systems in terms of knowing where to go/update/read, and it requires reporting applications to "aggregate the data" in an intelligent way. What we ideally need is an intelligent layer which allows us to do these things without us getting into managing, monitoring and distributing the workload.

    Scalability. In the context of databases/applications, scalability means three main things: the ability to handle normal loads without pressure, e.g. X users at Y utilization of resources (CPU, memory, bandwidth) on Z kind of hardware (a 4-processor, 32 GB machine with 15000 RPM SATA drives and a 1 GHz network switch) with T throughput; the ability to scale up to an expected peak load, which is greater than the normal load, with acceptable response times; and the ability to provide acceptable response times across the system, e.g. response time of S milliseconds (or an agreed-upon unit of measure) 90% of the time.

    The Issue – the Need for Scale. In normal cases one can plan load testing to cover normal, peak and stress scenarios to ensure specific hardware meets the needs. With help from hardware and software partners and best practices, bottlenecks can be identified and the requisite resources added to the system. Unfortunately this vertical scale is expensive and difficult to achieve, and most operational people need the ability to scale horizontally. This helps in getting better throughput, as there are physical limits to adding resources (memory, CPU, bandwidth and storage) indefinitely. Today we have different options to achieve scalability:

    Read and write separation. The idea here is to do actual writes to one store and configure slaves that receive the latest data with acceptable delays. Slaves can be used for balancing out reads. We can also explore functional separation or sharding: we can separate data operations by a specific identifier (e.g. region, year, month) and consolidate it for reporting purposes. For functional separation the major disadvantage comes when the schema or the workload pattern changes. As the requirement grows, one still needs to deal with the need for scale in manual ways by providing an abstraction in the middle-tier code.

    Using NoSQL solutions. The idea is to flatten out the structures in general, to keep all values which are retrieved together at the same store, and to provide a flexible schema. The issue with these stores is that they mostly compromise on consistency (no ACID guarantees), and one has to use a non-SQL dialect to work with the store. The other major issue is education with NoSQL solutions: would one really want to make these compromises on the ability to connect and retrieve in a simple SQL manner and learn other skill sets? Or, for that matter, give up the ACID guarantee and start dealing with consistency issues?

    Hybrid deployment – Mac, Linux, Cloud, and Windows. One of the challenges we see today across on-premise vs cloud infrastructure is a difference in abilities. Take for example SQL Azure – it is wonderful in its concepts of throttling (as it is a shared deployment) of resources and its ability to scale using federation. However, the same abilities are not available on premise. This is not a mistake, mind you – but a compromise around the sweet spot of workloads, customer requirements and operational SLAs which can be supported by the team. In today's world it is imperative that databases are available across operating systems – which are a commodity and used by developers of all hues.

    An Ideal Database Ability List: a system which allows a linear scale of the system (increase in throughput with reasonable response time) with the addition of resources; a system which does not compromise on the ACID guarantees or require developers to learn new paradigms; a system which does not force-fit a new way of interacting with the database by requiring a non-SQL dialect; a system which does not force-fit its mechanisms for providing availability across its various modules.

    Well, NuoDB is the first database which has all of the above abilities and much more. In future articles I will cover my hands-on experience with it. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: NuoDB

    Read the article

  • Centos 6.2 postfix install dependency issues

    - by Mishari
    I am administrating a VPS running cPanel and I'm trying to install postfix. redhat-release says the version is CentOS release 6.2 (Final) and uname -a says:

        Linux server.mydomain.com 2.6.32-220.el6.i686 #1 SMP Tue Dec 6 16:15:40 GMT 2011 i686 i686 i386 GNU/Linux

    This is how I'm installing postfix (I had tried to solve the problem earlier by installing epel):

        # yum install postfix
        Loaded plugins: fastestmirror, security
        Loading mirror speeds from cached hostfile
         * epel: mirror.cogentco.com
        Setting up Install Process
        Resolving Dependencies
        --> Running transaction check
        ---> Package postfix.i686 2:2.6.6-2.2.el6_1 will be installed
        --> Processing Dependency: mysql-libs for package: 2:postfix-2.6.6-2.2.el6_1.i686
        --> Finished Dependency Resolution
        Error: Package: 2:postfix-2.6.6-2.2.el6_1.i686 (centos-burstnet)
               Requires: mysql-libs
        You could try using --skip-broken to work around the problem

    Attempting to install mysql-libs tells me several files conflict with "MySQL-server-5.1.61-0.glibc23.i386". I'm not sure why or how this is happening; does anyone know how to resolve this? Surely CentOS 6.2 could not have shipped with a broken postfix.

    Read the article

  • How to solve Eclipse's "The project was not built due to 'Could not delete…'" error

    - by user50680
    When I change a properties file's content, Eclipse always shows an error saying: "The project was not built due to "Could not delete '/lichong-test-tester/target/test-classes/config'.". Fix the problem, then try refreshing this project and building it since it may be inconsistent. lichong-test-tester Unknown Java Problem" (taken from the Description/Resource/Path/Location/Type columns of the Problems view). I have to clean and rebuild the whole project to solve this problem. Can anybody tell me how to avoid this? https://skydrive.live.com/redir.aspx?cid=02a1e6543b4cc73e&resid=2A1E6543B4CC73E!458&parid=root – that's my screenshot.

    Read the article

  • How Magnetic Levitation Works

    - by Akemi Iwaya
    There are multiple ways to create magnets or make objects magnetic, but the most ‘interesting’ version is the one that can affect normal ‘non-magnetic’ everyday objects. When they are placed within a magnetic field, everyday objects will start displaying diamagnetic properties and react to magnets. Watch and enjoy as MinutePhysics discusses this awesome type of magnetism in their latest video. Magnetic Levitation [YouTube]     

    Read the article

  • Get Alfresco extension properties with OpenCMIS

    - by AJPerez
    I'm writing an OpenCMIS-based application which extracts some data from Alfresco 3.3. It works fine with standard CMIS properties such as cmis:name or cmis:contentStreamMimeType; however, I can't access Alfresco-specific properties, which are present in the CMIS AtomPub feed as "Alfresco extensions":

        <cmisra:object>
          <cmis:properties>
            <cmis:propertyString propertyDefinitionId="cmis:name" displayName="Name" queryName="cmis:name">
              <cmis:value>test document</cmis:value>
            </cmis:propertyString>
            <cmis:propertyString propertyDefinitionId="cmis:contentStreamMimeType" displayName="Content Stream MIME Type" queryName="cmis:contentStreamMimeType">
              <cmis:value>text/html</cmis:value>
            </cmis:propertyString>
            ...
            <alf:aspects>
              ...
              <alf:properties>
                <cmis:propertyString propertyDefinitionId="cm:description" displayName="Description" queryName="cm:description">
                  <cmis:value>This is just a test document</cmis:value>
                </cmis:propertyString>
              </alf:properties>
            </alf:aspects>
          </cmis:properties>
        </cmisra:object>

    Is there any way in which I can get the value of cm:description with OpenCMIS? My guess is that I need to use the DocumentType interface instead of Document, and then call its getExtensions() method. But I don't know how to get an instance of DocumentType. Any help would be really appreciated. Regards.

    Edit: although Florian's answer already worked out for me, I've just realized that I can get these properties' values with CMIS SQL, too:

        select d.*, t.*, a.*
        from cmis:document d
        join cm:titled t on d.cmis:objectid = t.cmis:objectid
        join cm:author a on d.cmis:objectid = a.cmis:objectid
        where t.cm:description like ...

    Read the article

  • Crawler does not create custom crawled properties

    - by user173739
    These days I have been facing a very strange problem. I have a development environment with MOSS 2007 SP2 and WS 2008; I have search configured and everything works great. I started configuring a staging environment (MOSS 2007 SP2 with the June CU), and created a new farm and a new SSP. I deployed my changes with a package (wsp) and manually created the site collections, sub webs, pages and so on. When the full crawl finishes, I see in the crawl log that all my pages have been successfully crawled, and when I use some test tools to query search, my pages are found. In the crawl log there are a few errors like http://mysite/sites/de/pages "The crawler could not communicate with the server. Check that the server is available and that the firewall access is configured correctly..", but all pages in this Page library were indexed.

    The problem is that I use custom managed properties (mapped to custom crawled properties) in search queries, but the crawler didn't create crawled properties for all my new site columns. For example, for the site column IsAccent the crawler didn't create the crawled property ows_IsAccent. I'm sure that I have created pages for the specific content type, and all my crawl categories have "Automatically discover new properties when a crawl takes place" checked. In Site Settings – Searchable columns I haven't got any column selected as NoCrawl. I tried to export my managed and crawled properties from the dev environment to the stage environment, but all my managed properties were empty; after that I recreated the SSP... the result was the same...

    I checked a specific page with tools like SharePoint Manager 2007 and U2U Caml Query Builder 2007: the content type is correct, and I can see the values of my custom site columns... Using U2U Caml Query Builder 2007 against some Page library, in the Result tab I can see ows_IsAccent (my site column is IsAccent) and other site columns, but I can't find them in Crawled Properties. Any ideas?

    Read the article

  • Blackberry read local properties file in project

    - by Dachmt
    Hi, I have a config.properties file at the root of my blackberry project (same place as Blackberry_App_Descriptor.xml file), and I try to access the file to read and write into it. See below my class: public class Configuration { private String file; private String fileName; public Configuration(String pathToFile) { this.fileName = pathToFile; try { // Try to load the file and read it System.out.println("---------- Start to read the file"); file = readFile(fileName); System.out.println("---------- Property file:"); System.out.println(file); } catch (Exception e) { System.out.println("---------- Error reading file"); System.out.println(e.getMessage()); } } /** * Read a file and return it in a String * @param fName * @return */ private String readFile(String fName) { String properties = null; try { System.out.println("---------- Opening the file"); //to actually retrieve the resource prefix the name of the file with a "/" InputStream is = this.getClass().getResourceAsStream(fName); //we now have an input stream. Create a reader and read out //each character in the stream. System.out.println("---------- Input stream"); InputStreamReader isr = new InputStreamReader(is); char c; System.out.println("---------- Append string now"); while ((c = (char)isr.read()) != -1) { properties += c; } } catch (Exception e) { } return properties; } } I call my class constructor like this: Configuration config = new Configuration("/config.properties"); So in my class, "file" should have all the content of the config.properties file, and the fileName should have this value "/config.properties". But the "name" is null because the file cannot be found... I know this is the path of the file which should be different, but I don't know what i can change... The class is in the package com.mycompany.blackberry.utils Thank you!

    Read the article

  • MVVM Light - master / child views and dependency properties

    - by Carl Dickinson
    I'm getting an odd problem when implementing a master/child view and custom dependency properties. Within my master view I'm binding the view model declaratively in the XAML as follows:

        DataContext="{Binding MainViewModelProperty, Source={StaticResource Locator}}"

    and my MainViewModel exposes an observable collection which I'm binding to an ItemsControl as follows:

        <ItemsControl ItemsSource="{Binding Lists}" Height="490" Canvas.Top="10" Width="70">
          <ItemsControl.ItemTemplate>
            <DataTemplate>
              <Canvas>
                <local:TaskListControl Canvas.Left="{Binding ListLeft}"
                                       Canvas.Top="{Binding ListTop}"
                                       Width="{Binding ListWidth}"
                                       Height="{Binding ListHeight}"
                                       ListDetails="{Binding}"/>
              </Canvas>
            </DataTemplate>
          </ItemsControl.ItemTemplate>
        </ItemsControl>

    TaskListControl in turn declares and binds to its own view model, and I've also defined a dependency property for the ListDetails property. The ListDetails property is not being set, and if I remove the declarative reference to the control's view model, the dependency property's callback does get fired. Is there a conflict between declaratively binding to view models and defining dependency properties? I really like MVVM Light's blendability and want to persevere with this problem, so any help would be appreciated. If you'd like to receive the source for my project then please ask.
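
    A common cause of exactly this symptom (a hedged guess, since the control's code isn't shown): if TaskListControl assigns its own view model to DataContext, the {Binding} applied to ListDetails in the DataTemplate resolves against that view model instead of the collection item, so the property never receives the item and the callback never fires. Independent of that, the registration itself would normally look like the following minimal C# sketch, with names assumed from the question:

        using System.Windows;
        using System.Windows.Controls;

        public partial class TaskListControl : UserControl
        {
            public static readonly DependencyProperty ListDetailsProperty =
                DependencyProperty.Register(
                    "ListDetails",
                    typeof(object),              // or the concrete list-item type
                    typeof(TaskListControl),     // owner type = the declaring control
                    new PropertyMetadata(null, OnListDetailsChanged));

            public object ListDetails
            {
                get { return GetValue(ListDetailsProperty); }
                set { SetValue(ListDetailsProperty, value); }
            }

            private static void OnListDetailsChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
            {
                // React to the bound item here rather than relying on the inherited DataContext.
            }
        }

    If the control does need its own view model, one workaround is to set it on the control's root element (e.g. the layout root's DataContext) rather than on the UserControl itself, so bindings applied from the outside still see the template's data item.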

    Read the article

  • Combining properties made available via webservices profile service aspnet

    - by Adam
    I really wasn't sure what the title for my question could be, so sorry if it's a bit vague. I'm working on an application that uses client application services for authentication/profile management etc. In web.config for my website, I have profile properties like this:

        <properties>
          <add name="FirstName" type="string" defaultValue="" customProviderData="FirstName;nvarchar"/>
          ...

    Basic things like first name, last name etc. I'm exposing properties for my client app like this:

        <system.web.extensions>
          <scripting>
            <webServices>
              <authenticationService enabled="true" requireSSL="false"/>
              <profileService enabled="true" readAccessProperties="UserProfile" writeAccessProperties="UserProfile"/>
              <roleService enabled="true"/>
            </webServices>
          </scripting>
        </system.web.extensions>

    What I'm wondering is whether it's possible to bundle all the individual profile properties into a single object for client apps to utilize. I originally had all my profile data stored as members of a single class (UserProfile), but I broke it all out so that I could use the SqlTableProfileProvider to store each field as an individual column in the relevant tables. I know I can create a class with members for each type; I'm just not sure if there's an easy way to create an object with all my property values (other than assigning values to this object whenever I assign to the standalone properties). I don't think I'm explaining this very well, so I'll try an example. Say in my website profile I have FirstName and LastName as properties. For my client application profileService I want to have one read-access property, FullName. Is there some way to automatically create FullName from the existing FirstName and LastName properties without also having a separate FullName property (and manually assigning data to it whenever I assign data to FirstName and LastName)?
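
    One possible direction (a sketch only, with assumed names): a custom profile class deriving from System.Web.Profile.ProfileBase can expose FirstName/LastName as typed wrappers plus a computed FullName, which at least gives the server side a single object to work with. Whether the client profile service will serialize a computed property that is not declared under <properties> is something to verify; if it does not, FullName would still have to be declared there and kept in sync.

        using System.Web.Profile;

        // Hypothetical profile class; it would be wired up via the "inherits"
        // attribute on <profile> in web.config.
        public class UserProfile : ProfileBase
        {
            public string FirstName
            {
                get { return (string)GetPropertyValue("FirstName"); }
                set { SetPropertyValue("FirstName", value); }
            }

            public string LastName
            {
                get { return (string)GetPropertyValue("LastName"); }
                set { SetPropertyValue("LastName", value); }
            }

            // Computed on read, so it never needs separate assignment.
            public string FullName
            {
                get { return (FirstName + " " + LastName).Trim(); }
            }
        }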

    Read the article

  • Dependency Property ListBox

    - by developer
    Hi All, I want to use a dependency property so that my label displays the values selected in the listbox. This is just to understand the working of a dependency property more clearly.

        <Window x:Class="WpfApplication1.Window1"
                xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
                xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
                xmlns:WPFToolkit="clr-namespace:Microsoft.Windows.Controls;assembly=WPFToolkit"
                xmlns:local="clr-namespace:WpfApplication1"
                x:Name="MyWindow" Height="200" Width="300" >
          <StackPanel>
            <ListBox x:Name="lbColor" Width="248" Height="56" ItemsSource="{Binding TestColor}"/>
            <StackPanel>
              <Label Content="{Binding Path=Test, ElementName=lbColor}" />
            </StackPanel>
          </StackPanel>
        </Window>

    Code behind:

        namespace WpfApplication1
        {
            /// <summary>
            /// Interaction logic for Window1.xaml
            /// </summary>
            public partial class Window1 : Window
            {
                public ObservableCollection<string> TestColor { get; set; }

                public String Test
                {
                    get { return (String)GetValue(TestProperty); }
                    set { SetValue(TestProperty, value); }
                }

                // Using a DependencyProperty as the backing store for Title.
                // This enables animation, styling, binding, etc...
                public static readonly DependencyProperty TestProperty =
                    DependencyProperty.Register("Test", typeof(String), typeof(ListBox), new UIPropertyMetadata("Test1"));

                public Window1()
                {
                    InitializeComponent();
                    TestColor = new ObservableCollection<string>();
                    DataContext = this;
                    TestColor.Add("Red");
                    TestColor.Add("Orange");
                    TestColor.Add("Yellow");
                    TestColor.Add("Green");
                    TestColor.Add("Blue");
                }
            }
        }

    Can anyone explain how I would accomplish this using a dependency property? Somehow I am very confused by the dependency property concept, and I just wanted to see a working example of it.
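
    For reference, a minimal sketch of how the registration is conventionally written (not a full answer to the binding question): the third argument to Register should be the class that declares the property, i.e. Window1 rather than ListBox.

        using System.Windows;

        public partial class Window1 : Window
        {
            public static readonly DependencyProperty TestProperty =
                DependencyProperty.Register(
                    "Test", typeof(string), typeof(Window1),   // owner type is the declaring class
                    new UIPropertyMetadata("Test1"));

            public string Test
            {
                get { return (string)GetValue(TestProperty); }
                set { SetValue(TestProperty, value); }
            }
        }

    The label can then follow the window's property with Content="{Binding Test, ElementName=MyWindow}", and Test can be kept in sync with the selection, for example by assigning (string)lbColor.SelectedItem to it in the ListBox's SelectionChanged handler. (Binding the label straight to lbColor.SelectedItem would also work without any dependency property, but that would defeat the point of the exercise.)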

    Read the article

  • passing groups of properties from 1 class to another

    - by insanepaul
    I have a group of 13 properties in a class. I created a struct for these properties and passed it to another class. I need to add another 10 groups of these 13 properties, so that's 130 properties in total. What do I do?

    1. I could add all 130 properties to the struct. Will this affect performance and readability?
    2. I could create a list of structs, but I don't know how to access an item. E.g. to add to the list:
           listRowItems.Add(new RowItems(){a=1, b=1, c=1, d=1...});
           listRowItems.Add(new RowItems(){a=2, b=2, c=2, d=2...});
       How do I access the second group's item b? Is it... (see the sketch below)
    3. Could I use just a dictionary with 130 items?
    4. Should I use a list of dictionaries (again, I don't know how to access a particular item)?
    5. Should I pass in a class of 130 properties?

    Just for your interest, the properties are CSS parameters used for a composite control. The control displays 13 elements in each row, there are 10 rows, and each row is customisable.
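
    To answer the indexing part of option 2 directly: with a List<RowItems>, the second group is simply index 1, so its b member is read with an ordinary indexer. A small self-contained sketch (RowItems and its members are assumed from the question):

        using System.Collections.Generic;

        public struct RowItems
        {
            public int a;
            public int b;
            public int c;
            // ... the rest of the 13 members of a group
        }

        public static class RowItemsDemo
        {
            public static void Main()
            {
                var listRowItems = new List<RowItems>();
                listRowItems.Add(new RowItems() { a = 1, b = 1, c = 1 });
                listRowItems.Add(new RowItems() { a = 2, b = 2, c = 2 });

                // Second group (index 1), member b:
                int secondB = listRowItems[1].b;
            }
        }

    A Dictionary<string, RowItems> keyed by row name would work the same way (rows["header"].b) if the rows are better identified by name than by position.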

    Read the article

  • C#: How to get all public (both get and set) string properties of a type

    - by Svish
    I am trying to make a method that will go through a list of generic objects and replace all their properties of type string which is either null or empty with a replacement. How is a good way to do this? I have this kind of... shell... so far: public static void ReplaceEmptyStrings<T>(List<T> list, string replacement) { var properties = typeof(T).GetProperties( -- What BindingFlags? -- ); foreach(var p in properties) { foreach(var item in list) { if(string.IsNullOrEmpty((string) p.GetValue(item, null))) p.SetValue(item, replacement, null); } } } So, how do I find all the properties of a type that are: Of type string Has public get Has public set ? I made this test class: class TestSubject { public string Public; private string Private; public string PublicPublic { get; set; } public string PublicPrivate { get; private set; } public string PrivatePublic { private get; set; } private string PrivatePrivate { get; set; } } The following does not work: var properties = typeof(TestSubject) .GetProperties(BindingFlags.Instance|BindingFlags.Public) .Where(ø => ø.CanRead && ø.CanWrite) .Where(ø => ø.PropertyType == typeof(string)); If I print out the Name of those properties I get there, I get: PublicPublic PublicPrivate PrivatePublic In other words, I get two properties too much. Note: This could probably be done in a better way... using nested foreach and reflection and all here... but if you have any great alternative ideas, please let me know cause I want to learn!
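
    The two extra entries show up because CanRead/CanWrite only report that an accessor exists, not that it is public. GetGetMethod() and GetSetMethod() return null for missing or non-public accessors, which is exactly the filter needed here; a sketch:

        using System;
        using System.Linq;
        using System.Reflection;

        static class StringPropertyFilter
        {
            // Public instance properties of type string whose get AND set
            // accessors are both public. GetGetMethod()/GetSetMethod() return
            // null for non-public accessors, so PublicPrivate and PrivatePublic
            // are filtered out; the public field "Public" never appears at all,
            // because GetProperties only returns properties.
            public static PropertyInfo[] GetPublicStringProperties(Type type)
            {
                return type
                    .GetProperties(BindingFlags.Instance | BindingFlags.Public)
                    .Where(p => p.PropertyType == typeof(string))
                    .Where(p => p.GetGetMethod() != null && p.GetSetMethod() != null)
                    .ToArray();
            }
        }

    Plugged into the original method, the same two Where clauses can replace the CanRead/CanWrite check.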

    Read the article

  • Debian apt dependency mismatch (libc6)

    - by Sean Gordon
    Earlier, I tried to install package via apt-get (cython), but it failed with the Errors were encountered while processing: message, and since then, apt is refusing to install anything. apt-get check output below: root@dix:~# apt-get check Reading package lists... Done Building dependency tree Reading state information... Done You might want to run 'apt-get -f install' to correct these. The following packages have unmet dependencies: libc6 : Depends: libc-bin (= 2.11.3-2) but 2.11.3-4 is installed libc6-dev : Depends: libc6 (= 2.11.3-4) but 2.11.3-2 is installed libc6-i386 : Depends: libc6 (= 2.11.3-4) but 2.11.3-2 is installed E: Unmet dependencies. Try using -f. Apt/aptitude don't seem to be able to fix this dependency issue, and I don't know what to do. Edit: Running apt-get -f install results in no change, and my sources are all squeeze. Running apt-get update then apt-get dist-upgrade show no change either. Edit 2: I went back to try this again in a new terminal and apt-get -f install gives this error: dpkg: error processing /var/cache/apt/archives/libc6_2.11.3-4_amd64.deb (--unpack): subprocess new pre-installation script killed by signal (Aborted) configured to not write apport reports Errors were encountered while processing: /var/cache/apt/archives/libc6_2.11.3-4_amd64.deb E: Sub-process /usr/bin/dpkg returned an error code (1) Edit 3: Using apt-get clean first, then the previous commands, results in the first error again. Using apt-get -f dist-upgrade gives the below. Reading package lists... Building dependency tree... Reading state information... Correcting dependencies... Done The following packages will be upgraded: apache2 apache2-mpm-prefork apache2-utils apache2.2-bin apache2.2-common at automake base-files bind9 bind9-doc bind9-host bind9utils debian-archive-keyring dnsutils dpkg-dev file host initscripts isc-dhcp-client isc-dhcp-common krb5-multidev libapr1 libbind9-60 libc6 libdns69 libdpkg-perl libexpat1 libexpat1-dev libgc1c2 libgssapi-krb5-2 libgssrpc4 libisc62 libisccc60 libisccfg62 libk5crypto3 libkadm5clnt-mit7 libkadm5srv-mit7 libkdb5-4 libkrb5-3 libkrb5-dev libkrb5support0 liblwres60 libmagic1 libmysqlclient16 libnss3-1d libssl-dev libssl0.9.8 libtiff4 libtiff4-dev libtiffxx0c2 libxi6 libxml2 linux-libc-dev lwresd mysql-client-5.1 mysql-common mysql-server mysql-server-5.1 mysql-server-core-5.1 openjdk-6-jre openjdk-6-jre-headless openjdk-6-jre-lib openssh-client openssh-server openssl procps python python-crypto python-minimal sudo sysv-rc sysvinit sysvinit-utils tzdata tzdata-java 75 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 5 not fully installed or removed. Need to get 0 B/79.9 MB of archives. After this operation, 1,411 kB of additional disk space will be used. (Reading database ... 52241 files and directories currently installed.) Preparing to replace libc6 2.11.3-2 (using .../libc6_2.11.3-4_amd64.deb) ... 
*** stack smashing detected ***: /usr/bin/perl terminated ======= Backtrace: ========= /lib/libc.so.6(__fortify_fail+0x37)[0x7fdaad9b9f87] /lib/libc.so.6(__fortify_fail+0x0)[0x7fdaad9b9f50] /usr/lib/libperl.so.5.10(Perl_yylex+0x5896)[0x7fdaae343346] [0x8e83a0] ======= Memory map: ======== 00400000-00402000 r-xp 00000000 08:01 525338 /usr/bin/perl 00601000-00602000 rw-p 00001000 08:01 525338 /usr/bin/perl 00602000-0091f000 rw-p 00000000 00:00 0 [heap] 7fdaaca54000-7fdaaca6a000 r-xp 00000000 08:01 393818 /lib/libgcc_s.so.1 7fdaaca6a000-7fdaacc69000 ---p 00016000 08:01 393818 /lib/libgcc_s.so.1 7fdaacc69000-7fdaacc6a000 rw-p 00015000 08:01 393818 /lib/libgcc_s.so.1 7fdaacc6a000-7fdaacc6f000 r-xp 00000000 08:01 524949 /usr/lib/perl5/auto/Locale/gettext/gettext.so 7fdaacc6f000-7fdaace6e000 ---p 00005000 08:01 524949 /usr/lib/perl5/auto/Locale/gettext/gettext.so 7fdaace6e000-7fdaace6f000 rw-p 00004000 08:01 524949 /usr/lib/perl5/auto/Locale/gettext/gettext.so 7fdaace6f000-7fdaace79000 r-xp 00000000 08:01 532753 /usr/lib/perl/5.10.1/auto/Encode/Encode.so 7fdaace79000-7fdaad078000 ---p 0000a000 08:01 532753 /usr/lib/perl/5.10.1/auto/Encode/Encode.so 7fdaad078000-7fdaad079000 rw-p 00009000 08:01 532753 /usr/lib/perl/5.10.1/auto/Encode/Encode.so 7fdaad079000-7fdaad07e000 r-xp 00000000 08:01 525444 /usr/lib/perl/5.10.1/auto/IO/IO.so 7fdaad07e000-7fdaad27d000 ---p 00005000 08:01 525444 /usr/lib/perl/5.10.1/auto/IO/IO.so 7fdaad27d000-7fdaad27e000 rw-p 00004000 08:01 525444 /usr/lib/perl/5.10.1/auto/IO/IO.so 7fdaad27e000-7fdaad299000 r-xp 00000000 08:01 525450 /usr/lib/perl/5.10.1/auto/POSIX/POSIX.so 7fdaad299000-7fdaad498000 ---p 0001b000 08:01 525450 /usr/lib/perl/5.10.1/auto/POSIX/POSIX.so 7fdaad498000-7fdaad49b000 rw-p 0001a000 08:01 525450 /usr/lib/perl/5.10.1/auto/POSIX/POSIX.so 7fdaad49b000-7fdaad49e000 r-xp 00000000 08:01 525436 /usr/lib/perl/5.10.1/auto/Fcntl/Fcntl.so 7fdaad49e000-7fdaad69e000 ---p 00003000 08:01 525436 /usr/lib/perl/5.10.1/auto/Fcntl/Fcntl.so 7fdaad69e000-7fdaad69f000 rw-p 00003000 08:01 525436 /usr/lib/perl/5.10.1/auto/Fcntl/Fcntl.so 7fdaad69f000-7fdaad6a7000 r-xp 00000000 08:01 393824 /lib/libcrypt-2.11.3.so 7fdaad6a7000-7fdaad8a6000 ---p 00008000 08:01 393824 /lib/libcrypt-2.11.3.so 7fdaad8a6000-7fdaad8a7000 r--p 00007000 08:01 393824 /lib/libcrypt-2.11.3.so 7fdaad8a7000-7fdaad8a8000 rw-p 00008000 08:01 393824 /lib/libcrypt-2.11.3.so 7fdaad8a8000-7fdaad8d6000 rw-p 00000000 00:00 0 7fdaad8d6000-7fdaada2f000 r-xp 00000000 08:01 393822 /lib/libc-2.11.3.so 7fdaada2f000-7fdaadc2e000 ---p 00159000 08:01 393822 /lib/libc-2.11.3.so 7fdaadc2e000-7fdaadc32000 r--p 00158000 08:01 393822 /lib/libc-2.11.3.so 7fdaadc32000-7fdaadc33000 rw-p 0015c000 08:01 393822 /lib/libc-2.11.3.so 7fdaadc33000-7fdaadc38000 rw-p 00000000 00:00 0 7fdaadc38000-7fdaadc4f000 r-xp 00000000 08:01 393248 /lib/libpthread-2.11.3.so 7fdaadc4f000-7fdaade4e000 ---p 00017000 08:01 393248 /lib/libpthread-2.11.3.so 7fdaade4e000-7fdaade4f000 r--p 00016000 08:01 393248 /lib/libpthread-2.11.3.so 7fdaade4f000-7fdaade50000 rw-p 00017000 08:01 393248 /lib/libpthread-2.11.3.so 7fdaade50000-7fdaade54000 rw-p 00000000 00:00 0 7fdaade54000-7fdaaded4000 r-xp 00000000 08:01 393826 /lib/libm-2.11.3.so 7fdaaded4000-7fdaae0d4000 ---p 00080000 08:01 393826 /lib/libm-2.11.3.so 7fdaae0d4000-7fdaae0d5000 r--p 00080000 08:01 393826 /lib/libm-2.11.3.so 7fdaae0d5000-7fdaae0d6000 rw-p 00081000 08:01 393826 /lib/libm-2.11.3.so 7fdaae0d6000-7fdaae0d8000 r-xp 00000000 08:01 393825 /lib/libdl-2.11.3.so 7fdaae0d8000-7fdaae2d8000 ---p 00002000 
08:01 393825 /lib/libdl-2.11.3.so 7fdaae2d8000-7fdaae2d9000 r--p 00002000 08:01 393825 /lib/libdl-2.11.3.so 7fdaae2d9000-7fdaae2da000 rw-p 00003000 08:01 393825 /lib/libdl-2.11.3.so 7fdaae2da000-7fdaae43f000 r-xp 00000000 08:01 525387 /usr/lib/libperl.so.5.10.1 7fdaae43f000-7fdaae63e000 ---p 00165000 08:01 525387 /usr/lib/libperl.so.5.10.1 7fdaae63e000-7fdaae647000 rw-p 00164000 08:01 525387 /usr/lib/libperl.so.5.10.1 7fdaae647000-7fdaae665000 r-xp 00000000 08:01 393819 /lib/ld-2.11.3.so 7fdaae854000-7fdaae859000 rw-p 00000000 00:00 0 7fdaae862000-7fdaae864000 rw-p 00000000 00:00 0 7fdaae864000-7fdaae865000 r--p 0001d000 08:01 393819 /lib/ld-2.11.3.so 7fdaae865000-7fdaae866000 rw-p 0001e000 08:01 393819 /lib/ld-2.11.3.so 7fdaae866000-7fdaae867000 rw-p 00000000 00:00 0 7fff9616d000-7fff9618e000 rw-p 00000000 00:00 0 [stack] 7fff961ff000-7fff96200000 r-xp 00000000 00:00 0 [vdso] ffffffffff600000-ffffffffff601000 r--p 00000000 00:00 0 [vsyscall] dpkg: error processing /var/cache/apt/archives/libc6_2.11.3-4_amd64.deb (--unpack): subprocess new pre-installation script killed by signal (Aborted) Errors were encountered while processing: /var/cache/apt/archives/libc6_2.11.3-4_amd64.deb

    Read the article

  • Simple object creation with DIY-DI?

    - by Runcible
    I recently ran across this great article by Chad Perry entitled "DIY-DI", or "Do-It-Yourself Dependency Injection". I'm in a position where I'm not yet ready to use an IoC framework, but I want to head in that direction. It seems like DIY-DI is a good first step. However, after reading the article, I'm still a little confused about object creation. Here's a simple example. Using manual constructor dependency injection (not DIY-DI), this is how one must construct a Hotel object:

        PowerGrid powerGrid;     // only one in the entire application
        WaterSupply waterSupply; // only one in the entire application
        Staff staff;
        Rooms rooms;
        Hotel hotel(staff, rooms, powerGrid, waterSupply);

    Creating all of these dependency objects makes it difficult to construct the Hotel object in isolation, which means that writing unit tests for Hotel will be difficult. Does using DIY-DI make it easier? What advantage does DIY-DI provide over manual constructor dependency injection?
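
    As one reading of the article (a hedged C#-flavoured sketch, not the article's exact code): DIY-DI keeps plain constructor injection inside the classes, but moves all of the wiring into hand-written injector functions, so production code calls a single injector while tests construct Hotel directly with fakes. Names below mirror the question's example; the stub types are assumptions.

        // Stub types standing in for the question's classes.
        public class PowerGrid { }
        public class WaterSupply { }
        public class Staff { }
        public class Rooms { }

        public class Hotel
        {
            public Hotel(Staff staff, Rooms rooms, PowerGrid powerGrid, WaterSupply waterSupply) { }
        }

        // DIY-DI style: the injector is the only place that knows how the
        // object graph is assembled.
        public static class Injector
        {
            private static readonly PowerGrid powerGrid = new PowerGrid();       // app-wide singleton
            private static readonly WaterSupply waterSupply = new WaterSupply(); // app-wide singleton

            public static Hotel InjectHotel()
            {
                return new Hotel(InjectStaff(), InjectRooms(), powerGrid, waterSupply);
            }

            public static Staff InjectStaff() { return new Staff(); }
            public static Rooms InjectRooms() { return new Rooms(); }
        }

    The advantage over scattering the manual construction shown above is simply that the wiring lives in one place: a unit test still writes new Hotel(fakeStaff, fakeRooms, fakePowerGrid, fakeWaterSupply) with no injector involved, which is most of what an IoC container buys you, minus the framework.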

    Read the article

  • How do I set a dependency on Spring Web Services in my POM.xml

    - by Ben
    I get this on a lot of Maven dependencies, though the current source of pain is Spring. I'll set a Spring version and include it like so:

        <spring-version>3.0.0.RELEASE</spring-version>

        <!-- Spring framework -->
        <dependency>
          <groupId>org.springframework</groupId>
          <artifactId>spring-core</artifactId>
          <version>${spring-version}</version>
        </dependency>

    which works as expected. I am however having problems setting my dependency on spring-ws-core for web services. The latest I can find in any repo is 2.0.0-M1: http://mvnrepository.com/artifact/org.springframework.ws/spring-ws-core. Any clues on what I need to include in my Maven POM to get Spring 3 web services to work? :)

    Read the article

  • Implicitly including optional dependencies in Maven

    - by Jon Todd
    I have a project A which has a dependency X. Dependency X has an optional dependency Y which doesn't get included in A by default. Is there a way to include Y in my POM without explicitly including it? In Ivy there is a way to essentially say "include all optional dependencies of X"; does Maven have a way to do this?

    Read the article

  • Maven eclipse does not add a dependency

    - by Calm Storm
    I have the following snippet in my pom.xml:

        <dependency>
          <groupId>aspectj</groupId>
          <artifactId>aspectjrt</artifactId>
          <version>1.5.3</version>
        </dependency>

    and in one of my Java files I refer to the class org.aspectj.lang.ProceedingJoinPoint. When I do a "mvn clean install" it compiles and builds fine, but when I do an eclipse:eclipse and import the project in Eclipse, it gives me the error "The import org.aspectj cannot be resolved". I checked the .classpath file that was generated and it does not have an entry for it. I tried a "mvn dependency:tree" and it lists this fine. Can someone tell me what is going wrong here?

    Read the article

  • Help me find the dependency list

    - by Pearl
    I have two tables: an employee table and an employee dependency table. The employee table looks like below:

        insert into E values(1,'Adam')
        insert into E values(2,'Bob')
        insert into E values(3,'Candy')
        insert into E values(4,'Doug')
        insert into E values(5,'Earl')
        insert into E values(6,'Fran')

    The employee dependency table looks like below:

        insert into Ed values(3,'2')
        insert into Ed values(3,'5')
        insert into Ed values(2,'1')
        insert into Ed values(2,'4')
        insert into Ed values(5,'6')

    I need to find the dependency list like below:

        Eid  Ename   Dname
        3    Candy   Bob,Fran

    Please help me find the above.

    Read the article

  • How can I make properties in properties files mandatory in Spring?

    - by Paulo Guedes
    I have an ApplicationContext.xml file with the following node:

        <context:property-placeholder location="classpath:hibernate.properties, classpath:pathConfiguration.properties" />

    It specifies that both properties files will be used by my application. Inside pathConfiguration.properties, some paths are defined, such as:

        PATH_ERROR=/xxx/yyy/error
        PATH_SUCCESS=/xxx/yyy/success

    A PathConfiguration bean has setters for each path. The problem is: when some of those mandatory paths are not defined, no error is thrown. How and where should I handle this problem?

    Read the article

  • [N]Hibernate: view-like fetching properties of associated class

    - by chiccodoro
    (Felt quite helpless in formulating an appropriate title...) In my C# app I display a list of "A" objects, along with some properties of their associated "B" objects and properties of B's associated "C" objects:

        A.Name   B.Name   B.SomeValue   C.Name
        Foo      Bar      123           HelloWorld
        Bar      Hello    432           World
        ...

    To clarify: A has an FK to B, and B has an FK to C (such as, e.g., BankAccount – Person – Company). I have tried two approaches to load these properties from the database (using NHibernate): a fast approach and a clean approach. My eventual question is how to do a fast and clean approach.

    Fast approach: Define a view in the database which joins A, B and C and provides all these fields. In the A class, define properties "BName", "BSomeValue", "CName". Define a Hibernate mapping between A and the view, whereas the needed B and C properties are mapped with update="false" insert="false" and actually stem from the B and C tables, but Hibernate is not aware of that since it uses the view. This way, the listing only loads one object per "A" record, which is quite fast. If the code tries to access the actual associated property, "A.B", I issue another HQL query to get B, set the property and update the faked BName and BSomeValue properties as well.

    Clean approach: There is no view. Class A is mapped to table A, B to B, C to C. When loading the list of A, I do a double left-join-fetch to get B and C as well:

        from A a left join fetch a.B left join fetch a.B.C

    B.Name, B.SomeValue and C.Name are accessed through the eagerly loaded associations. The disadvantage of this approach is that it gets slower and takes more memory, since it needs to create and map 3 objects per "A" record: an A, a B, and a C object each.

    Fast and clean approach: I feel somehow uncomfortable using a database view that hides a join and treating it in NHibernate as if it were a table. So I would like to do something like: have no views in the database; declare properties "BName", "BSomeValue", "CName" in class "A"; define the mapping for A such that NHibernate fetches A and these properties together using a join SQL query, as a database view would do. The mapping should still allow for defining lazy many-to-one associations for getting A.B.C.

    My questions: Is this possible? Is it [un]artful? Is there a better way?
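
    One possible middle ground (a sketch under assumptions, not a canonical [N]Hibernate answer): keep the clean per-table mappings, but have the list screen run a projection into a flat row object, so only one lightweight object per row is materialized while A.B and A.B.C remain ordinary lazy associations everywhere else. Entity and property names are taken from the question; ARow and LoadRows are made up for illustration.

        using System.Collections.Generic;
        using NHibernate;
        using NHibernate.Criterion;
        using NHibernate.Transform;

        // Flat row object used by the list view only.
        public class ARow
        {
            public string AName { get; set; }
            public string BName { get; set; }
            public int BSomeValue { get; set; }
            public string CName { get; set; }
        }

        public static class AListQueries
        {
            public static IList<ARow> LoadRows(ISession session)
            {
                return session.CreateCriteria<A>("a")
                    .CreateAlias("a.B", "b")
                    .CreateAlias("b.C", "c")
                    .SetProjection(Projections.ProjectionList()
                        .Add(Projections.Property("a.Name"), "AName")
                        .Add(Projections.Property("b.Name"), "BName")
                        .Add(Projections.Property("b.SomeValue"), "BSomeValue")
                        .Add(Projections.Property("c.Name"), "CName"))
                    .SetResultTransformer(Transformers.AliasToBean<ARow>())
                    .List<ARow>();
            }
        }

    The generated SQL is essentially the view's join, but it lives in the query rather than in the database, and class A itself stays untouched.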

    Read the article

  • (Visual) C++ project dependency analysis

    - by polyglot
    I have a few large projects I am working on in my new place of work, which have a complicated set of statically linked library dependencies between them. The libs number around 40-50, and it's really hard to determine what the structure was initially meant to be; there isn't clear documentation on the full dependency map. What tools would anyone recommend to extract such data? Presumably, in the simplest manner, if one did the following:

    1. define the set of paths which correspond to library units,
    2. set all .cpp/.h files within those to belong to those compilation units,
    3. capture the 1st-order #include dependency tree,

    one would have enough information to compose a map, refactor, and recompose the map, until one has created some order. I note that http://www.ndepend.com have something nice, but that's exclusively .NET unfortunately. I read something about Doxygen being able to accomplish some static dependency analysis with configuration; has anyone ever pressed it into service to accomplish such a task?

    Read the article

  • Injecting a dependency into a base class

    - by Jamie Dixon
    Hey everyone, I'm on a roll today with questions. I'm starting out with dependency injection and am having some trouble injecting a dependency into a base class. I have a BaseController which my other controllers inherit from. Inside this base controller I do a number of checks, such as determining whether the user has the right privileges to view the current page, checking for the existence of some session variables, etc. I have a dependency inside this base controller that I'd like to inject using Ninject; however, when I set this up as I would for my other dependencies, I'm told by the compiler:

        Error 1  'MyProject.Controllers.BaseController' does not contain a constructor that takes 0 arguments

    This makes sense, but I'm just not sure how to inject this dependency. Should I be using this pattern of a base controller at all, or should I be doing this in a more efficient/correct way?
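
    The usual way to satisfy that compiler error is to give the base controller a protected constructor that takes the dependency and have every derived controller accept it and pass it up; Ninject then resolves it through the concrete controller's constructor. A minimal sketch with an assumed dependency name:

        using System.Web.Mvc;

        public interface IAccessChecker   // assumed name for the injected dependency
        {
            bool HasAccess(string userName, string page);
        }

        public abstract class BaseController : Controller
        {
            protected readonly IAccessChecker AccessChecker;

            // No parameterless constructor, so derived controllers must supply the dependency.
            protected BaseController(IAccessChecker accessChecker)
            {
                AccessChecker = accessChecker;
            }
        }

        public class HomeController : BaseController
        {
            // Ninject injects here; the value is handed up to the base class.
            public HomeController(IAccessChecker accessChecker)
                : base(accessChecker)
            {
            }
        }

    Property injection via Ninject's [Inject] attribute on a base-class property is another option if threading the constructor parameter through every controller feels too noisy, at the cost of making the dependency less obvious.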

    Read the article

  • Bind to several class properties

    - by Polaris
    Hello developers. I have a class with the properties firstName and lastName. I want to bind a TextBlock to the concatenation of these two properties. I know that I can create a third property that returns the concatenation of these properties, but I don't want to use this approach. Is it possible to bind a TextBlock to two properties? I also don't want to create a composite UserControl.
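
    WPF can do this without a third property by using a MultiBinding with a converter (or, in .NET 3.5 SP1 and later, MultiBinding's StringFormat). A sketch of the converter half in C#:

        using System;
        using System.Globalization;
        using System.Windows.Data;

        // Combines the two bound values into one display string.
        public class FullNameConverter : IMultiValueConverter
        {
            public object Convert(object[] values, Type targetType,
                                  object parameter, CultureInfo culture)
            {
                // values[0] = firstName binding, values[1] = lastName binding
                return string.Concat(values[0], " ", values[1]);
            }

            public object[] ConvertBack(object value, Type[] targetTypes,
                                        object parameter, CultureInfo culture)
            {
                throw new NotSupportedException();
            }
        }

    In the XAML, the TextBlock's Text is then set through a <MultiBinding Converter="{StaticResource FullNameConverter}"> containing one <Binding> per property, so no composite UserControl and no extra property on the class are needed.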

    Read the article

  • Creating a dynamic proxy generator with c# – Part 4 – Calling the base method

    - by SeanMcAlinden
    Creating a dynamic proxy generator with c# – Part 1 – Creating the Assembly builder, Module builder and caching mechanism Creating a dynamic proxy generator with c# – Part 2 – Interceptor Design Creating a dynamic proxy generator with c# – Part 3 – Creating the constructors   The plan for calling the base methods from the proxy is to create a private method for each overridden proxy method, this will allow the proxy to use a delegate to simply invoke the private method when required. Quite a few helper classes have been created to make this possible so as usual I would suggest download or viewing the code at http://rapidioc.codeplex.com/. In this post I’m just going to cover the main points for when creating methods. Getting the methods to override The first two notable methods are for getting the methods. private static MethodInfo[] GetMethodsToOverride<TBase>() where TBase : class {     return typeof(TBase).GetMethods().Where(x =>         !methodsToIgnore.Contains(x.Name) &&                              (x.Attributes & MethodAttributes.Final) == 0)         .ToArray(); } private static StringCollection GetMethodsToIgnore() {     return new StringCollection()     {         "ToString",         "GetHashCode",         "Equals",         "GetType"     }; } The GetMethodsToIgnore method string collection contains an array of methods that I don’t want to override. In the GetMethodsToOverride method, you’ll notice a binary AND which is basically saying not to include any methods marked final i.e. not virtual. Creating the MethodInfo for calling the base method This method should hopefully be fairly easy to follow, it’s only function is to create a MethodInfo which points to the correct base method, and with the correct parameters. private static MethodInfo CreateCallBaseMethodInfo<TBase>(MethodInfo method) where TBase : class {     Type[] baseMethodParameterTypes = ParameterHelper.GetParameterTypes(method, method.GetParameters());       return typeof(TBase).GetMethod(        method.Name,        BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic,        null,        baseMethodParameterTypes,        null     ); }   /// <summary> /// Get the parameter types. /// </summary> /// <param name="method">The method.</param> /// <param name="parameters">The parameters.</param> public static Type[] GetParameterTypes(MethodInfo method, ParameterInfo[] parameters) {     Type[] parameterTypesList = Type.EmptyTypes;       if (parameters.Length > 0)     {         parameterTypesList = CreateParametersList(parameters);     }     return parameterTypesList; }   Creating the new private methods for calling the base method The following method outline how I’ve created the private methods for calling the base class method. 
private static MethodBuilder CreateCallBaseMethodBuilder(TypeBuilder typeBuilder, MethodInfo method) {     string callBaseSuffix = "GetBaseMethod";       if (method.IsGenericMethod || method.IsGenericMethodDefinition)     {                         return MethodHelper.SetUpGenericMethod             (                 typeBuilder,                 method,                 method.Name + callBaseSuffix,                 MethodAttributes.Private | MethodAttributes.HideBySig             );     }     else     {         return MethodHelper.SetupNonGenericMethod             (                 typeBuilder,                 method,                 method.Name + callBaseSuffix,                 MethodAttributes.Private | MethodAttributes.HideBySig             );     } } The CreateCallBaseMethodBuilder is the entry point method for creating the call base method. I’ve added a suffix to the base classes method name to keep it unique. Non Generic Methods Creating a non generic method is fairly simple public static MethodBuilder SetupNonGenericMethod(     TypeBuilder typeBuilder,     MethodInfo method,     string methodName,     MethodAttributes methodAttributes) {     ParameterInfo[] parameters = method.GetParameters();       Type[] parameterTypes = ParameterHelper.GetParameterTypes(method, parameters);       Type returnType = method.ReturnType;       MethodBuilder methodBuilder = CreateMethodBuilder         (             typeBuilder,             method,             methodName,             methodAttributes,             parameterTypes,             returnType         );       ParameterHelper.SetUpParameters(parameterTypes, parameters, methodBuilder);       return methodBuilder; }   private static MethodBuilder CreateMethodBuilder (     TypeBuilder typeBuilder,     MethodInfo method,     string methodName,     MethodAttributes methodAttributes,     Type[] parameterTypes,     Type returnType ) { MethodBuilder methodBuilder = typeBuilder.DefineMethod(methodName, methodAttributes, returnType, parameterTypes); return methodBuilder; } As you can see, you simply have to declare a method builder, get the parameter types, and set the method attributes you want.   Generic Methods Creating generic methods takes a little bit more work. /// <summary> /// Sets up generic method. 
/// </summary> /// <param name="typeBuilder">The type builder.</param> /// <param name="method">The method.</param> /// <param name="methodName">Name of the method.</param> /// <param name="methodAttributes">The method attributes.</param> public static MethodBuilder SetUpGenericMethod     (         TypeBuilder typeBuilder,         MethodInfo method,         string methodName,         MethodAttributes methodAttributes     ) {     ParameterInfo[] parameters = method.GetParameters();       Type[] parameterTypes = ParameterHelper.GetParameterTypes(method, parameters);       MethodBuilder methodBuilder = typeBuilder.DefineMethod(methodName,         methodAttributes);       Type[] genericArguments = method.GetGenericArguments();       GenericTypeParameterBuilder[] genericTypeParameters =         GetGenericTypeParameters(methodBuilder, genericArguments);       ParameterHelper.SetUpParameterConstraints(parameterTypes, genericTypeParameters);       SetUpReturnType(method, methodBuilder, genericTypeParameters);       if (method.IsGenericMethod)     {         methodBuilder.MakeGenericMethod(genericArguments);     }       ParameterHelper.SetUpParameters(parameterTypes, parameters, methodBuilder);       return methodBuilder; }   private static GenericTypeParameterBuilder[] GetGenericTypeParameters     (         MethodBuilder methodBuilder,         Type[] genericArguments     ) {     return methodBuilder.DefineGenericParameters(GenericsHelper.GetArgumentNames(genericArguments)); }   private static void SetUpReturnType(MethodInfo method, MethodBuilder methodBuilder, GenericTypeParameterBuilder[] genericTypeParameters) {     if (method.IsGenericMethodDefinition)     {         SetUpGenericDefinitionReturnType(method, methodBuilder, genericTypeParameters);     }     else     {         methodBuilder.SetReturnType(method.ReturnType);     } }   private static void SetUpGenericDefinitionReturnType(MethodInfo method, MethodBuilder methodBuilder, GenericTypeParameterBuilder[] genericTypeParameters) {     if (method.ReturnType == null)     {         methodBuilder.SetReturnType(typeof(void));     }     else if (method.ReturnType.IsGenericType)     {         methodBuilder.SetReturnType(genericTypeParameters.Where             (x => x.Name == method.ReturnType.Name).First());     }     else     {         methodBuilder.SetReturnType(method.ReturnType);     }             } Ok, there are a few helper methods missing, basically there is way to much code to put in this post, take a look at the code at http://rapidioc.codeplex.com/ to follow it through completely. Basically though, when dealing with generics there is extra work to do in terms of getting the generic argument types setting up any generic parameter constraints setting up the return type setting up the method as a generic All of the information is easy to get via reflection from the MethodInfo.   Emitting the new private method Emitting the new private method is relatively simple as it’s only function is calling the base method and returning a result if the return type is not void. 
ILGenerator il = privateMethodBuilder.GetILGenerator();   EmitCallBaseMethod(method, callBaseMethod, il);   private static void EmitCallBaseMethod(MethodInfo method, MethodInfo callBaseMethod, ILGenerator il) {     int privateParameterCount = method.GetParameters().Length;       il.Emit(OpCodes.Ldarg_0);       if (privateParameterCount > 0)     {         for (int arg = 0; arg < privateParameterCount; arg++)         {             il.Emit(OpCodes.Ldarg_S, arg + 1);         }     }       il.Emit(OpCodes.Call, callBaseMethod);       il.Emit(OpCodes.Ret); } So in the main method building method, an ILGenerator is created from the method builder. The ILGenerator performs the following actions: Load the class (this) onto the stack using the hidden argument Ldarg_0. Create an argument on the stack for each of the method parameters (starting at 1 because 0 is the hidden argument) Call the base method using the Opcodes.Call code and the MethodInfo we created earlier. Call return on the method   Conclusion Now we have the private methods prepared for calling the base method, we have reached the last of the relatively easy part of the proxy building. Hopefully, it hasn’t been too hard to follow so far, there is a lot of code so I haven’t been able to post it all so please check it out at http://rapidioc.codeplex.com/. The next section should be up fairly soon, it’s going to cover creating the delegates for calling the private methods created in this post.   Kind Regards, Sean.

    Read the article
