Search Results

Search found 13325 results on 533 pages for 'domain transferring'.


  • Why is FubuMVC new()ing up my view model in PartialForEach?

    - by Jon M
    I'm getting started with FubuMVC and I have a simple Customer - Order relationship I'm trying to display using nested partials. My domain objects are as follows:

        public class Customer
        {
            private readonly IList<Order> orders = new List<Order>();
            public string Name { get; set; }
            public IEnumerable<Order> Orders { get { return orders; } }
            public void AddOrder(Order order) { orders.Add(order); }
        }

        public class Order
        {
            public string Reference { get; set; }
        }

    I have the following controller classes:

        public class CustomersController
        {
            public IndexViewModel Index(IndexInputModel inputModel)
            {
                var customer1 = new Customer { Name = "John Smith" };
                customer1.AddOrder(new Order { Reference = "ABC123" });
                return new IndexViewModel { Customers = new[] { customer1 } };
            }
        }

        public class IndexInputModel { }

        public class IndexViewModel
        {
            public IEnumerable<Customer> Customers { get; set; }
        }

        public class IndexView : FubuPage<IndexViewModel> { }
        public class CustomerPartial : FubuControl<Customer> { }
        public class OrderPartial : FubuControl<Order> { }

    IndexView.aspx (standard html stuff trimmed):

        <div>
          <%= this.PartialForEach(x => x.Customers).Using<CustomerPartial>() %>
        </div>

    CustomerPartial.ascx:

        <%@ Control Language="C#" Inherits="FubuDemo.Controllers.Customers.CustomerPartial" %>
        <div>
          Customer Name: <%= this.DisplayFor(x => x.Name) %> <br />
          Orders: (<%= Model.Orders.Count() %>) <br />
          <%= this.PartialForEach(x => x.Orders) %>
        </div>

    OrderPartial.ascx:

        <%@ Control Language="C#" Inherits="FubuDemo.Controllers.Customers.OrderPartial" %>
        <div>
          Order <br />
          Ref: <%= this.DisplayFor(x => x.Reference) %>
        </div>

    When I view Customers/Index, I see the following:

        Customers
        Customer Name: John Smith
        Orders: (1)

    It seems that in CustomerPartial.ascx, doing Model.Orders.Count() correctly picks up that 1 order exists. However, PartialForEach(x => x.Orders) does not, as nothing is rendered for the order. If I set a breakpoint on the Order constructor, I see that it initially gets called by the Index method on CustomersController, but then it gets called by FubuMVC.Core.Models.StandardModelBinder.Bind, so it is getting re-instantiated by FubuMVC and losing the content of the Orders collection. This isn't quite what I'd expect; I would think that PartialForEach would just pass the domain object directly into the partial. Am I missing the point somewhere? What is the 'correct' way to achieve this kind of result in Fubu?

    Read the article

  • If I use a facade class with generic methods to access the JPA API, how should I provide additional processing for specific types?

    - by Shaun
    Let's say I'm making a fairly simple web application using JAVA EE specs (I've heard this is possible). In this app, I only have about 10 domain/data objects, and these are represented by JPA Entities. Architecturally, I would consider the JPA API to perform the role of a DAO. Of course, I don't want to use the EntityManager directly in my UI (JSF) and I need to manage transactions, so I delegate these tasks to the so-called service layer. More specifically, I would like to be able to handle these tasks in a single DataService class (often also called CrudService) with generic methods. See this article by Adam Bien for an example interface: http://www.adam-bien.com/roller/abien/entry/generic_crud_service_aka_dao My project differs from that article in that I can't use EJBs, so my service classes are essentially just named beans and I handle transactions manually. Regardless, what I want is a single interface for simple CRUD operations on my data objects because having a different class for each data type would lead to a lot of duplicate and/or unnecessary code. Ideally, my views would be able to use a method such as public <T> List<T> findAll(Class<T> type) { ... } to retrieve data. Using JSF, it might look something like this: <h:dataTable value="#{dataService.findAll(data.class)}" var="d"> ... </h:dataTable> Similarly, after validating forms, my controller could submit the data with a method such as: public <T> void add(T entity) { ... } Granted, you'd probably actually want to return something useful to the caller. In any case, this works well if your data can be treated as homogenous in this manner. Alas, it breaks down when you need to perform additional processing on certain objects before passing them on to JPA. For example, let's say I'm dealing with Books and Authors which have a many-to-many relationship. Each Book has a set of IDs referring to its authors, and each Author has a set of IDs referring to their books. Normally, JPA can manage this kind of relationship for you, but in some cases it can't (for example, the google app engine JPA provider doesn't support this). Thus, when I persist a new book for example, I may need to update the corresponding author entities. My question, then, is if there's an elegant way to handle this or if I should reconsider the sanity of my whole design. Here's a couple ways I see of dealing with it: The instanceof operator. I could use this to target certain classes when special processing is needed. Perhaps maintainability suffers and it isn't beautiful code, but if there's only 10 or so domain objects it can't be all that bad... could it? Make a different service for each entity type (ie, BookService and AuthorService). All services would inherit from a generic DataService base class and override methods if special processing is needed. At this point, you could probably also just call them DAOs instead. As always, I appreciate the help. Let me know if any clarifications are needed, as I left out many smaller details.
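
    For what it's worth, a minimal sketch of the single generic facade described above might look like the following. The class and method names here are illustrative only (they are not taken from the linked article), it assumes an injected EntityManager, and transactions are handled outside, as described:

        import java.util.List;
        import javax.persistence.EntityManager;
        import javax.persistence.criteria.CriteriaQuery;

        // One generic facade ("CrudService"-style) serving all ~10 entity types;
        // names are illustrative, transactions are assumed to be managed elsewhere.
        public class DataService {

            private final EntityManager em;

            public DataService(EntityManager em) {
                this.em = em;
            }

            // Persist any entity type.
            public <T> T add(T entity) {
                em.persist(entity);
                return entity;
            }

            // Look up any entity by primary key.
            public <T> T find(Class<T> type, Object id) {
                return em.find(type, id);
            }

            // The Criteria API keeps findAll(Class) type-safe with no string-built JPQL.
            public <T> List<T> findAll(Class<T> type) {
                CriteriaQuery<T> query = em.getCriteriaBuilder().createQuery(type);
                query.select(query.from(type));
                return em.createQuery(query).getResultList();
            }

            // Remove works whether or not the instance is currently managed.
            public <T> void remove(T entity) {
                em.remove(em.contains(entity) ? entity : em.merge(entity));
            }
        }

    The second option listed above (a BookService / AuthorService per entity) would essentially subclass something like this and override add() for Book so the related Author entities get updated in the same unit of work.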

    Read the article

  • Spring Security: Failed to load ApplicationContext with pre-post-annotations="enabled"

    - by thogau
    I am using spring 3.0.1 + spring-security 3.0.2 and I am trying to use features like @PreAuthorize and @PostFilter annotations. When running in units tests using @RunWith(SpringJUnit4ClassRunner.class) or in a main(String[] args) method my application context fails to start if enable pre-post-annotations and use org.springframework.security.acls.AclPermissionEvaluator : <!-- Enable method level security--> <security:global-method-security pre-post-annotations="enabled"> <security:expression-handler ref="expressionHandler"/> </security:global-method-security> <bean id="expressionHandler" class="org.springframework.security.access.expression.method.DefaultMethodSecurityExpressionHandler"> <property name="permissionEvaluator" ref="aclPermissionEvaluator"/> </bean> <bean id="aclPermissionEvaluator" class="org.springframework.security.acls.AclPermissionEvaluator"> <constructor-arg ref="aclService"/> </bean> <!-- Enable stereotype support --> <context:annotation-config /> <context:component-scan base-package="com.rreps.core" /> <bean id="propertyConfigurer" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer"> <property name="locations"> <list> <value>classpath:applicationContext.properties</value> </list> </property> </bean> <bean id="dataSource" class="com.mchange.v2.c3p0.ComboPooledDataSource"> <property name="driverClass" value="${jdbc.driver}" /> <property name="jdbcUrl" value="${jdbc.url}" /> <property name="user" value="${jdbc.username}" /> <property name="password" value="${jdbc.password}" /> <property name="initialPoolSize" value="10" /> <property name="minPoolSize" value="5" /> <property name="maxPoolSize" value="25" /> <property name="acquireRetryAttempts" value="10" /> <property name="acquireIncrement" value="5" /> <property name="idleConnectionTestPeriod" value="3600" /> <property name="maxIdleTime" value="10800" /> <property name="maxConnectionAge" value="14400" /> <property name="preferredTestQuery" value="SELECT 1;" /> <property name="testConnectionOnCheckin" value="false" /> </bean> <bean id="auditedSessionFactory" class="org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean"> <property name="dataSource" ref="dataSource" /> <property name="configLocation" value="classpath:hibernate.cfg.xml" /> <property name="hibernateProperties"> <value> hibernate.dialect=${hibernate.dialect} hibernate.query.substitutions=true 'Y', false 'N' hibernate.cache.use_second_level_cache=true hibernate.cache.provider_class=net.sf.ehcache.hibernate.SingletonEhCacheProvider hibernate.hbm2ddl.auto=update hibernate.c3p0.acquire_increment=5 hibernate.c3p0.idle_test_period=3600 hibernate.c3p0.timeout=10800 hibernate.c3p0.max_size=25 hibernate.c3p0.min_size=1 hibernate.show_sql=false hibernate.validator.autoregister_listeners=false </value> </property> <!-- validation is performed by "hand" (see http://opensource.atlassian.com/projects/hibernate/browse/HV-281) <property name="eventListeners"> <map> <entry key="pre-insert" value-ref="beanValidationEventListener" /> <entry key="pre-update" value-ref="beanValidationEventListener" /> </map> </property> --> <property name="entityInterceptor"> <bean class="com.rreps.core.dao.hibernate.interceptor.TrackingInterceptor" /> </property> </bean> <bean id="simpleSessionFactory" class="org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean"> <property name="dataSource" ref="dataSource" /> <property name="configLocation" value="classpath:hibernate.cfg.xml" /> <property name="hibernateProperties"> <value> 
hibernate.dialect=${hibernate.dialect} hibernate.query.substitutions=true 'Y', false 'N' hibernate.cache.use_second_level_cache=true hibernate.cache.provider_class=net.sf.ehcache.hibernate.SingletonEhCacheProvider hibernate.hbm2ddl.auto=update hibernate.c3p0.acquire_increment=5 hibernate.c3p0.idle_test_period=3600 hibernate.c3p0.timeout=10800 hibernate.c3p0.max_size=25 hibernate.c3p0.min_size=1 hibernate.show_sql=false hibernate.validator.autoregister_listeners=false </value> </property> <!-- property name="eventListeners"> <map> <entry key="pre-insert" value-ref="beanValidationEventListener" /> <entry key="pre-update" value-ref="beanValidationEventListener" /> </map> </property--> </bean> <bean id="sequenceSessionFactory" class="org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean"> <property name="dataSource" ref="dataSource" /> <property name="configLocation" value="classpath:hibernate.cfg.xml" /> <property name="hibernateProperties"> <value> hibernate.dialect=${hibernate.dialect} hibernate.query.substitutions=true 'Y', false 'N' hibernate.cache.use_second_level_cache=true hibernate.cache.provider_class=net.sf.ehcache.hibernate.SingletonEhCacheProvider hibernate.hbm2ddl.auto=update hibernate.c3p0.acquire_increment=5 hibernate.c3p0.idle_test_period=3600 hibernate.c3p0.timeout=10800 hibernate.c3p0.max_size=25 hibernate.c3p0.min_size=1 hibernate.show_sql=false hibernate.validator.autoregister_listeners=false </value> </property> </bean> <bean id="validationFactory" class="javax.validation.Validation" factory-method="buildDefaultValidatorFactory" /> <!-- bean id="beanValidationEventListener" class="org.hibernate.cfg.beanvalidation.BeanValidationEventListener"> <constructor-arg index="0" ref="validationFactory" /> <constructor-arg index="1"> <props/> </constructor-arg> </bean--> <!-- Enable @Transactional support --> <tx:annotation-driven transaction-manager="transactionManager"/> <bean id="transactionManager" class="org.springframework.orm.hibernate3.HibernateTransactionManager"> <property name="sessionFactory" ref="auditedSessionFactory" /> </bean> <security:authentication-manager alias="authenticationManager"> <security:authentication-provider user-service-ref="userDetailsService" /> </security:authentication-manager> <bean id="userDetailsService" class="com.rreps.core.service.impl.UserDetailsServiceImpl" /> <!-- ACL stuff --> <bean id="aclCache" class="org.springframework.security.acls.domain.EhCacheBasedAclCache"> <constructor-arg> <bean class="org.springframework.cache.ehcache.EhCacheFactoryBean"> <property name="cacheManager"> <bean class="org.springframework.cache.ehcache.EhCacheManagerFactoryBean"/> </property> <property name="cacheName" value="aclCache"/> </bean> </constructor-arg> </bean> <bean id="lookupStrategy" class="org.springframework.security.acls.jdbc.BasicLookupStrategy"> <constructor-arg ref="dataSource"/> <constructor-arg ref="aclCache"/> <constructor-arg> <bean class="org.springframework.security.acls.domain.AclAuthorizationStrategyImpl"> <constructor-arg> <list> <bean class="org.springframework.security.core.authority.GrantedAuthorityImpl"> <constructor-arg value="ROLE_ADMINISTRATEUR"/> </bean> <bean class="org.springframework.security.core.authority.GrantedAuthorityImpl"> <constructor-arg value="ROLE_ADMINISTRATEUR"/> </bean> <bean class="org.springframework.security.core.authority.GrantedAuthorityImpl"> <constructor-arg value="ROLE_ADMINISTRATEUR"/> </bean> </list> </constructor-arg> </bean> </constructor-arg> <constructor-arg> <bean 
class="org.springframework.security.acls.domain.ConsoleAuditLogger"/> </constructor-arg> </bean> <bean id="aclService" class="com.rreps.core.service.impl.MysqlJdbcMutableAclService"> <constructor-arg ref="dataSource"/> <constructor-arg ref="lookupStrategy"/> <constructor-arg ref="aclCache"/> </bean> The strange thing is that the context starts normally when deployed in a webapp and @PreAuthorize and @PostFilter annotations are working fine as well... Any idea what is wrong? Here is the end of the stacktrace : ... 55 more Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'dataSource' defined in class path resource [applicationContext-core.xml]: Initialization of bean failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.transaction.config.internalTransactionAdvisor': Cannot resolve reference to bean 'org.springframework.transaction.annotation.AnnotationTransactionAttributeSource#0' while setting bean property 'transactionAttributeSource'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.transaction.annotation.AnnotationTransactionAttributeSource#0': Initialization of bean failed; nested exception is java.lang.NullPointerException at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:521) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:450) at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:290) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:287) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:189) at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322) ... 
67 more Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.transaction.config.internalTransactionAdvisor': Cannot resolve reference to bean 'org.springframework.transaction.annotation.AnnotationTransactionAttributeSource#0' while setting bean property 'transactionAttributeSource'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.transaction.annotation.AnnotationTransactionAttributeSource#0': Initialization of bean failed; nested exception is java.lang.NullPointerException at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328) at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1308) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1067) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:511) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:450) at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:290) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:287) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193) at org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86) at org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100) at org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findEligibleAdvisors(AbstractAdvisorAutoProxyCreator.java:86) at org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.getAdvicesAndAdvisorsForBean(AbstractAdvisorAutoProxyCreator.java:68) at org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.wrapIfNecessary(AbstractAutoProxyCreator.java:359) at org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessAfterInitialization(AbstractAutoProxyCreator.java:322) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsAfterInitialization(AbstractAutowireCapableBeanFactory.java:404) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1409) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:513) ... 
73 more Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.transaction.annotation.AnnotationTransactionAttributeSource#0': Initialization of bean failed; nested exception is java.lang.NullPointerException at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:521) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:450) at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:290) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:287) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:189) at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322) ... 91 more Caused by: java.lang.NullPointerException at org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource.getAttributes(DelegatingMethodSecurityMetadataSource.java:52) at org.springframework.security.access.intercept.aopalliance.MethodSecurityMetadataSourceAdvisor$MethodSecurityMetadataSourcePointcut.matches(MethodSecurityMetadataSourceAdvisor.java:129) at org.springframework.aop.support.AopUtils.canApply(AopUtils.java:215) at org.springframework.aop.support.AopUtils.canApply(AopUtils.java:252) at org.springframework.aop.support.AopUtils.findAdvisorsThatCanApply(AopUtils.java:284) at org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findAdvisorsThatCanApply(AbstractAdvisorAutoProxyCreator.java:117) at org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findEligibleAdvisors(AbstractAdvisorAutoProxyCreator.java:87) at org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.getAdvicesAndAdvisorsForBean(AbstractAdvisorAutoProxyCreator.java:68) at org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.wrapIfNecessary(AbstractAutoProxyCreator.java:359) at org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessAfterInitialization(AbstractAutoProxyCreator.java:322) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsAfterInitialization(AbstractAutowireCapableBeanFactory.java:404) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1409) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:513) ... 97 more
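
    For context on what pre-post-annotations="enabled" is switching on, a minimal sketch of a service secured with these annotations might look like the example below. The DocumentService class and its method bodies are made up for illustration; the expressions are standard Spring Security 3 method-security expressions, with hasPermission(...) delegating to the AclPermissionEvaluator wired up in the configuration above:

        import java.util.Collections;
        import java.util.List;
        import org.springframework.security.access.prepost.PostFilter;
        import org.springframework.security.access.prepost.PreAuthorize;

        // Illustrative service: these annotations are only honoured once
        // <security:global-method-security pre-post-annotations="enabled"> is active.
        public class DocumentService {

            // Caller must hold the role before the method body runs.
            @PreAuthorize("hasRole('ROLE_ADMINISTRATEUR')")
            public void delete(Object document) {
                // ... delete the document
            }

            // After the method returns, the list is filtered down to the elements
            // the caller has READ permission on, via the ACL-backed evaluator.
            @PostFilter("hasPermission(filterObject, 'READ')")
            public List<Object> findAll() {
                // ... load documents
                return Collections.emptyList();
            }
        }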

    Read the article

  • Spotlight Infinite Indexing issue (external data drive)

    - by Manca Weeks
    This is an external drive, formerly a boot drive which is now in use only to access music files (Sibelius, audio, MIDI, Live, Logic etc.) without transferring the data into a new boot system, partly because of the issue I am about to describe, but mostly because the majority of the data is mainly there for archival purposes. The user is a composer and prominent musician and needs to be able to rehash the data at will. I have tried several things - here is a list:

    - make a complete filesystem clone with Antonio Diaz's ddrescue
    - run Disk Warrior on the copy, repair whatever errors occurred
    - wipe out all ACLs on the entire drive
    - set all permissions to the same value - wide open 777
    - remove any system data (applications, system files, including hidden files to the best of my knowledge) by selecting only non-system/app data and using Carbon Copy Cloner to put only the data of interest onto a newly formatted drive
    - transfer data to a newly formatted drive folder by folder, resetting the Spotlight index in between adding each to observe for issues (interesting here is that no issues occurred except for in the Documents folder - when I transferred only the Documents folder to a newly formatted drive on its own, no trouble. It appears almost as though it may not be the content but the quantity or specific combination of data that results in problems)
    - use DataRescue to transfer the data to yet another newly formatted drive to expose any missed hidden files

    Between each of the above steps I stopped Spotlight (searched for anything beginning with md in Activity Monitor - All Processes and quit it), deleted the .Spotlight-V100 directory from the affected drive, and restarted Spotlight indexing by adding the drive to the Spotlight privacy list and removing it. In each case the same issue occurs - Spotlight begins indexing normally (or so it seems), then the estimated index time increases, usually to 4 hours remaining. This is where it gets stuck and continues to predict 4 hours remaining but never finishes. Sometimes I can't eject the drive and have to quit the md.. processes from Activity Monitor to be able to eject the drive without Force Eject. Once I disconnect the drive after the 4-hours-remaining situation - if I reattach it, Spotlight forever estimates remaining time and never gets going again. So there it is. It is apparently not a filesystem issue, not a permissions issue and not tied to any particular piece of hardware or protocol (used USB and FW drives). I have tried this on several machines (3 to be precise) and in 10.5.8 and 10.6.5. Simply disabling Spotlight on this volume is not an option because the owner has no clue where things are, as the data on the volume dates back to music projects and compositions from 2003 and before. He needs to be able to query for results. Anyone got any ideas?

    ---update 2-6-11 Since I have not received any responses except the one below which appears to misunderstand my point, I am updating this post hoping to get more responses. I have used the terminal command sudo opensnoop -p PID where PID is the mdworker process ID to try and determine what Spotlight is doing and hopefully find the files it's having trouble with. Here's what happens: after indexing for a few hours, mdworker is gone. It no longer shows up in Activity Monitor under "All Processes" and the Terminal window with the opensnoop result stops moving.
    I then proceeded to execute the same command on mds to see what it was doing and here's what I get, repeatedly:

        501 57 mds 21 /
        501 57 mds 21 /Volumes/Sno Leppard
        501 57 mds 21 /Volumes/Tiger
        501 57 mds 21 /Volumes/Leppard
        501 57 mds 21 /Volumes/Disk Warrior
        501 57 mds 21 /Volumes/ONM Data

    These represent all the volumes currently mounted in the system. All except ONM Data, which is the one I am trying to index, are excluded from Spotlight indexing at the moment. The sequence above repeats over and over, with slight variation, sometimes skipping one of the volumes. Questions - what happened to mdworker? What is mds doing? I will let this run until tomorrow morning and throughout the day and monitor for any changes. Any input would be very much appreciated. Even if you're not sure what the ultimate answer is, please alert me to anything you think I may be missing. Hopefully at some point we will figure this out... Thanks, M

    __final edit__ I finally resolved the issue and here is how I did it. I used the terminal command "sudo opensnoop -p PID" where the PID is the process id of the processes I was monitoring. I was looking at all instances of mds and mdworker running in the system. After the third time through indexing the same data set (see info above), I contacted Apple and got to their highest level of support - they were flabbergasted as well. They advised me to install yet another default 10.6.6 system and try again. The same pattern repeated - mds and mdworker(s) would start indexing and eventually the Spotlight icon would say 6 hours remaining and all mdworkers were gone, mds at 90% or so of CPU. But I did finally figure out that the first time mdworker stopped like that, the last file it touched was always in the same folder. I excluded that folder from Spotlight search and the rest of the data set indexed within about 2 hours with no strange behavior or failures. I copied that folder to another machine and Spotlight barfed immediately. Exclude that folder and all is well again. I have no clue what is causing this behavior, still, but I did find a functional solution to the problem. Anyone with a similar problem - run opensnoop on all instances of mds and mdworker and wait patiently for mdworker to exit. Look at the last file it touched and exclude the enclosing folder from being indexed. I was able to repeat the issue and solution on 2 different installs and 2 different copies of the data set. Hope this helps. If we find an actual cause of the folder being such a problem (it is called MICHAEL BRECKER RECORD SOLOS and contains almost 1 GB of audio related files - performer, live, SD2 - things like that), I will edit again to let you all know. Thanks for any attempts to help, M
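
    The "watch opensnoop and note the last file mdworker touched" step described in the final edit is easy to script. Below is a hypothetical helper (my own, not part of opensnoop or Spotlight) that reads opensnoop output from stdin, remembers the most recent path it saw, and prints the enclosing folder when the trace ends, i.e. the folder you would then add to the Spotlight privacy list. It assumes the UID PID COMM FD PATH column layout shown above:

        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.util.concurrent.atomic.AtomicReference;

        // Hypothetical helper: pipe `sudo opensnoop -p <mdworker PID>` into this and,
        // when the pipeline is stopped (Ctrl-C) or the input ends, it reports the last
        // path the traced process opened plus the enclosing folder to exclude.
        public class LastOpenedPath {
            public static void main(String[] args) throws Exception {
                final AtomicReference<String> lastPath = new AtomicReference<>();
                // Print the result even if we are killed together with opensnoop.
                Runtime.getRuntime().addShutdownHook(new Thread(() -> {
                    String p = lastPath.get();
                    if (p != null) {
                        int slash = p.lastIndexOf('/');
                        System.err.println("Last file touched:  " + p);
                        System.err.println("Folder to exclude:  " + (slash > 0 ? p.substring(0, slash) : "/"));
                    }
                }));
                BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
                String line;
                while ((line = in.readLine()) != null) {
                    // opensnoop lines look like "501 57 mdworker 21 /Volumes/ONM Data/...",
                    // i.e. UID PID COMM FD PATH, where PATH may contain spaces.
                    String[] f = line.trim().split("\\s+", 5);
                    if (f.length == 5 && f[4].startsWith("/")) {
                        lastPath.set(f[4]);
                    }
                }
            }
        }

    Usage would be something along the lines of: sudo opensnoop -p <mdworker PID> | java LastOpenedPath.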

    Read the article

  • Sporadic disk clicking sound

    - by Abdó
    Hi, I'm having some unusual and sporadic hard disk clicking issues. Here is a chronological description of the facts. I'm using an ASUS P6T-SE with an Intel Core i7, 6Gb RAM, a 600W power supply and ATI4670 graphics, running Ubuntu 10.10. About one month ago my hard disk (SATA II Seagate Barracuda 1Tb 7200 rpm) started making a clicking sound: a sort of loud tic-tac, every second or so, when involved in disk activity. The system was clearly slower than before at disk access, but it was functional and I could not find any sign of trouble in the Linux logs. I disconnected the disk and tried an older SATA drive I had around: no problem with it. Then I reconnected the Seagate disk, and the problem was mysteriously gone. Ubuntu booted normally, usual speed, no clicking. A couple of weeks later, the problem reappeared. I tried disconnecting and reconnecting (as it somehow solved the problem before) without luck. So, even though it was a rather new drive, I assumed it was a hardware issue, made backups and bought a new drive. The new drive is a SATA II Seagate Barracuda 1.5 Tb 7200 rpm. I installed both drives at the same time, with the intention of transferring my files from one to the other. To my surprise, when I booted the computer with both drives, both started making the clicking sound!! Even worse, I removed the old drive, leaving the unformatted new drive connected, and booted from a LiveCD. It kept clicking! Puzzled by this, I tried both drives on my laptop with a SATA to USB cable. The moment I connected either of them, they made one or two unusual clicks and immediately stopped doing that and worked normally. The old drive, which I thought was almost dead, was working like a charm as if nothing had happened. Then I thought: "ok, it must be the motherboard. Let's try again". So, I reconnected the old drive to the ASUS P6T motherboard (the same cables and SATA port as before), and it worked as if nothing had happened! The problem was gone again. The new 1.5 Tb drive was also working ok: no clicking nor slowdown. So I left the old 1Tb disk connected and kept using the computer daily for 3 weeks, until today it happened again. Now I don't really know what to do or check. I'm not even sure if it is a hardware issue any more! This is rather annoying as it seems to happen with a period of 2 or 3 weeks and I have no means of forcing it to happen. Does anyone have a clue what could cause this behaviour, or any suggestions of things I should check when it happens again? What I did today was check some SMART parameters:

    - Error log: smartctl -l error /dev/sda. No errors
    - Short selftest: smartctl -t short /dev/sda. No errors
    - Disk health check: smartctl -H /dev/sda. Passed

    And here are the vendor-specific parameters (smartctl -A /dev/sda), which I'm not quite sure how to interpret.
    === START OF READ SMART DATA SECTION ===
    SMART Attributes Data Structure revision number: 10
    Vendor Specific SMART Attributes with Thresholds:

    ID# ATTRIBUTE_NAME           FLAG   VALUE WORST THRESH TYPE     UPDATED WHEN_FAILED RAW_VALUE
      1 Raw_Read_Error_Rate      0x000f 120   099   006    Pre-fail Always  -           235962588
      3 Spin_Up_Time             0x0003 095   095   000    Pre-fail Always  -           0
      4 Start_Stop_Count         0x0032 100   100   020    Old_age  Always  -           187
      5 Reallocated_Sector_Ct    0x0033 100   100   036    Pre-fail Always  -           0
      7 Seek_Error_Rate          0x000f 072   060   030    Pre-fail Always  -           16348045
      9 Power_On_Hours           0x0032 096   096   000    Old_age  Always  -           3590
     10 Spin_Retry_Count         0x0013 100   100   097    Pre-fail Always  -           0
     12 Power_Cycle_Count        0x0032 100   100   020    Old_age  Always  -           94
    183 Runtime_Bad_Block        0x0032 100   100   000    Old_age  Always  -           0
    184 End-to-End_Error         0x0032 100   100   099    Old_age  Always  -           0
    187 Reported_Uncorrect       0x0032 100   100   000    Old_age  Always  -           0
    188 Command_Timeout          0x0032 100   097   000    Old_age  Always  -           4295164029
    189 High_Fly_Writes          0x003a 100   100   000    Old_age  Always  -           0
    190 Airflow_Temperature_Cel  0x0022 070   057   045    Old_age  Always  -           30 (Lifetime Min/Max 19/31)
    194 Temperature_Celsius      0x0022 030   043   000    Old_age  Always  -           30 (0 18 0 0)
    195 Hardware_ECC_Recovered   0x001a 037   026   000    Old_age  Always  -           235962588
    197 Current_Pending_Sector   0x0012 100   100   000    Old_age  Always  -           0
    198 Offline_Uncorrectable    0x0010 100   100   000    Old_age  Offline -           0
    199 UDMA_CRC_Error_Count     0x003e 200   200   000    Old_age  Always  -           0
    240 Head_Flying_Hours        0x0000 100   253   000    Old_age  Offline -           73950746906346
    241 Total_LBAs_Written       0x0000 100   253   000    Old_age  Offline -           1832967731
    242 Total_LBAs_Read          0x0000 100   253   000    Old_age  Offline -           3294986902

    Any clue to this mystery will be really welcome. Thank you very much!!
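
    The usual first pass at a table like this is to compare each attribute's normalized VALUE/WORST against its THRESH column; in the output above nothing has reached its threshold (and the health check reports "passed"), so SMART itself is not predicting imminent failure. A rough sketch of that check, where the class name and output wording are mine and not part of smartctl, could look like this:

        import java.io.BufferedReader;
        import java.io.InputStreamReader;

        // Rough sketch: pipe `smartctl -A /dev/sda` into this and it flags any
        // attribute whose normalized VALUE or WORST has fallen to/below THRESH,
        // which is the usual first pass when reading this table.
        public class SmartCheck {
            public static void main(String[] args) throws Exception {
                BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
                String line;
                while ((line = in.readLine()) != null) {
                    String[] f = line.trim().split("\\s+");
                    // Data rows start with the numeric attribute ID and have
                    // ID NAME FLAG VALUE WORST THRESH ... columns.
                    if (f.length < 6 || !f[0].matches("\\d+")) continue;
                    int value = Integer.parseInt(f[3]);
                    int worst = Integer.parseInt(f[4]);
                    int thresh = Integer.parseInt(f[5]);
                    if (thresh > 0 && (value <= thresh || worst <= thresh)) {
                        System.out.println("Attribute " + f[1] + " at/below threshold: "
                                + "value=" + value + " worst=" + worst + " thresh=" + thresh);
                    }
                }
            }
        }

    Running something like smartctl -A /dev/sda | java SmartCheck after each clicking episode would show whether any attribute is actually degrading over time.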

    Read the article

  • "type" Command Not Working As Expected on Git Bash

    - by trysis
    The type command, in Linux, returns the location on the filesystem of the given file, if it is in the current folder or on the $PATH. This functionality is also available on Windows through the Git Bash command line program. The command also returns a file's location given the file without its extension (.exe, .vbs, etc.). However, I have run into what seems like a strange corner case where the file exists on the $PATH but doesn't get returned using the command. I am thinking of buying a new computer soon, so I looked up the method of transferring the license key from one computer to another, in preparation for actually doing this. The method I found mentioned the files slmgr.vbs and slui.exe, both of which reside in the C:\Windows\System32 folder, which is in my $PATH, as usual for a Windows computer. However, these two files aren't showing up when I use the type command. Also, neither gets executed when I call the files as commands without their extensions in Git Bash, and only slmgr.vbs gets executed when I call them with the extensions. Finally, slmgr.vbs is shown when listing the folder's contents in Git Bash, as well, but slui.exe isn't. I thought this might have to do with permissions, and, indeed, both files have very restrictive permissions, as you can see in the pictures below, but they both have the same permissions, which wouldn't explain why one gets executed and the other doesn't when called directly, nor why one file is listed on the command line but the other isn't.

    C:\Windows\System32 folder, proving the files exist:

    File permissions for the Users and Administrators groups for the two files (they are identical):

    And the folder:

    type command and its output in Git Bash for the 2 files, plus listing the files in the folder (using grep to filter as the folder is huge), as well as listing part of the $PATH (keep in mind, for all these, that Git Bash changes the paths as they are displayed):

        Sean@MYPC ~ $ type -a slmgr
        sh.exe": type: slmgr: not found
        Sean@MYPC ~ $ type -a slmgr.vbs
        sh.exe": type: slmgr.vbs: not found
        Sean@MYPC ~ $ type -a slui
        sh.exe": type: slui: not found
        Sean@MYPC ~ $ type -a slui.exe
        sh.exe": type: slui.exe: not found
        Sean@MYPC ~ $ slmgr
        sh.exe": slmgr: command not found
        Sean@MYPC ~ $ slmgr.vbs
        /c/WINDOWS/system32/slmgr.vbs: line 2: syntax error near unexpected token `('
        /c/WINDOWS/system32/slmgr.vbs: line 2: `' Copyright (c) Microsoft Corporation. All rights reserved.'
        Sean@MYPC ~ $ slui
        sh.exe": slui: command not found
        Sean@MYPC ~ $ slui.exe
        sh.exe": slui.exe: command not found
        Sean@MYPC ~ $ ls /c/Windows/System32/slui.exe /c/Windows/System32/slmgr.vbs
        ls: /c/Windows/System32/slui.exe: No such file or directory
        /c/Windows/System32/slmgr.vbs
        Sean@MYPC ~ $ echo $PATH
        /c/Users/Sean/bin:.:/usr/local/bin:/mingw/bin:/bin:/cmd:/c/Python33/:/c/Program Files (x86)/Intel/iCLS Client/:/c/Program Files/Intel/iCLS Client/:/c/WINDOWS/system32:/c/WINDOWS:/c/WINDOWS/System32/Wbem:/c/WINDOWS/System32/WindowsPowerShell/v1.0/:/c/Program Files/Intel/Intel(R) Management Engine Components/DAL:/c/Program Files/Intel/Intel(R) Management Engine Components/IPT:/c/Program Files (x86)/Intel/Intel(R) Management Engine Components/DAL:/c/Program Files (x86)/Intel/Intel(R) Management Engine Components/IPT:/c/Program Files/Intel/WiFi/bin/:/c/Program Files/Common Files/Intel/WirelessCommon/:/c/strawberry/c/bin:/c/strawberry/perl/site/bin:/c/strawberry/perl/bin:/c/Program Files (x86)/Microsoft ASP.NET/ASP.NET Web Pages/v1.0/:/c/Program Files/Microsoft SQL Server/110/Tools/Binn/:/c/Program Files (x86)/Microsoft SQL Server/90/Tools/binn/:/c/Program Files (x86)/OpenAFS/Common:/c/HashiCorp/Vagrant/bin:/c/Program Files (x86)/Windows Kits/8.1/Windows Performance Toolkit/:/c/Program Files/nodejs/:/c/Program Files (x86)/Git/cmd:/c/Program Files (x86)/Git/bin:/c/Program Files/Microsoft/Web Platform Installer/:/c/Ruby200-x64/bin:/c/Users/Sean/AppData/Local/Box/Box Edit/:/c/Program Files (x86)/SSH Communications Security/SSH Secure Shell:/c/Users/Sean/Documents/Lisp:/c/Program Files/GCL-2.6.1/lib/gcl-2.6.1/unixport:/c/Chocolatey/bin:/c/Users/Sean/AppData/Roaming/npm:/c/wamp/bin/mysql/mysql5.6.12/bin:/c/Program Files/Oracle/VirtualBox:/c/Program Files/Java/jdk1.7.0_51/bin:/c/Program Files/Node-Growl:/c/chocolatey/bin:/c/Program Files/eclipse:/c/MongoDB/bin:/c/Program Files/7-Zip:/c/Program Files (x86)/Google/Chrome/Application:/c/Program Files (x86)/LibreOffice 4/program:/c/Program Files (x86)/OpenOffice 4/program

    What's happening? Why aren't these files listed with the type command? Is this issue because of weird Windows permissions, or something even weirder? If permissions, why do they seem to have the same permissions, yet both are not handled in the same way?
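
    One way to separate "Git Bash can't find it" from "this process genuinely can't see the file" is a tiny probe that walks the PATH of whatever process runs it and reports which entries contain the names in question. The class below is hypothetical (it has nothing to do with Git Bash internals); comparing its output when run from a 32-bit versus a 64-bit process would also show whether WOW64 file system redirection of System32 is hiding slui.exe from 32-bit programs, which is a common cause of exactly this "Explorer sees it, my shell doesn't" symptom:

        import java.io.File;

        // Hypothetical probe: for each directory on PATH, report whether it contains
        // the file names given on the command line, e.g. `java PathProbe slmgr.vbs slui.exe`.
        public class PathProbe {
            public static void main(String[] args) {
                String path = System.getenv("PATH");
                if (path == null || args.length == 0) {
                    System.err.println("usage: java PathProbe <filename>...");
                    return;
                }
                for (String dir : path.split(File.pathSeparator)) {
                    for (String name : args) {
                        File candidate = new File(dir, name);
                        if (candidate.isFile()) {
                            System.out.println("found " + candidate.getAbsolutePath());
                        }
                    }
                }
            }
        }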

    Read the article

  • How to overcome shortcomings in reporting from EAV database?

    - by David Archer
    The major shortcomings with Entity-Attribute-Value database designs in SQL all seem to be related to being able to query and report on the data efficiently and quickly. Most of the information I read on the subject warns against implementing EAV due to these problems and the commonality of querying/reporting for almost all applications. I am currently designing a system where almost all the fields necessary for data storage are not known at design/compile time and are defined by the end-user of the system. EAV seems like a good fit for this requirement, but due to the problems I've read about, I am hesitant to implement it as there are also some pretty heavy reporting requirements for this system as well. I think I've come up with a way around this but would like to pose the question to the SO community. Given that a typical normalized database (OLTP) still isn't always the best option for running reports, a good practice seems to be having a "reporting" database (OLAP) where the data from the normalized database is copied to, indexed extensively, and possibly denormalized for easier querying. Could the same idea be used to work around the shortcomings of an EAV design? The main downside I see is the increased complexity of transferring the data from the EAV database to reporting, as you may end up having to alter the tables in the reporting database as new fields are defined in the EAV database. But that is hardly impossible and seems to be an acceptable tradeoff for the increased flexibility given by the EAV design. This downside also exists if I use a non-SQL data store (i.e. CouchDB or similar) for the main data storage, since all the standard reporting tools are expecting a SQL backend to query against. Do the issues with EAV systems mostly go away if you have a separate reporting database for querying?

    EDIT: Thanks for the comments so far. One of the important things about the system I'm working on is that I'm really only talking about using EAV for one of the entities, not everything in the system. The whole gist of the system is to be able to pull data from multiple disparate sources that are not known ahead of time and crunch the data to come up with some "best known" data about a particular entity. So every "field" I'm dealing with is multi-valued and I'm also required to track history for each. The normalized design for this ends up being 1 table per field, which makes querying it kind of painful anyway.
    Here are the table schemas and sample data I'm looking at (obviously changed from what I'm working on but I think it illustrates the point well):

    EAV Tables

        Person
        -------------------
        - Id  - Name      -
        -------------------
        - 123 - Joe Smith -
        -------------------

        Person_Value
        -------------------------------------------------------------------
        - PersonId - Source - Field       - Value         - EffectiveDate -
        -------------------------------------------------------------------
        - 123      - CIA    - HomeAddress - 123 Cherry Ln - 2010-03-26    -
        - 123      - DMV    - HomeAddress - 561 Stoney Rd - 2010-02-15    -
        - 123      - FBI    - HomeAddress - 676 Lancas Dr - 2010-03-01    -
        -------------------------------------------------------------------

    Reporting Table

        Person_Denormalized
        ----------------------------------------------------------------------------------------
        - Id  - Name      - HomeAddress   - HomeAddress_Confidence - HomeAddress_EffectiveDate -
        ----------------------------------------------------------------------------------------
        - 123 - Joe Smith - 123 Cherry Ln - 0.713                  - 2010-03-26                -
        ----------------------------------------------------------------------------------------

    Normalized Design

        Person
        -------------------
        - Id  - Name      -
        -------------------
        - 123 - Joe Smith -
        -------------------

        Person_HomeAddress
        ------------------------------------------------------
        - PersonId - Source - Value         - Effective Date -
        ------------------------------------------------------
        - 123      - CIA    - 123 Cherry Ln - 2010-03-26     -
        - 123      - DMV    - 561 Stoney Rd - 2010-02-15     -
        - 123      - FBI    - 676 Lancas Dr - 2010-03-01     -
        ------------------------------------------------------

    The "Confidence" field here is generated using logic that cannot be expressed easily (if at all) using SQL so my most common operation besides inserting new values will be pulling ALL data about a person for all fields so I can generate the record for the reporting table. This is actually easier in the EAV model as I can do a single query. In the normalized design, I end up having to do 1 query per field to avoid a massive cartesian product from joining them all together.
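
    As a concrete sketch of the "single query" read that makes the EAV side workable here, something like the hypothetical plain-JDBC class below (names follow the sample schema above; the confidence calculation stays in application code) pulls every sourced value for one person in one round trip, which is the step that would otherwise need one query per field-specific table in the normalized design:

        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.util.ArrayList;
        import java.util.List;

        // Sketch of the single-query "pull everything about one person" read that the
        // EAV layout allows, feeding whatever in-memory confidence logic builds the
        // Person_Denormalized row. Table and column names follow the sample tables above.
        public class PersonValueLoader {

            public static final class ValueRow {
                public final String source, field, value, effectiveDate;
                ValueRow(String source, String field, String value, String effectiveDate) {
                    this.source = source; this.field = field;
                    this.value = value; this.effectiveDate = effectiveDate;
                }
            }

            // One round trip instead of one query per field-specific table.
            public List<ValueRow> loadAllValues(Connection con, long personId) throws Exception {
                String sql = "SELECT Source, Field, Value, EffectiveDate "
                           + "FROM Person_Value WHERE PersonId = ? "
                           + "ORDER BY Field, EffectiveDate DESC";
                List<ValueRow> rows = new ArrayList<>();
                try (PreparedStatement ps = con.prepareStatement(sql)) {
                    ps.setLong(1, personId);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            rows.add(new ValueRow(rs.getString(1), rs.getString(2),
                                                  rs.getString(3), rs.getString(4)));
                        }
                    }
                }
                return rows; // hand off to the (non-SQL) confidence calculation,
                             // then write/refresh the Person_Denormalized row.
            }
        }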

    Read the article

  • Openfire and LDAP issues

    - by clsmith
    Thanks in advance for the help. Has anyone seen this issue with Openfire? Currently I use Openfire on Fedora with auth against Windows 2003, and also use MySQL for the database. When I bring up two clients and talk to each other, the time between messages is slow. Sometimes it can take between 5-15 minutes for something sent to get to the person (this is with only two people on the Openfire server). I ran a tcpdump on port 389 and see that the machine is running thousands of queries against LDAP. When I plug it into Wireshark I notice that it is transferring the entire contact list, or checking on the status of the entire contact list? When I run debug on Openfire itself I am presented with only this small message in the log:

        2010.06.08 07:01:17 LdapManager: Starting LDAP search...
        2010.06.08 07:01:17 LdapManager: ... search finished
        2010.06.08 07:01:17 LdapManager: Creating a DirContext in LdapManager.getContext()...
        2010.06.08 07:01:17 LdapManager: Created hashtable with context values, attempting to create context...
        2010.06.08 07:01:17 LdapManager: ... context created successfully, returning.
        2010.06.08 07:01:17 LdapManager: Trying to find a groups's DN based on it's groupname. cn: Spark agents CLT, Base DN: OU="Hidden",DC="Hidden",DC="net"...
        2010.06.08 07:01:17 LdapManager: Creating a DirContext in LdapManager.getContext()...
        2010.06.08 07:01:17 LdapManager: Created hashtable with context values, attempting to create context...
        2010.06.08 07:01:17 LdapManager: ... context created successfully, returning.
        2010.06.08 07:01:17 LdapManager: Starting LDAP search...
        2010.06.08 07:01:17 LdapManager: ... search finished
        2010.06.08 07:01:17 LdapManager: Trying to find a groups's DN based on it's groupname. cn: Spark agents CLT, Base DN: OU="Hidden",DC="Hidden",DC="net"...
        2010.06.08 07:01:17 LdapManager: Creating a DirContext in LdapManager.getContext()...
        2010.06.08 07:01:17 LdapManager: Created hashtable with context values, attempting to create context...
        2010.06.08 07:01:17 LdapManager: ... context created successfully, returning.
        2010.06.08 07:01:17 LdapManager: Starting LDAP search...
        2010.06.08 07:01:17 LdapManager: ... search finished

    I thought this was a configuration issue on my end and started to look into the cache settings on the Openfire web pages. I tweaked the settings as recommended by the pages and still get the same issues. It doesn't seem to cache the contact list, or this might be a feature never fixed or implemented. Has anyone gone through this before? I have searched online and I see others have great experience with Openfire with no issues like I have, or is it because no one checked the queries? For the time being I created a new Domain Controller and moved Openfire to that computer so it can run local queries. This seems to help speed things up a lot, but when I run the server performance manager tool I see that, with only two people using that Openfire server, I run 593.7 requests per second. Thanks for your help; if I didn't provide enough data please let me know what you need and I can find it.

    Read the article

  • error about ACPI _OSC request failed (AE_NOT_FOUND)

    - by Yavuz Maslak
    I have ubuntu server 11.10 64 bit I see an error in kernel.log. This error comes out when the server reboot. some port of grep APCI in kernel.log; Dec 5 09:08:51 www kernel: [ 0.588605] pci0000:00: Requesting ACPI _OSC control (0x1d) Dec 5 09:08:51 www kernel: [ 0.588667] pci0000:00: ACPI _OSC request failed (AE_NOT_FOUND), returned control mask: 0x1d Dec 5 09:08:51 www kernel: [ 0.588746] ACPI _OSC control for PCIe not granted, disabling ASPM Which hardware may be cause this error ? root@www:# grep -r ACPI /var/log/kern.log Dec 5 09:08:51 www kernel: [ 0.000000] BIOS-e820: 00000000bf780000 - 00000000bf798000 (ACPI data) Dec 5 09:08:51 www kernel: [ 0.000000] BIOS-e820: 00000000bf798000 - 00000000bf7dc000 (ACPI NVS) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: RSDP 00000000000fb1a0 00014 (v00 ACPIAM) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: RSDT 00000000bf780000 00040 (v01 022410 RSDT1405 20100224 MSFT 00000097) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: FACP 00000000bf780200 00084 (v01 022410 FACP1405 20100224 MSFT 00000097) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: DSDT 00000000bf7804b0 0C359 (v01 A1279 A1279001 00000001 INTL 20060113) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: FACS 00000000bf798000 00040 Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: APIC 00000000bf780390 000D8 (v01 022410 APIC1405 20100224 MSFT 00000097) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: MCFG 00000000bf780470 0003C (v01 022410 OEMMCFG 20100224 MSFT 00000097) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: OEMB 00000000bf798040 00072 (v01 022410 OEMB1405 20100224 MSFT 00000097) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: HPET 00000000bf78f4b0 00038 (v01 022410 OEMHPET 20100224 MSFT 00000097) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: OSFR 00000000bf78f4f0 000B0 (v01 022410 OEMOSFR 20100224 MSFT 00000097) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: SSDT 00000000bf798fe0 00363 (v01 DpgPmm CpuPm 00000012 INTL 20060113) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: Local APIC address 0xfee00000 Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: PM-Timer IO Port: 0x808 Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: Local APIC address 0xfee00000 Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x01] lapic_id[0x00] enabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x02] lapic_id[0x02] enabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x03] lapic_id[0x04] enabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x04] lapic_id[0x06] enabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x05] lapic_id[0x84] disabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x06] lapic_id[0x85] disabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x07] lapic_id[0x86] disabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x08] lapic_id[0x87] disabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x09] lapic_id[0x88] disabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x0a] lapic_id[0x89] disabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x0b] lapic_id[0x8a] disabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x0c] lapic_id[0x8b] disabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x0d] lapic_id[0x8c] disabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x0e] lapic_id[0x8d] disabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x0f] lapic_id[0x8e] disabled) Dec 5 09:08:51 www kernel: [ 
0.000000] ACPI: LAPIC (acpi_id[0x10] lapic_id[0x8f] disabled) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: IOAPIC (id[0x01] address[0xfec00000] gsi_base[0]) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: IOAPIC (id[0x03] address[0xfec8a000] gsi_base[24]) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: IRQ0 used by override. Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: IRQ2 used by override. Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: IRQ9 used by override. Dec 5 09:08:51 www kernel: [ 0.000000] Using ACPI (MADT) for SMP configuration information Dec 5 09:08:51 www kernel: [ 0.000000] ACPI: HPET id: 0x8086a301 base: 0xfed00000 Dec 5 09:08:51 www kernel: [ 0.009507] ACPI: Core revision 20110413 Dec 5 09:08:51 www kernel: [ 0.499129] PM: Registering ACPI NVS region at bf798000 (278528 bytes) Dec 5 09:08:51 www kernel: [ 0.500749] ACPI: bus type pci registered Dec 5 09:08:51 www kernel: [ 0.502747] ACPI: EC: Look up EC in DSDT Dec 5 09:08:51 www kernel: [ 0.503788] ACPI: Executed 1 blocks of module-level executable AML code Dec 5 09:08:51 www kernel: [ 0.520435] ACPI: SSDT 00000000bf7980c0 00F20 (v01 DpgPmm P001Ist 00000011 INTL 20060113) Dec 5 09:08:51 www kernel: [ 0.520863] ACPI: Dynamic OEM Table Load: Dec 5 09:08:51 www kernel: [ 0.520990] ACPI: SSDT (null) 00F20 (v01 DpgPmm P001Ist 00000011 INTL 20060113) Dec 5 09:08:51 www kernel: [ 0.521308] ACPI: Interpreter enabled Dec 5 09:08:51 www kernel: [ 0.521366] ACPI: (supports S0 S1 S3 S4 S5) Dec 5 09:08:51 www kernel: [ 0.521611] ACPI: Using IOAPIC for interrupt routing Dec 5 09:08:51 www kernel: [ 0.522622] PCI: MMCONFIG at [mem 0xe0000000-0xefffffff] reserved in ACPI motherboard resources Dec 5 09:08:51 www kernel: [ 0.554150] ACPI: No dock devices found. 
Dec 5 09:08:51 www kernel: [ 0.554267] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Dec 5 09:08:51 www kernel: [ 0.555231] ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Dec 5 09:08:51 www kernel: [ 0.588224] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0._PRT] Dec 5 09:08:51 www kernel: [ 0.588398] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.P0P1._PRT] Dec 5 09:08:51 www kernel: [ 0.588451] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.P0P4._PRT] Dec 5 09:08:51 www kernel: [ 0.588473] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.P0P6._PRT] Dec 5 09:08:51 www kernel: [ 0.588492] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.P0P7._PRT] Dec 5 09:08:51 www kernel: [ 0.588512] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.P0P8._PRT] Dec 5 09:08:51 www kernel: [ 0.588540] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.NPE1._PRT] Dec 5 09:08:51 www kernel: [ 0.588559] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.NPE3._PRT] Dec 5 09:08:51 www kernel: [ 0.588579] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.NPE7._PRT] Dec 5 09:08:51 www kernel: [ 0.588605] pci0000:00: Requesting ACPI _OSC control (0x1d) Dec 5 09:08:51 www kernel: [ 0.588667] pci0000:00: ACPI _OSC request failed (AE_NOT_FOUND), returned control mask: 0x1d Dec 5 09:08:51 www kernel: [ 0.588746] ACPI _OSC control for PCIe not granted, disabling ASPM Dec 5 09:08:51 www kernel: [ 0.597666] ACPI: PCI Interrupt Link [LNKA] (IRQs 3 4 6 7 10 11 12 14 *15) Dec 5 09:08:51 www kernel: [ 0.598142] ACPI: PCI Interrupt Link [LNKB] (IRQs *5) Dec 5 09:08:51 www kernel: [ 0.598336] ACPI: PCI Interrupt Link [LNKC] (IRQs 3 4 6 7 10 *11 12 14 15) Dec 5 09:08:51 www kernel: [ 0.598810] ACPI: PCI Interrupt Link [LNKD] (IRQs 3 4 6 7 *10 11 12 14 15) Dec 5 09:08:51 www kernel: [ 0.599284] ACPI: PCI Interrupt Link [LNKE] (IRQs 3 4 6 7 10 11 12 *14 15) Dec 5 09:08:51 www kernel: [ 0.599762] ACPI: PCI Interrupt Link [LNKF] (IRQs *3 4 6 7 10 11 12 14 15) Dec 5 09:08:51 www kernel: [ 0.600236] ACPI: PCI Interrupt Link [LNKG] (IRQs 3 4 6 *7 10 11 12 14 15) Dec 5 09:08:51 www kernel: [ 0.600709] ACPI: PCI Interrupt Link [LNKH] (IRQs 3 *4 6 7 10 11 12 14 15) Dec 5 09:08:51 www kernel: [ 0.601931] PCI: Using ACPI for IRQ routing Dec 5 09:08:51 www kernel: [ 0.628146] pnp: PnP ACPI init Dec 5 09:08:51 www kernel: [ 0.628211] ACPI: bus type pnp registered Dec 5 09:08:51 www kernel: [ 0.628417] pnp 00:00: Plug and Play ACPI device, IDs PNP0a08 PNP0a03 (active) Dec 5 09:08:51 www kernel: [ 0.628859] system 00:01: Plug and Play ACPI device, IDs PNP0c01 (active) Dec 5 09:08:51 www kernel: [ 0.628915] pnp 00:02: Plug and Play ACPI device, IDs PNP0200 (active) Dec 5 09:08:51 www kernel: [ 0.628951] pnp 00:03: Plug and Play ACPI device, IDs PNP0b00 (active) Dec 5 09:08:51 www kernel: [ 0.628975] pnp 00:04: Plug and Play ACPI device, IDs PNP0800 (active) Dec 5 09:08:51 www kernel: [ 0.629004] pnp 00:05: Plug and Play ACPI device, IDs PNP0c04 (active) Dec 5 09:08:51 www kernel: [ 0.629229] system 00:06: Plug and Play ACPI device, IDs PNP0c02 (active) Dec 5 09:08:51 www kernel: [ 0.629779] system 00:07: Plug and Play ACPI device, IDs PNP0c02 (active) Dec 5 09:08:51 www kernel: [ 0.629849] pnp 00:08: Plug and Play ACPI device, IDs PNP0103 (active) Dec 5 09:08:51 www kernel: [ 0.629901] pnp 00:09: Plug and Play ACPI device, IDs INT0800 (active) Dec 5 09:08:51 www kernel: [ 0.630030] system 00:0a: Plug and Play ACPI device, IDs PNP0c02 (active) Dec 5 09:08:51 www kernel: [ 0.630254] system 00:0b: Plug and Play ACPI device, 
IDs PNP0c02 (active) Dec 5 09:08:51 www kernel: [ 0.630304] pnp 00:0c: Plug and Play ACPI device, IDs PNP0303 PNP030b (active) Dec 5 09:08:51 www kernel: [ 0.630359] pnp 00:0d: Plug and Play ACPI device, IDs PNP0f03 PNP0f13 (active) Dec 5 09:08:51 www kernel: [ 0.630492] system 00:0e: Plug and Play ACPI device, IDs PNP0c02 (active) Dec 5 09:08:51 www kernel: [ 0.630986] system 00:0f: Plug and Play ACPI device, IDs PNP0c01 (active) Dec 5 09:08:51 www kernel: [ 0.631078] pnp: PnP ACPI: found 16 devices Dec 5 09:08:51 www kernel: [ 0.631135] ACPI: ACPI bus type pnp unregistered Dec 5 09:08:51 www kernel: [ 0.726291] ACPI: Power Button [PWRB] Dec 5 09:08:51 www kernel: [ 0.726452] ACPI: Power Button [PWRF] Dec 5 09:08:51 www kernel: [ 0.726527] ACPI: acpi_idle yielding to intel_idle Dec 7 21:45:22 www kernel: [ 0.000000] BIOS-e820: 00000000bf780000 - 00000000bf798000 (ACPI data) Dec 7 21:45:22 www kernel: [ 0.000000] BIOS-e820: 00000000bf798000 - 00000000bf7dc000 (ACPI NVS) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: RSDP 00000000000fb1a0 00014 (v00 ACPIAM) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: RSDT 00000000bf780000 00040 (v01 022410 RSDT1405 20100224 MSFT 00000097) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: FACP 00000000bf780200 00084 (v01 022410 FACP1405 20100224 MSFT 00000097) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: DSDT 00000000bf7804b0 0C359 (v01 A1279 A1279001 00000001 INTL 20060113) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: FACS 00000000bf798000 00040 Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: APIC 00000000bf780390 000D8 (v01 022410 APIC1405 20100224 MSFT 00000097) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: MCFG 00000000bf780470 0003C (v01 022410 OEMMCFG 20100224 MSFT 00000097) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: OEMB 00000000bf798040 00072 (v01 022410 OEMB1405 20100224 MSFT 00000097) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: HPET 00000000bf78f4b0 00038 (v01 022410 OEMHPET 20100224 MSFT 00000097) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: OSFR 00000000bf78f4f0 000B0 (v01 022410 OEMOSFR 20100224 MSFT 00000097) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: SSDT 00000000bf798fe0 00363 (v01 DpgPmm CpuPm 00000012 INTL 20060113) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: Local APIC address 0xfee00000 Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: PM-Timer IO Port: 0x808 Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: Local APIC address 0xfee00000 Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x01] lapic_id[0x00] enabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x02] lapic_id[0x02] enabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x03] lapic_id[0x04] enabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x04] lapic_id[0x06] enabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x05] lapic_id[0x84] disabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x06] lapic_id[0x85] disabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x07] lapic_id[0x86] disabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x08] lapic_id[0x87] disabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x09] lapic_id[0x88] disabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x0a] lapic_id[0x89] disabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x0b] lapic_id[0x8a] disabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x0c] lapic_id[0x8b] disabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: 
LAPIC (acpi_id[0x0d] lapic_id[0x8c] disabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x0e] lapic_id[0x8d] disabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x0f] lapic_id[0x8e] disabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: LAPIC (acpi_id[0x10] lapic_id[0x8f] disabled) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: IOAPIC (id[0x01] address[0xfec00000] gsi_base[0]) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: IOAPIC (id[0x03] address[0xfec8a000] gsi_base[24]) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: IRQ0 used by override. Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: IRQ2 used by override. Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: IRQ9 used by override. Dec 7 21:45:22 www kernel: [ 0.000000] Using ACPI (MADT) for SMP configuration information Dec 7 21:45:22 www kernel: [ 0.000000] ACPI: HPET id: 0x8086a301 base: 0xfed00000 Dec 7 21:45:22 www kernel: [ 0.009505] ACPI: Core revision 20110413 Dec 7 21:45:22 www kernel: [ 0.499203] PM: Registering ACPI NVS region at bf798000 (278528 bytes) Dec 7 21:45:22 www kernel: [ 0.500819] ACPI: bus type pci registered Dec 7 21:45:22 www kernel: [ 0.503121] ACPI: EC: Look up EC in DSDT Dec 7 21:45:22 www kernel: [ 0.504162] ACPI: Executed 1 blocks of module-level executable AML code Dec 7 21:45:22 www kernel: [ 0.520821] ACPI: SSDT 00000000bf7980c0 00F20 (v01 DpgPmm P001Ist 00000011 INTL 20060113) Dec 7 21:45:22 www kernel: [ 0.521247] ACPI: Dynamic OEM Table Load: Dec 7 21:45:22 www kernel: [ 0.521374] ACPI: SSDT (null) 00F20 (v01 DpgPmm P001Ist 00000011 INTL 20060113) Dec 7 21:45:22 www kernel: [ 0.521691] ACPI: Interpreter enabled Dec 7 21:45:22 www kernel: [ 0.521748] ACPI: (supports S0 S1 S3 S4 S5) Dec 7 21:45:22 www kernel: [ 0.521993] ACPI: Using IOAPIC for interrupt routing Dec 7 21:45:22 www kernel: [ 0.523002] PCI: MMCONFIG at [mem 0xe0000000-0xefffffff] reserved in ACPI motherboard resources Dec 7 21:45:22 www kernel: [ 0.554533] ACPI: No dock devices found. 
Dec 7 21:45:22 www kernel: [ 0.554649] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Dec 7 21:45:22 www kernel: [ 0.555620] ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Dec 7 21:45:22 www kernel: [ 0.588224] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0._PRT] Dec 7 21:45:22 www kernel: [ 0.588398] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.P0P1._PRT] Dec 7 21:45:22 www kernel: [ 0.588451] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.P0P4._PRT] Dec 7 21:45:22 www kernel: [ 0.588473] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.P0P6._PRT] Dec 7 21:45:22 www kernel: [ 0.588492] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.P0P7._PRT] Dec 7 21:45:22 www kernel: [ 0.588512] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.P0P8._PRT] Dec 7 21:45:22 www kernel: [ 0.588540] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.NPE1._PRT] Dec 7 21:45:22 www kernel: [ 0.588559] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.NPE3._PRT] Dec 7 21:45:22 www kernel: [ 0.588579] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.NPE7._PRT] Dec 7 21:45:22 www kernel: [ 0.588606] pci0000:00: Requesting ACPI _OSC control (0x1d) Dec 7 21:45:22 www kernel: [ 0.588667] pci0000:00: ACPI _OSC request failed (AE_NOT_FOUND), returned control mask: 0x1d Dec 7 21:45:22 www kernel: [ 0.588746] ACPI _OSC control for PCIe not granted, disabling ASPM Dec 7 21:45:22 www kernel: [ 0.597661] ACPI: PCI Interrupt Link [LNKA] (IRQs 3 4 6 7 10 11 12 14 *15) Dec 7 21:45:22 www kernel: [ 0.598137] ACPI: PCI Interrupt Link [LNKB] (IRQs *5) Dec 7 21:45:22 www kernel: [ 0.598331] ACPI: PCI Interrupt Link [LNKC] (IRQs 3 4 6 7 10 *11 12 14 15) Dec 7 21:45:22 www kernel: [ 0.598804] ACPI: PCI Interrupt Link [LNKD] (IRQs 3 4 6 7 *10 11 12 14 15) Dec 7 21:45:22 www kernel: [ 0.599278] ACPI: PCI Interrupt Link [LNKE] (IRQs 3 4 6 7 10 11 12 *14 15) Dec 7 21:45:22 www kernel: [ 0.599756] ACPI: PCI Interrupt Link [LNKF] (IRQs *3 4 6 7 10 11 12 14 15) Dec 7 21:45:22 www kernel: [ 0.600230] ACPI: PCI Interrupt Link [LNKG] (IRQs 3 4 6 *7 10 11 12 14 15) Dec 7 21:45:22 www kernel: [ 0.600704] ACPI: PCI Interrupt Link [LNKH] (IRQs 3 *4 6 7 10 11 12 14 15) Dec 7 21:45:22 www kernel: [ 0.601926] PCI: Using ACPI for IRQ routing Dec 7 21:45:22 www kernel: [ 0.624115] pnp: PnP ACPI init Dec 7 21:45:22 www kernel: [ 0.624179] ACPI: bus type pnp registered Dec 7 21:45:22 www kernel: [ 0.624382] pnp 00:00: Plug and Play ACPI device, IDs PNP0a08 PNP0a03 (active) Dec 7 21:45:22 www kernel: [ 0.624821] system 00:01: Plug and Play ACPI device, IDs PNP0c01 (active) Dec 7 21:45:22 www kernel: [ 0.624875] pnp 00:02: Plug and Play ACPI device, IDs PNP0200 (active) Dec 7 21:45:22 www kernel: [ 0.624911] pnp 00:03: Plug and Play ACPI device, IDs PNP0b00 (active) Dec 7 21:45:22 www kernel: [ 0.624933] pnp 00:04: Plug and Play ACPI device, IDs PNP0800 (active) Dec 7 21:45:22 www kernel: [ 0.624962] pnp 00:05: Plug and Play ACPI device, IDs PNP0c04 (active) Dec 7 21:45:22 www kernel: [ 0.625186] system 00:06: Plug and Play ACPI device, IDs PNP0c02 (active) Dec 7 21:45:22 www kernel: [ 0.625733] system 00:07: Plug and Play ACPI device, IDs PNP0c02 (active) Dec 7 21:45:22 www kernel: [ 0.625803] pnp 00:08: Plug and Play ACPI device, IDs PNP0103 (active) Dec 7 21:45:22 www kernel: [ 0.625856] pnp 00:09: Plug and Play ACPI device, IDs INT0800 (active) Dec 7 21:45:22 www kernel: [ 0.625984] system 00:0a: Plug and Play ACPI device, IDs PNP0c02 (active) Dec 7 21:45:22 www kernel: [ 0.626206] system 00:0b: Plug and Play ACPI device, 
IDs PNP0c02 (active) Dec 7 21:45:22 www kernel: [ 0.626256] pnp 00:0c: Plug and Play ACPI device, IDs PNP0303 PNP030b (active) Dec 7 21:45:22 www kernel: [ 0.626312] pnp 00:0d: Plug and Play ACPI device, IDs PNP0f03 PNP0f13 (active) Dec 7 21:45:22 www kernel: [ 0.626445] system 00:0e: Plug and Play ACPI device, IDs PNP0c02 (active) Dec 7 21:45:22 www kernel: [ 0.626936] system 00:0f: Plug and Play ACPI device, IDs PNP0c01 (active) Dec 7 21:45:22 www kernel: [ 0.627027] pnp: PnP ACPI: found 16 devices Dec 7 21:45:22 www kernel: [ 0.627084] ACPI: ACPI bus type pnp unregistered Dec 7 21:45:22 www kernel: [ 0.722086] ACPI: Power Button [PWRB] Dec 7 21:45:22 www kernel: [ 0.722246] ACPI: Power Button [PWRF] Dec 7 21:45:22 www kernel: [ 0.722320] ACPI: acpi_idle yielding to intel_idle

    Read the article

  • Inheritance Mapping Strategies with Entity Framework Code First CTP5: Part 3 – Table per Concrete Type (TPC) and Choosing Strategy Guidelines

    - by mortezam
    This is the third (and last) post in a series that explains different approaches to map an inheritance hierarchy with EF Code First. I've described these strategies in previous posts: Part 1 – Table per Hierarchy (TPH) Part 2 – Table per Type (TPT)In today’s blog post I am going to discuss Table per Concrete Type (TPC) which completes the inheritance mapping strategies supported by EF Code First. At the end of this post I will provide some guidelines to choose an inheritance strategy mainly based on what we've learned in this series. TPC and Entity Framework in the Past Table per Concrete type is somehow the simplest approach suggested, yet using TPC with EF is one of those concepts that has not been covered very well so far and I've seen in some resources that it was even discouraged. The reason for that is just because Entity Data Model Designer in VS2010 doesn't support TPC (even though the EF runtime does). That basically means if you are following EF's Database-First or Model-First approaches then configuring TPC requires manually writing XML in the EDMX file which is not considered to be a fun practice. Well, no more. You'll see that with Code First, creating TPC is perfectly possible with fluent API just like other strategies and you don't need to avoid TPC due to the lack of designer support as you would probably do in other EF approaches. Table per Concrete Type (TPC)In Table per Concrete type (aka Table per Concrete class) we use exactly one table for each (nonabstract) class. All properties of a class, including inherited properties, can be mapped to columns of this table, as shown in the following figure: As you can see, the SQL schema is not aware of the inheritance; effectively, we’ve mapped two unrelated tables to a more expressive class structure. If the base class was concrete, then an additional table would be needed to hold instances of that class. I have to emphasize that there is no relationship between the database tables, except for the fact that they share some similar columns. TPC Implementation in Code First Just like the TPT implementation, we need to specify a separate table for each of the subclasses. We also need to tell Code First that we want all of the inherited properties to be mapped as part of this table. In CTP5, there is a new helper method on EntityMappingConfiguration class called MapInheritedProperties that exactly does this for us. 
Here is the complete object model as well as the fluent API to create a TPC mapping: public abstract class BillingDetail {     public int BillingDetailId { get; set; }     public string Owner { get; set; }     public string Number { get; set; } }          public class BankAccount : BillingDetail {     public string BankName { get; set; }     public string Swift { get; set; } }          public class CreditCard : BillingDetail {     public int CardType { get; set; }     public string ExpiryMonth { get; set; }     public string ExpiryYear { get; set; } }      public class InheritanceMappingContext : DbContext {     public DbSet<BillingDetail> BillingDetails { get; set; }              protected override void OnModelCreating(ModelBuilder modelBuilder)     {         modelBuilder.Entity<BankAccount>().Map(m =>         {             m.MapInheritedProperties();             m.ToTable("BankAccounts");         });         modelBuilder.Entity<CreditCard>().Map(m =>         {             m.MapInheritedProperties();             m.ToTable("CreditCards");         });                 } } The Importance of EntityMappingConfiguration ClassAs a side note, it worth mentioning that EntityMappingConfiguration class turns out to be a key type for inheritance mapping in Code First. Here is an snapshot of this class: namespace System.Data.Entity.ModelConfiguration.Configuration.Mapping {     public class EntityMappingConfiguration<TEntityType> where TEntityType : class     {         public ValueConditionConfiguration Requires(string discriminator);         public void ToTable(string tableName);         public void MapInheritedProperties();     } } As you have seen so far, we used its Requires method to customize TPH. We also used its ToTable method to create a TPT and now we are using its MapInheritedProperties along with ToTable method to create our TPC mapping. TPC Configuration is Not Done Yet!We are not quite done with our TPC configuration and there is more into this story even though the fluent API we saw perfectly created a TPC mapping for us in the database. To see why, let's start working with our object model. For example, the following code creates two new objects of BankAccount and CreditCard types and tries to add them to the database: using (var context = new InheritanceMappingContext()) {     BankAccount bankAccount = new BankAccount();     CreditCard creditCard = new CreditCard() { CardType = 1 };                      context.BillingDetails.Add(bankAccount);     context.BillingDetails.Add(creditCard);     context.SaveChanges(); } Running this code throws an InvalidOperationException with this message: The changes to the database were committed successfully, but an error occurred while updating the object context. The ObjectContext might be in an inconsistent state. Inner exception message: AcceptChanges cannot continue because the object's key values conflict with another object in the ObjectStateManager. Make sure that the key values are unique before calling AcceptChanges. The reason we got this exception is because DbContext.SaveChanges() internally invokes SaveChanges method of its internal ObjectContext. ObjectContext's SaveChanges method on its turn by default calls AcceptAllChanges after it has performed the database modifications. AcceptAllChanges method merely iterates over all entries in ObjectStateManager and invokes AcceptChanges on each of them. 
Since the entities are in Added state, AcceptChanges method replaces their temporary EntityKey with a regular EntityKey based on the primary key values (i.e. BillingDetailId) that come back from the database and that's where the problem occurs since both the entities have been assigned the same value for their primary key by the database (i.e. on both BillingDetailId = 1) and the problem is that ObjectStateManager cannot track objects of the same type (i.e. BillingDetail) with the same EntityKey value hence it throws. If you take a closer look at the TPC's SQL schema above, you'll see why the database generated the same values for the primary keys: the BillingDetailId column in both BankAccounts and CreditCards table has been marked as identity. How to Solve The Identity Problem in TPC As you saw, using SQL Server’s int identity columns doesn't work very well together with TPC since there will be duplicate entity keys when inserting in subclasses tables with all having the same identity seed. Therefore, to solve this, either a spread seed (where each table has its own initial seed value) will be needed, or a mechanism other than SQL Server’s int identity should be used. Some other RDBMSes have other mechanisms allowing a sequence (identity) to be shared by multiple tables, and something similar can be achieved with GUID keys in SQL Server. While using GUID keys, or int identity keys with different starting seeds will solve the problem but yet another solution would be to completely switch off identity on the primary key property. As a result, we need to take the responsibility of providing unique keys when inserting records to the database. We will go with this solution since it works regardless of which database engine is used. Switching Off Identity in Code First We can switch off identity simply by placing DatabaseGenerated attribute on the primary key property and pass DatabaseGenerationOption.None to its constructor. DatabaseGenerated attribute is a new data annotation which has been added to System.ComponentModel.DataAnnotations namespace in CTP5: public abstract class BillingDetail {     [DatabaseGenerated(DatabaseGenerationOption.None)]     public int BillingDetailId { get; set; }     public string Owner { get; set; }     public string Number { get; set; } } As always, we can achieve the same result by using fluent API, if you prefer that: modelBuilder.Entity<BillingDetail>()             .Property(p => p.BillingDetailId)             .HasDatabaseGenerationOption(DatabaseGenerationOption.None); Working With The Object Model Our TPC mapping is ready and we can try adding new records to the database. But, like I said, now we need to take care of providing unique keys when creating new objects: using (var context = new InheritanceMappingContext()) {     BankAccount bankAccount = new BankAccount()      {          BillingDetailId = 1                          };     CreditCard creditCard = new CreditCard()      {          BillingDetailId = 2,         CardType = 1     };                      context.BillingDetails.Add(bankAccount);     context.BillingDetails.Add(creditCard);     context.SaveChanges(); } Polymorphic Associations with TPC is Problematic The main problem with this approach is that it doesn’t support Polymorphic Associations very well. 
After all, in the database, associations are represented as foreign key relationships and in TPC, the subclasses are all mapped to different tables, so a polymorphic association to their base class (abstract BillingDetail in our example) cannot be represented as a simple foreign key relationship. For example, consider the domain model we introduced here where User has a polymorphic association with BillingDetail. This would be problematic in our TPC schema, because if User has a many-to-one relationship with BillingDetail, the Users table would need a single foreign key column which would have to refer to both concrete subclass tables. This isn't possible with regular foreign key constraints.
Schema Evolution with TPC is Complex
A further conceptual problem with this mapping strategy is that several different columns, of different tables, share exactly the same semantics. This makes schema evolution more complex. For example, a change to a base class property results in changes to multiple columns. It also makes it much more difficult to implement database integrity constraints that apply to all subclasses.
Generated SQL
Let's examine the SQL output for polymorphic queries in TPC mapping. For example, consider this polymorphic query for all BillingDetails and the resulting SQL statements that are executed in the database: var query = from b in context.BillingDetails select b; Just like the SQL query generated by TPT mapping, the CASE statements at the beginning of the query are merely there to ensure that columns which are irrelevant for a particular row have NULL values in the returning flattened table (e.g. BankName for a row that represents a CreditCard type).
TPC's SQL Queries are Union Based
In the generated SQL, the first SELECT uses a FROM-clause subquery to retrieve all instances of BillingDetails from all concrete class tables. The tables are combined with a UNION operator, and a literal (in this case, 0 and 1) is inserted into the intermediate result; EF reads this to instantiate the correct class given the data from a particular row. A union requires that the queries that are combined project over the same columns; hence, EF has to pad and fill up nonexistent columns with NULL. This query will perform well since we can let the database optimizer find the best execution plan to combine rows from several tables. There are also no joins involved, so it performs better than the SQL queries generated by TPT, where a join is required between the base and subclass tables.
Choosing Strategy Guidelines
Before we get into this discussion, I want to emphasize that there is no single "best strategy that fits all scenarios". As you saw, each of the approaches has its own advantages and drawbacks. Here are some rules of thumb to identify the best strategy in a particular scenario:
If you don't require polymorphic associations or queries, lean toward TPC—in other words, if you never or rarely query for BillingDetails and you have no class that has an association to the BillingDetail base class. I recommend TPC (only) for the top level of your class hierarchy, where polymorphism isn't usually required, and when modification of the base class in the future is unlikely.
If you do require polymorphic associations or queries, and subclasses declare relatively few properties (particularly if the main difference between subclasses is in their behavior), lean toward TPH. Your goal is to minimize the number of nullable columns and to convince yourself (and your DBA) that a denormalized schema won't create problems in the long run.
If you do require polymorphic associations or queries, and subclasses declare many properties (subclasses differ mainly by the data they hold), lean toward TPT. Or, depending on the width and depth of your inheritance hierarchy and the possible cost of joins versus unions, use TPC.
By default, choose TPH only for simple problems. For more complex cases (or when you're overruled by a data modeler insisting on the importance of nullability constraints and normalization), you should consider the TPT strategy. But at that point, ask yourself whether it may not be better to remodel inheritance as delegation in the object model (delegation is a way of making composition as powerful for reuse as inheritance). Complex inheritance is often best avoided for all sorts of reasons unrelated to persistence or ORM. EF acts as a buffer between the domain and relational models, but that doesn't mean you can ignore persistence concerns when designing your classes.
Summary
In this series, we focused on one of the main structural aspects of the object/relational paradigm mismatch, which is inheritance, and discussed how EF solves this problem as an ORM solution. We learned about the three well-known inheritance mapping strategies and their implementations in EF Code First. Hopefully it gives you a better insight into the mapping of inheritance hierarchies as well as choosing the best strategy for your particular scenario. Happy New Year and Happy Code-Firsting!
References: ADO.NET team blog; Java Persistence with Hibernate (book)

    Read the article

  • URL Rewrite – Multiple domains under one site. Part II

    - by OWScott
    I believe I have it … I’ve been meaning to put together the ultimate outgoing rule for hosting multiple domains under one site.  I finally sat down this week and setup a few test cases, and created one rule to rule them all.  In Part I of this two part series, I covered the incoming rule necessary to host a site in a subfolder of a website, while making it appear as if it’s in the root of the site.  Part II won’t work without applying Part I first, so if you haven’t read it, I encourage you to read it now. However, the incoming rule by itself doesn’t address everything.  Here’s the problem … Let’s say that we host www.site2.com in a subfolder called site2, off of masterdomain.com.  This is the same example I used in Part I.   Using an incoming rewrite rule, we are able to make a request to www.site2.com even though the site is really in the /site2 folder.  The gotcha comes with any type of path that ASP.NET generates (I’m sure other scripting technologies could do the same too).  ASP.NET thinks that the path to the root of the site is /site2, but the URL is /.  See the issue?  If ASP.NET generates a path or a redirect for us, it will always add /site2 to the URL.  That results in a path that looks something like www.site2.com/site2.  In Part I, I mentioned that you should add a condition where “{PATH_INFO} ‘does not match’ /site2”.  That allows www.site2.com/site2 and www.site2.com to both function the same.  This allows the site to always work, but if you want to hide /site2 in the URL, you need to take it one step further. One way to address this is in your code.  Ultimately this is the best bet.  Ruslan Yakushev has a great article on a few considerations that you can address in code.  I recommend giving that serious consideration.  Additionally, if you have upgraded to ASP.NET 3.5 SP1 or greater, it takes care of some of the references automatically for you. However, what if you inherit an existing application?  Or you can’t easily go through your existing site and make the code changes?  If this applies to you, read on. That’s where URL Rewrite 2.0 comes in.  With URL Rewrite 2.0, you can create an outgoing rule that will remove the /site2 before the page is sent back to the user.  This means that you can take an existing application, host it in a subfolder of your site, and ensure that the URL never reveals that it’s in a subfolder. Performance Considerations Performance overhead is something to be mindful of.  These outbound rules aren’t simply changing the server variables.  The first rule I’ll cover below needs to parse the HTML body and pull out the path (i.e. /site2) on the way through.  This will add overhead, possibly significant if you have large pages and a busy site.  In other words, your mileage may vary and you may need to test to see the impact that these rules have.  Don’t worry too much though.  For many sites, the performance impact is negligible. So, how do we do it? Creating the Outgoing Rule There are really two things to keep in mind.  First, ASP.NET applications frequently generate a URL that adds the /site2 back into the URL.  In addition to URLs, they can be in form elements, img elements and the like.  The goal is to find all of those situations and rewrite it on the way out.  Let’s call this the ‘URL problem’. Second, and similarly, ASP.NET can send a LOCATION redirect that causes a redirect back to another page.  Again, ASP.NET isn’t aware of the different URL and it will add the /site2 to the redirect.  
Form Authentication is a good example of when this occurs. Try to password protect a site running from a subfolder using forms auth and you'll quickly find that the URL becomes www.site2.com/site2 again. Let's term this the 'redirect problem'.
Solving the URL Problem – Outgoing Rule #1
Let's create a rule that removes the /site2 from any URL. We want to remove it from relative URLs like /site2/something, or absolute URLs like http://www.site2.com/site2/something. Most URLs that ASP.NET creates will be relative URLs, but I figure that there may be some applications that piece together a full URL, so we might as well expect that situation. Let's get started. First, create a new outbound rule. You can create the rule within the /site2 folder, which will reduce the performance impact of the rule. Just a reminder that incoming rules for this situation won't work in a subfolder … but outgoing rules will. Give it a name that makes sense to you, for example "Outgoing – URL paths".
Precondition. If you place the rule in the subfolder, it will only run for that site and folder, so there is no need for a precondition. Run it for all requests. If you place it in the root of the site, you may want to create a precondition for HTTP_HOST = ^(www\.)?site2\.com$.
For the Match section, there are a few things to consider. For performance reasons, it's best to match the fewest elements that you need to accomplish the task. For my test cases, I just needed to rewrite the <a /> tag, but you may need to rewrite any number of HTML elements. Note that as long as you have the exclude /site2 rule in your incoming rule as I described in Part I, some elements that don't show their URL—like your images—will work without removing the /site2 from them. That reduces the processing needed for this rule. Leave the "matching scope" at "Response" and choose the elements that you want to change. Set the pattern to "^(?:site2|(.*//[_a-zA-Z0-9-\.]*)?/site2)(.*)". Make sure to replace 'site2' with your subfolder name in both places. Yes, I realize this is a pretty messy looking rule, but it handles a few situations. This rule will handle the following situations correctly:
Original -> Rewritten using {R:1}{R:2}
http://www.site2.com/site2/default.aspx -> http://www.site2.com/default.aspx
http://www.site2.com/folder1/site2/default.aspx -> Won't rewrite since it's a sub-sub folder
/site2/default.aspx -> /default.aspx
site2/default.aspx -> /default.aspx
/folder1/site2/default.aspx -> Won't rewrite since it's a sub-sub folder
For the conditions section, you can leave that be. Finally, for the rule, set the Action Type to "Rewrite" and set the Value to "{R:1}{R:2}". The {R:1} and {R:2} are back references to the sections within parentheses. In other words, in http://domain.com/site2/something, {R:1} will be http://domain.com and {R:2} will be /something. If you view your rule from your web.config file (or applicationHost.config if it's a global rule), it should look like this:
<rule name="Outgoing - URL paths" enabled="true">
  <match filterByTags="A" pattern="^(?:site2|(.*//[_a-zA-Z0-9-\.]*)?/site2)(.*)" />
  <action type="Rewrite" value="{R:1}{R:2}" />
</rule>
Solving the Redirect Problem
Outgoing Rule #2
The second issue that we can run into is with a client-side redirect. This is triggered by a LOCATION response header that is sent to the client. Forms authentication is a common example. To reproduce this, password protect your subfolder and watch how it redirects and adds the subfolder path back in. Notice in my test case the extra paths: http://site2.com/site2/login.aspx?ReturnUrl=%2fsite2%2fdefault.aspx
I want to remove /site2 from both the URL and the ReturnUrl querystring value. For semi-readability, let's do this in 2 separate rules, one for the URL and one for the querystring. Create a second rule. As with the previous rule, it can be created in the /site2 subfolder. In the URL Rewrite wizard, select Outbound rules –> "Blank Rule". Fill in the following information:
Name: response_location URL
Precondition: Don't set
Match: Matching Scope: Server Variable
Match: Variable Name: RESPONSE_LOCATION
Match: Pattern: ^(?:site2|(.*//[_a-zA-Z0-9-\.]*)?/site2)(.*)
Conditions: Don't set
Action Type: Rewrite
Action Properties: {R:1}{R:2}
It should end up like so:
<rule name="response_location URL">
  <match serverVariable="RESPONSE_LOCATION" pattern="^(?:site2|(.*//[_a-zA-Z0-9-\.]*)?/site2)(.*)" />
  <action type="Rewrite" value="{R:1}{R:2}" />
</rule>
Outgoing Rule #3
Outgoing Rule #2 only takes care of the URL path, and not the querystring path. Let's create one final rule to take care of the path in the querystring to ensure that ReturnUrl=%2fsite2%2fdefault.aspx gets rewritten to ReturnUrl=%2fdefault.aspx. The %2f is the URL encoding for a forward slash (/). Create a rule like the previous one, but with the following settings:
Name: response_location querystring
Precondition: Don't set
Match: Matching Scope: Server Variable
Match: Variable Name: RESPONSE_LOCATION
Match: Pattern: (.*)%2fsite2(.*)
Conditions: Don't set
Action Type: Rewrite
Action Properties: {R:1}{R:2}
The config should look like this:
<rule name="response_location querystring">
  <match serverVariable="RESPONSE_LOCATION" pattern="(.*)%2fsite2(.*)" />
  <action type="Rewrite" value="{R:1}{R:2}" />
</rule>
It's possible to squeeze the last two rules into one, but it gets kind of confusing so I felt that it's better to show it as two separate rules.
Summary
With the rules covered in these two parts, we're able to have a site in a subfolder and make it appear as if it's in the root of the site. Not only that, we can overcome automatic redirecting that is caused by ASP.NET, other scripting technologies, and especially existing applications. Following is an example of the incoming and outgoing rules necessary for a site called www.site2.com hosted in a subfolder called /site2. Remember that the outgoing rules can be placed in the /site2 folder instead of in the root of the site.
<rewrite> <rules> <rule name="site2.com in a subfolder" enabled="true" stopProcessing="true"> <match url=".*" /> <conditions logicalGrouping="MatchAll" trackAllCaptures="false"> <add input="{HTTP_HOST}" pattern="^(www\.)?site2\.com$" /> <add input="{PATH_INFO}" pattern="^/site2($|/)" negate="true" /> </conditions> <action type="Rewrite" url="/site2/{R:0}" /> </rule> </rules> <outboundRules> <rule name="Outgoing - URL paths" enabled="true"> <match filterByTags="A" pattern="^(?:site2|(.*//[_a-zA-Z0-9-\.]*)?/site2)(.*)" /> <action type="Rewrite" value="{R:1}{R:2}" /> </rule> <rule name="response_location URL"> <match serverVariable="RESPONSE_LOCATION" pattern="^(?:site2|(.*//[_a-zA-Z0-9-\.]*)?/site2)(.*)" /> <action type="Rewrite" value="{R:1}{R:2}" /> </rule> <rule name="response_location querystring"> <match serverVariable="RESPONSE_LOCATION" pattern="(.*)%2fsite2(.*)" /> <action type="Rewrite" value="{R:1}{R:2}" /> </rule> </outboundRules> </rewrite> If you run into any situations that aren’t caught by these rules, please let me know so I can update this to be as complete as possible. Happy URL Rewriting!

    Read the article

  • JMS Step 6 - How to Set Up an AQ JMS (Advanced Queueing JMS) for SOA Purposes

    - by John-Brown.Evans
This post continues the series of JMS articles which demonstrate how to use JMS queues in a SOA context. The previous posts were:
JMS Step 1 - How to Create a Simple JMS Queue in Weblogic Server 11g
JMS Step 2 - Using the QueueSend.java Sample Program to Send a Message to a JMS Queue
JMS Step 3 - Using the QueueReceive.java Sample Program to Read a Message from a JMS Queue
JMS Step 4 - How to Create an 11g BPEL Process Which Writes a Message Based on an XML Schema to a JMS Queue
JMS Step 5 - How to Create an 11g BPEL Process Which Reads a Message Based on an XML Schema from a JMS Queue
This example leads you through the creation of an Oracle database Advanced Queue and the related WebLogic server objects in order to use AQ JMS in connection with a SOA composite. If you have not already done so, I recommend you look at the previous posts in this series, as they include steps which this example builds upon. The following examples will demonstrate how to write and read from the queue from a SOA process.
1. Recap and Prerequisites
In the previous examples, we created a JMS Queue, a Connection Factory and a Connection Pool in the WebLogic Server Console. Then we wrote and deployed BPEL composites, which enqueued and dequeued a simple XML payload.
AQ JMS allows you to interoperate with database Advanced Queueing via JMS in WebLogic server and therefore take advantage of database features, while maintaining compliance with the JMS architecture. AQ JMS uses the WebLogic JMS Foreign Server framework. A full description of this functionality can be found in the following Oracle documentation: Oracle® Fusion Middleware Configuring and Managing JMS for Oracle WebLogic Server 11g Release 1 (10.3.6), Part Number E13738-06, 7. Interoperating with Oracle AQ JMS, http://docs.oracle.com/cd/E23943_01/web.1111/e13738/aq_jms.htm#CJACBCEJ
For easier reference, this sample will use the same names for the objects as in the above document, except for the name of the database user, as it is possible that this user already exists in your database. We will create the following objects:
Database Objects (Name | Type)
AQJMSUSER | Database User
MyQueueTable | Advanced Queue (AQ) Table
UserQueue | Advanced Queue
WebLogic Server Objects (Object Name | Type | JNDI Name)
aqjmsuserDataSource | Data Source | jdbc/aqjmsuserDataSource
AqJmsModule | JMS System Module |
AqJmsForeignServer | JMS Foreign Server |
AqJmsForeignServerConnectionFactory | JMS Foreign Server Connection Factory | AqJmsForeignServerConnectionFactory
AqJmsForeignDestination | AQ JMS Foreign Destination | queue/USERQUEUE
eis/aqjms/UserQueue | Connection Pool | eis/aqjms/UserQueue
2. Create a Database User and Advanced Queue
The following steps can be executed in the database client of your choice, e.g. JDeveloper or SQL Developer. The examples below use SQL*Plus. Log in to the database as a DBA user, for example SYSTEM or SYS. Create the AQJMSUSER user and grant privileges to enable the user to create AQ objects.
Create Database User and Grant AQ Privileges
sqlplus system/password as SYSDBA
GRANT connect, resource TO aqjmsuser IDENTIFIED BY aqjmsuser;
GRANT aq_user_role TO aqjmsuser;
GRANT execute ON sys.dbms_aqadm TO aqjmsuser;
GRANT execute ON sys.dbms_aq TO aqjmsuser;
GRANT execute ON sys.dbms_aqin TO aqjmsuser;
GRANT execute ON sys.dbms_aqjms TO aqjmsuser;
Create the Queue Table and Advanced Queue and Start the AQ
The following commands are executed as the aqjmsuser database user.
Create the Queue Table
connect aqjmsuser/aqjmsuser;
BEGIN
  dbms_aqadm.create_queue_table (
    queue_table        => 'myQueueTable',
    queue_payload_type => 'sys.aq$_jms_text_message',
    multiple_consumers => false );
END;
/
Create the AQ
BEGIN
  dbms_aqadm.create_queue (
    queue_name  => 'userQueue',
    queue_table => 'myQueueTable' );
END;
/
Start the AQ
BEGIN
  dbms_aqadm.start_queue ( queue_name => 'userQueue' );
END;
/
The above commands can be executed in a single PL/SQL block, but are shown as separate blocks in this example for ease of reference. You can verify the queue by executing the SQL command SELECT object_name, object_type FROM user_objects; which should display the following objects:
OBJECT_NAME                    OBJECT_TYPE
------------------------------ -------------------
SYS_C0056513                   INDEX
SYS_LOB0000170822C00041$$      LOB
SYS_LOB0000170822C00040$$      LOB
SYS_LOB0000170822C00037$$      LOB
AQ$_MYQUEUETABLE_T             INDEX
AQ$_MYQUEUETABLE_I             INDEX
AQ$_MYQUEUETABLE_E             QUEUE
AQ$_MYQUEUETABLE_F             VIEW
AQ$MYQUEUETABLE                VIEW
MYQUEUETABLE                   TABLE
USERQUEUE                      QUEUE
Similarly, you can view the objects in JDeveloper via a Database Connection to the AQJMSUSER.
3. Configure WebLogic Server and Add JMS Objects
All these steps are executed from the WebLogic Server Administration Console. Log in as the webLogic user.
Configure a WebLogic Data Source
The data source is required for the database connection to the AQ created above.
Navigate to domain > Services > Data Sources and press New then Generic Data Source. Use the values:Name: aqjmsuserDataSource JNDI Name: jdbc/aqjmsuserDataSource Database type: Oracle Database Driver: *Oracle’ Driver (Thin XA) for Instance connections; Versions:9.0.1 and later Connection Properties: Enter the connection information to the database containing the AQ created above and enter aqjmsuser for the User Name and Password. Press Test Configuration to verify the connection details and press Next. Target the data source to the soa server. The data source will be displayed in the list. It is a good idea to test the data source at this stage. Click on aqjmsuserDataSource, select Monitoring > Testing > soa_server1 and press Test Data Source. The result is displayed at the top of the page. Configure a JMS System Module The JMS system module is required to host the JMS foreign server for AQ resources. Navigate to Services > Messaging > JMS Modules and select New. Use the values: Name: AqJmsModule (Leave Descriptor File Name and Location in Domain empty.) Target: soa_server1 Click Finish. The other resources will be created in separate steps. The module will be displayed in the list.   Configure a JMS Foreign Server A foreign server is required in order to reference a 3rd-party JMS provider, in this case the database AQ, within a local WebLogic server JNDI tree. Navigate to Services > Messaging > JMS Modules and select (click on) AqJmsModule to configure it. Under Summary of Resources, select New then Foreign Server. Name: AqJmsForeignServer Targets: The foreign server is targeted automatically to soa_server1, based on the JMS module’s target. Press Finish to create the foreign server. The foreign server resource will be listed in the Summary of Resources for the AqJmsModule, but needs additional configuration steps. Click on AqJmsForeignServer and select Configuration > General to complete the configuration: JNDI Initial Context Factory: oracle.jms.AQjmsInitialContextFactory JNDI Connection URL: <empty> JNDI Properties Credential:<empty> Confirm JNDI Properties Credential: <empty> JNDI Properties: datasource=jdbc/aqjmsuserDataSource This is an important property. It is the JNDI name of the data source created above, which points to the AQ schema in the database and must be entered as a name=value pair, as in this example, e.g. datasource=jdbc/aqjmsuserDataSource, including the “datasource=” property name. Default Targeting Enabled: Leave this value checked. Press Save to save the configuration. At this point it is a good idea to verify that the data source was written correctly to the config file. In a terminal window, navigate to $MIDDLEWARE_HOME/user_projects/domains/soa_domain/config/jms  and open the file aqjmsmodule-jms.xml . The foreign server configuration should contain the datasource name-value pair, as follows:   <foreign-server name="AqJmsForeignServer">         <default-targeting-enabled>true</default-targeting-enabled>         <initial-context-factory>oracle.jms.AQjmsInitialContextFactory</initial-context-factory>         <jndi-property>           <key> datasource </key>           <value> jdbc/aqjmsuserDataSource </value>         </jndi-property>   </foreign-server> </weblogic-jms> Configure a JMS Foreign Server Connection Factory When creating the foreign server connection factory, you enter local and remote JNDI names. 
The name of the connection factory itself and the local JNDI name are arbitrary, but the remote JNDI name must match a specific format, depending on the type of queue or topic to be accessed in the database. This is very important and if the incorrect value is used, the connection to the queue will not be established and the error messages you get will not immediately reflect the cause of the error. The formats required (Remote JNDI names for AQ JMS Connection Factories) are described in the section Configure AQ Destinations  of the Oracle® Fusion Middleware Configuring and Managing JMS for Oracle WebLogic Server document mentioned earlier. In this example, the remote JNDI name used is   XAQueueConnectionFactory  because it matches the AQ and data source created earlier, i.e. thin with AQ. Navigate to JMS Modules > AqJmsModule > AqJmsForeignServer > Connection Factories then New.Name: AqJmsForeignServerConnectionFactory Local JNDI Name: AqJmsForeignServerConnectionFactory Note: this local JNDI name is the JNDI name which your client application, e.g. a later BPEL process, will use to access this connection factory. Remote JNDI Name: XAQueueConnectionFactory Press OK to save the configuration. Configure an AQ JMS Foreign Server Destination A foreign server destination maps the JNDI name on the foreign JNDI provider to the respective local JNDI name, allowing the foreign JNDI name to be accessed via the local server. As with the foreign server connection factory, the local JNDI name is arbitrary (but must be unique), but the remote JNDI name must conform to a specific format defined in the section Configure AQ Destinations  of the Oracle® Fusion Middleware Configuring and Managing JMS for Oracle WebLogic Server document mentioned earlier. In our example, the remote JNDI name is Queues/USERQUEUE , because it references a queue (as opposed to a topic) with the name USERQUEUE. We will name the local JNDI name queue/USERQUEUE, which is a little confusing (note the missing “s” in “queue), but conforms better to the JNDI nomenclature in our SOA server and also allows us to differentiate between the local and remote names for demonstration purposes. Navigate to JMS Modules > AqJmsModule > AqJmsForeignServer > Destinations and select New.Name: AqJmsForeignDestination Local JNDI Name: queue/USERQUEUE Remote JNDI Name:Queues/USERQUEUE After saving the foreign destination configuration, this completes the JMS part of the configuration. We still need to configure the JMS adapter in order to be able to access the queue from a BPEL processt. 4. Create a JMS Adapter Connection Pool in Weblogic Server Create the Connection Pool Access to the AQ JMS queue from a BPEL or other SOA process in our example is done via a JMS adapter. To enable this, the JmsAdapter in WebLogic server needs to be configured to have a connection pool which points to the local connection factory JNDI name which was created earlier. Navigate to Deployments > Next and select (click on) the JmsAdapter. Select Configuration > Outbound Connection Pools and New. Check the radio button for oracle.tip.adapter.jms.IJmsConnectionFactory and press Next. JNDI Name: eis/aqjms/UserQueue Press Finish Expand oracle.tip.adapter.jms.IJmsConnectionFactory and click on eis/aqjms/UserQueue to configure it. The ConnectionFactoryLocation must point to the foreign server’s local connection factory name created earlier. In our example, this is AqJmsForeignServerConnectionFactory . 
As a reminder, this connection factory is located under JMS Modules > AqJmsModule > AqJmsForeignServer > Connection Factories and the value needed here is under Local JNDI Name. Enter AqJmsForeignServerConnectionFactory  into the Property Value field for ConnectionFactoryLocation. You must then press Return/Enter then Save for the value to be accepted. If your WebLogic server is running in Development mode, you should see the message that the changes have been activated and the deployment plan successfully updated. If not, then you will manually need to activate the changes in the WebLogic server console.Although the changes have been activated, the JmsAdapter needs to be redeployed in order for the changes to become effective. This should be confirmed by the message Remember to update your deployment to reflect the new plan when you are finished with your changes. Redeploy the JmsAdapter Navigate back to the Deployments screen, either by selecting it in the left-hand navigation tree or by selecting the “Summary of Deployments” link in the breadcrumbs list at the top of the screen. Then select the checkbox next to JmsAdapter and press the Update button. On the Update Application Assistant page, select “Redeploy this application using the following deployment files” and press Finish. After a few seconds you should get the message that the selected deployments were updated. The JMS adapter configuration is complete and it can now be used to access the AQ JMS queue. You can verify that the JNDI name was created correctly, by navigating to Environment > Servers > soa_server1 and View JNDI Tree. Then scroll down in the JNDI Tree Structure to eis and select aqjms. This concludes the sample. In the following post, I will show you how to create a BPEL process which sends a message to this advanced queue via JMS. Best regards John-Brown Evans Oracle Technology Proactive Support Delivery
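As a quick way to verify the new foreign server wiring before building the BPEL composites, a standalone JMS client in the spirit of the QueueSend.java sample from Step 2 can enqueue a test message using the JNDI names configured above. The sketch below is only an illustration and makes a few assumptions: the WebLogic client library (wlfullclient.jar or weblogic.jar) is on the classpath, the SOA managed server listens on t3://localhost:8001, and the class name AqQueueSend and the message text are made up. Only the JNDI names AqJmsForeignServerConnectionFactory and queue/USERQUEUE are taken from the configuration in this post.

import java.util.Hashtable;

import javax.jms.Queue;
import javax.jms.QueueConnection;
import javax.jms.QueueConnectionFactory;
import javax.jms.QueueSender;
import javax.jms.QueueSession;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.Context;
import javax.naming.InitialContext;

public class AqQueueSend {
    public static void main(String[] args) throws Exception {
        // Assumed URL of the SOA managed server; adjust host and port to your environment.
        Hashtable<String, String> env = new Hashtable<String, String>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
        env.put(Context.PROVIDER_URL, "t3://localhost:8001");
        InitialContext ctx = new InitialContext(env);

        // Local JNDI names defined on the AqJmsForeignServer in this post.
        QueueConnectionFactory factory =
                (QueueConnectionFactory) ctx.lookup("AqJmsForeignServerConnectionFactory");
        Queue queue = (Queue) ctx.lookup("queue/USERQUEUE");

        QueueConnection connection = factory.createQueueConnection();
        try {
            // A plain, non-transacted session is enough for a smoke test.
            QueueSession session = connection.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
            QueueSender sender = session.createSender(queue);
            TextMessage message = session.createTextMessage("Test message for USERQUEUE");
            sender.send(message);
            System.out.println("Sent: " + message.getText());
        } finally {
            connection.close();
            ctx.close();
        }
    }
}

If the XA connection factory refuses to create a connection outside a transaction in your environment, a non-XA remote JNDI name (for example QueueConnectionFactory) can be mapped on the foreign server instead, following the same naming rules described in the Configure AQ Destinations section of the documentation referenced above.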

    Read the article

  • Import Data from Excel sheet to DB Table through OAF page

    - by PRajkumar
    1. Create a New Workspace and Project File > New > General > Workspace Configured for Oracle Applications File Name – PrajkumarImportxlsDemo   Automatically a new OA Project will also be created   Project Name -- ImportxlsDemo Default Package -- prajkumar.oracle.apps.fnd.importxlsdemo   2. Add JAR file jxl-2.6.3.jar to Apache Library Download jxl-2.6.3.jar from following link – http://www.findjar.com/jar/net.sourceforge.jexcelapi/jars/jxl-2.6.jar.html   Steps to add jxl.jar file in Local Machine Right Click on ImportxlsDemo > Project Properties > Libraries > Add jar/Directory and browse to directory where jxl-2.6.3.jar has been downloaded and select the JAR file            Steps to add jxl.jar file at EBS middle tier On your EBS middile tier copy jxl.jar at $FND_TOP/java/3rdparty/standalone Add $FND_TOP/java/3rdparty/standalone\jxl.jar to custom classpath in Jser.properties file which is at $IAS_ORACLE_HOME/Apache/Jserv/etc wrapper.classpath=/U01/oracle/dev/devappl/fnd/11.5.0/java/3rdparty/stdalone/jxl.jar Bounce Apache Server   3. Create a New Application Module (AM) Right Click on ImportxlsDemo > New > ADF Business Components > Application Module Name -- ImportxlsAM Package -- prajkumar.oracle.apps.fnd.importxlsdemo.server   Check Application Module Class: ImportxlsAMImpl Generate JavaFile(s)   4. Create Test Table in which we will insert data from excel CREATE TABLE xx_import_excel_data_demo (    -- --------------------      -- Data Columns      -- --------------------      column1                 VARCHAR2(100),      column2                 VARCHAR2(100),      column3                 VARCHAR2(100),      column4                 VARCHAR2(100),      column5                 VARCHAR2(100),      -- --------------------      -- Who Columns      -- --------------------      last_update_date   DATE         NOT NULL,      last_updated_by    NUMBER   NOT NULL,      creation_date         DATE         NOT NULL,      created_by             NUMBER    NOT NULL,      last_update_login  NUMBER );   5. Create a New Entity Object (EO) Right click on ImportxlsDemo > New > ADF Business Components > Entity Object Name – ImportxlsEO Package -- prajkumar.oracle.apps.fnd.importxlsdemo.schema.server Database Objects -- XX_IMPORT_EXCEL_DATA_DEMO   Note – By default ROWID will be the primary key if we will not make any column to be primary key Check the Accessors, Create Method, Validation Method and Remove Method   6. Create a New View Object (VO) Right click on ImportxlsDemo > New > ADF Business Components > View Object Name -- ImportxlsVO Package -- prajkumar.oracle.apps.fnd.importxlsdemo.server   In Step2 in Entity Page select ImportxlsEO and shuttle it to selected list In Step3 in Attributes Window select all columns and shuttle them to selected list   In Java page Uncheck Generate Java file for View Object Class: ImportxlsVOImpl Select Generate Java File for View Row Class: ImportxlsVORowImpl -> Generate Java File -> Accessors   7. Add Your View Object to Root UI Application Module Right click on ImportxlsAM > Edit ImportxlsAM > Data Model > Select ImportxlsVO and shuttle to Data Model list   8. Create a New Page Right click on ImportxlsDemo > New > Web Tier > OA Components > Page Name -- ImportxlsPG Package -- prajkumar.oracle.apps.fnd.importxlsdemo.webui   9. Select the ImportxlsPG and go to the strcuture pane where a default region has been created   10. 
Select region1 and set the following properties:   Attribute Property ID PageLayoutRN AM Definition prajkumar.oracle.apps.fnd.importxlsdemo.server.ImportxlsAM Window Title Import Data From Excel through OAF Page Demo Window Title Import Data From Excel through OAF Page Demo   11. Create messageComponentLayout Region Under Page Layout Region Right click PageLayoutRN > New > Region   Attribute Property ID MainRN Item Style messageComponentLayout   12. Create a New Item messageFileUpload Bean under MainRN Right click on MainRN > New > messageFileUpload Set Following Properties for New Item --   Attribute Property ID MessageFileUpload Item Style messageFileUpload   13. Create a New Item Submit Button Bean under MainRN Right click on MainRN > New > messageLayout Set Following Properties for messageLayout --   Attribute Property ID ButtonLayout   Right Click on ButtonLayout > New > Item   Attribute Property ID Go Item Style submitButton Attribute Set /oracle/apps/fnd/attributesets/Buttons/Go   14. Create Controller for page ImportxlsPG Right Click on PageLayoutRN > Set New Controller Package Name: prajkumar.oracle.apps.fnd.importxlsdemo.webui Class Name: ImportxlsCO   Write Following Code in ImportxlsCO in processFormRequest import oracle.apps.fnd.framework.OAApplicationModule; import oracle.apps.fnd.framework.OAException; import java.io.Serializable; import oracle.apps.fnd.framework.webui.OAControllerImpl; import oracle.apps.fnd.framework.webui.OAPageContext; import oracle.apps.fnd.framework.webui.beans.OAWebBean; import oracle.cabo.ui.data.DataObject; import oracle.jbo.domain.BlobDomain; public void processFormRequest(OAPageContext pageContext, OAWebBean webBean) {  super.processFormRequest(pageContext, webBean);  if (pageContext.getParameter("Go") != null)  {   DataObject fileUploadData = (DataObject)pageContext.getNamedDataObject("MessageFileUpload");   String fileName = null;                 try   {    fileName = (String)fileUploadData.selectValue(null, "UPLOAD_FILE_NAME");   }   catch(NullPointerException ex)   {    throw new OAException("Please Select a File to Upload", OAException.ERROR);   }   BlobDomain uploadedByteStream = (BlobDomain)fileUploadData.selectValue(null, fileName);   try   {    OAApplicationModule oaapplicationmodule = pageContext.getRootApplicationModule();    Serializable aserializable2[] = {uploadedByteStream};    Class aclass2[] = {BlobDomain.class };    oaapplicationmodule.invokeMethod("ReadExcel", aserializable2,aclass2);   }   catch (Exception ex)   {    throw new OAException(ex.toString(), OAException.ERROR);   }  } }     Write Following Code in ImportxlsAMImpl.java import java.io.IOException; import java.io.InputStream; import jxl.Cell; import jxl.CellType; import jxl.Sheet; import jxl.Workbook; import jxl.read.biff.BiffException; import oracle.apps.fnd.framework.server.OAApplicationModuleImpl; import oracle.jbo.Row; import oracle.apps.fnd.framework.OAViewObject; import oracle.apps.fnd.framework.server.OAViewObjectImpl; import oracle.jbo.domain.BlobDomain; public void createRecord(String[] excel_data) {   OAViewObject vo = (OAViewObject)getImportxlsVO1();            if (!vo.isPreparedForExecution())    {   vo.executeQuery();      }                      Row row = vo.createRow();  try  {   for (int i=0; i < excel_data.length; i++)   {    row.setAttribute("Column" +(i+1) ,excel_data[i]);   }  }  catch(Exception e)  {   System.out.println(e.getMessage());   }  vo.insertRow(row);  getTransaction().commit(); }      public void ReadExcel(BlobDomain fileData) throws 
IOException {  String[] excel_data  = new String[5];  InputStream inputWorkbook = fileData.getInputStream();  Workbook w;          try  {   w = Workbook.getWorkbook(inputWorkbook);                       // Get the first sheet   Sheet sheet = w.getSheet(0);                       for (int i = 0; i < sheet.getRows(); i++)   {    for (int j = 0; j < sheet.getColumns(); j++)    {     Cell cell = sheet.getCell(j, i);     CellType type = cell.getType();     if (cell.getType() == CellType.LABEL)     {      System.out.println("I got a label " + cell.getContents());      excel_data[j] = cell.getContents();     }     if (cell.getType() == CellType.NUMBER)     {        System.out.println("I got a number " + cell.getContents());      excel_data[j] = cell.getContents();     }    }    createRecord(excel_data);   }  }              catch (BiffException e)  {   e.printStackTrace();  } }   15. Congratulation you have successfully finished. Run Your page and Test Your Work   Consider Excel PRAJ_TEST.xls with following data --       Lets Try to import this data into DB Table --          
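Before running the page, it can be worth confirming outside of OAF that the jxl library reads your workbook the way the ReadExcel method expects. The following standalone sketch is not part of the original article; the file name PRAJ_TEST.xls and its location in the working directory are assumptions, and the jxl calls (Workbook.getWorkbook, getSheet, getCell, getContents) are the same ones used in ImportxlsAMImpl above.

import java.io.File;

import jxl.Cell;
import jxl.Sheet;
import jxl.Workbook;

public class XlsSmokeTest {
    public static void main(String[] args) throws Exception {
        // Assumed workbook location; point this at the file you plan to upload.
        Workbook workbook = Workbook.getWorkbook(new File("PRAJ_TEST.xls"));
        try {
            // Same traversal as ReadExcel: first sheet, row by row, column by column.
            Sheet sheet = workbook.getSheet(0);
            for (int row = 0; row < sheet.getRows(); row++) {
                StringBuilder line = new StringBuilder();
                for (int col = 0; col < sheet.getColumns(); col++) {
                    Cell cell = sheet.getCell(col, row);
                    line.append(cell.getContents());
                    if (col < sheet.getColumns() - 1) {
                        line.append(" | ");
                    }
                }
                System.out.println(line);
            }
        } finally {
            workbook.close();
        }
    }
}

If the five columns print as expected here, the same values should end up in XX_IMPORT_EXCEL_DATA_DEMO when the upload is run through the OAF page.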

    Read the article

  • Cold Start

    - by antony.reynolds
    Well we had snow drifts 3ft deep on Saturday so it must be spring time.  In preparation for Spring we decided to move the lawn tractor.  Of course after sitting in the garage all winter it refused to start.  I then come into the office and need to start my 11g SOA Suite installation.  I thought about this and decided my tractor might be cranky but at least I can script the startup of my SOA Suite 11g installation. So with this in mind I created 6 scripts.  I created them for Linux but they should translate to Windows without too many problems.  This is left as an exercise to the reader, note you will have to hardcode more than I did in the Linux scripts and create separate script files for the sqlplus and WLST sections. Order to start things I believe there should be order in all things, especially starting the SOA Suite.  So here is my preferred order. Start Database This is need by EM and the rest of SOA Suite so best to start it before the Admin Server and managed servers. Start Node Manager on all machines This is needed if you want the scripts to work across machines. Start Admin Server Once this is done in theory you can manually stat the managed servers using WebLogic console.  But then you have to wait for console to be available.  Scripting it all is quicker and easier way of starting. Start Managed Servers & Clusters Best to start them one per physical machine at a time to avoid undue load on the machines.  Non-clustered install will have just soa_server1 and bam_serv1 by default.  Clusters will have at least SOA and BAM clusters that can be started as a group or individually.  I have provided scripts for standalone servers, but easy to change them to work with clusters. Starting Database I have provided a very primitive script (available here) to start the database, the listener and the DB console.  The section highlighted in red needs to match your database name. #!/bin/sh echo "##############################" echo "# Setting Oracle Environment #" echo "##############################" . oraenv <<-EOF orcl EOF echo "#####################" echo "# Starting Database #" echo "#####################" sqlplus / as sysdba <<-EOF startup exit EOF echo "#####################" echo "# Starting Listener #" echo "#####################" lsnrctl start echo "######################" echo "# Starting dbConsole #" echo "######################" emctl start dbconsole read -p "Hit <enter> to continue" Starting SOA Suite My script for starting the SOA Suite (available here) breaks the task down into five sections. Setting the Environment First set up the environment variables.  The variables highlighted in red probably need changing for your environment. #!/bin/sh echo "###########################" echo "# Setting SOA Environment #" echo "###########################" export MW_HOME=~oracle/Middleware11gPS1 export WL_HOME=$MW_HOME/wlserver_10.3 export ORACLE_HOME=$MW_HOME/Oracle_SOA export DOMAIN_NAME=soa_std_domain export DOMAIN_HOME=$MW_HOME/user_projects/domains/$DOMAIN_NAME Starting the Node Manager I start node manager with a nohup to stop it exiting when the script terminates and I redirect the standard output and standard error to a file in a logs directory. cd $DOMAIN_HOME echo "#########################" echo "# Starting Node Manager #" echo "#########################" nohup $WL_HOME/server/bin/startNodeManager.sh >logs/NodeManager.out 2>&1 & Starting the Admin Server I had problems starting the Admin Server from Node Manager so I decided to start it using the command line script.  
I again use nohup and redirect output. echo "#########################" echo "# Starting Admin Server #" echo "#########################" nohup ./startWebLogic.sh >logs/AdminServer.out 2>&1 & Starting the Managed Servers I then used WLST (WebLogic Scripting Tool) to start the managed servers.  First I waited for the Admin Server to come up by putting a connect command in a loop.  I could have put the WLST commands into a separate script file but I wanted to reduce the number of files I was using and so used redirected input (here syntax). $ORACLE_HOME/common/bin/wlst.sh <<-EOF import time sleep=time.sleep print "#####################################" print "# Waiting for Admin Server to Start #" print "#####################################" while True:   try:     connect(adminServerName="AdminServer")     break   except:     sleep(10) I then start the SOA server and tell WLST to wait until it is started before returning.  If starting a cluster then the start command would be modified accordingly to start the SOA cluster. print "#######################" print "# Starting SOA Server #" print "#######################" start(name="soa_server1", block="true") I then start the BAM server in the same way as the SOA server. print "#######################" print "# Starting BAM Server #" print "#######################" start(name="bam_server1", block="true") EOF Finally I let people know the servers are up and wait for input in case I am running in a separate window, in which case the result would be lost without the read command. echo "#####################" echo "# SOA Suite Started #" echo "#####################" read -p "Hit <enter> to continue" Stopping the SOA Suite My script for shutting down the SOA Suite (available here)  is basically the reverse of my startup script.  After setting the environment I connect to the Admin Server using WLST and shut down the managed servers and the admin server.  Again the script would need modifying for a cluster. Stopping the Servers If I cannot connect to the Admin Server I try to connect to the node manager, in case the Admin Server is down but the managed servers are up. 
#!/bin/sh echo "###########################" echo "# Setting SOA Environment #" echo "###########################" export MW_HOME=~oracle/Middleware11gPS1 export WL_HOME=$MW_HOME/wlserver_10.3 export ORACLE_HOME=$MW_HOME/Oracle_SOA export DOMAIN_NAME=soa_std_domain export DOMAIN_HOME=$MW_HOME/user_projects/domains/$DOMAIN_NAME cd $DOMAIN_HOME $MW_HOME/Oracle_SOA/common/bin/wlst.sh <<-EOF try:   print("#############################")   print("# Connecting to AdminServer #")   print("#############################")   connect(username='weblogic',password='welcome1',url='t3://localhost:7001') except:   print "#########################################"   print "#   Unable to connect to Admin Server   #"   print "# Attempting to connect to Node Manager #"   print "#########################################"   nmConnect(domainName=os.getenv("DOMAIN_NAME")) print "#######################" print "# Stopping BAM Server #" print "#######################" shutdown('bam_server1') print "#######################" print "# Stopping SOA Server #" print "#######################" shutdown('soa_server1') print "#########################" print "# Stopping Admin Server #" print "#########################" shutdown('AdminServer') disconnect() nmDisconnect() EOF Stopping the Node Manager I stopped the node manager by searching for the java node manager process using the ps command and then killing that process. echo "#########################" echo "# Stopping Node Manager #" echo "#########################" kill -9 `ps -ef | grep java | grep NodeManager |  awk '{print $2;}'` echo "#####################" echo "# SOA Suite Stopped #" echo "#####################" read -p "Hit <enter> to continue" Stopping the Database Again my script for shutting down the database is the reverse of my start script.  It is available here.  The only change needed might be to the database name. #!/bin/sh echo "##############################" echo "# Setting Oracle Environment #" echo "##############################" . oraenv <<-EOF orcl EOF echo "######################" echo "# Stopping dbConsole #" echo "######################" emctl stop dbconsole echo "#####################" echo "# Stopping Listener #" echo "#####################" lsnrctl stop echo "#####################" echo "# Stopping Database #" echo "#####################" sqlplus / as sysdba <<-EOF shutdown immediate exit EOF read -p "Hit <enter> to continue" Cleaning Up Cleaning SOA Suite I often run tests and want to clean up all the log files.  The following script (available here) does this for the WebLogic servers in a given domain on a machine.  After setting the domain I just remove all files under the servers logs directories.  It also cleans up the log files I created with my startup scripts.  These scripts could be enhanced to copy off the log files if you needed them but in my test environments I don’t need them and would prefer to reclaim the disk space. 
#!/bin/sh echo "###########################" echo "# Setting SOA Environment #" echo "###########################" export MW_HOME=~oracle/Middleware11gPS1 export WL_HOME=$MW_HOME/wlserver_10.3 export ORACLE_HOME=$MW_HOME/Oracle_SOA export DOMAIN_NAME=soa_std_domain export DOMAIN_HOME=$MW_HOME/user_projects/domains/$DOMAIN_NAME echo "##########################" echo "# Cleaning SOA Log Files #" echo "##########################" cd $DOMAIN_HOME rm -Rf logs/* servers/*/logs/* read -p "Hit <enter> to continue" Cleaning Database I also created a script to clean up the dump files of an Oracle database instance and also the EM log files (available here).  This relies on the machine name being correct as the EM log files are stored in a directory that is based on the hostname and the Oracle SID. #!/bin/sh echo "##############################" echo "# Setting Oracle Environment #" echo "##############################" . oraenv <<-EOF orcl EOF echo "#############################" echo "# Cleaning Oracle Log Files #" echo "#############################" rm -Rf $ORACLE_BASE/admin/$ORACLE_SID/*dump/* rm -Rf $ORACLE_HOME/`hostname`_$ORACLE_SID/sysman/log/* read -p "Hit <enter> to continue" Summary Hope you find the above scripts useful.  They certainly stop me hanging around waiting for things to happen on my test machine and make it easy to run a test, change parameters, bounce the SOA Suite and clean the logs between runs so I can see exactly what is happening. Now I need to get that mower started…
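One last tip: the individual scripts above chain together nicely into a single "bounce" wrapper for test runs. Below is a minimal sketch of such a wrapper; the script directory and file names are placeholders for wherever you saved your own copies of the scripts (they are not names used in the downloads above), and you would want to remove the trailing read prompts from the individual scripts (or feed them a newline) so the chain runs unattended.

#!/bin/sh
# Bounce the SOA Suite and clean the logs between test runs
SCRIPT_DIR=~oracle/scripts             # placeholder - wherever you keep the scripts
$SCRIPT_DIR/stopSoaSuite.sh            # managed servers, admin server, node manager
$SCRIPT_DIR/cleanSoaSuite.sh           # WebLogic server log files
$SCRIPT_DIR/startSoaSuite.sh           # node manager, admin server, SOA and BAM servers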

    Read the article

  • Move Files from a Failing PC with an Ubuntu Live CD

    - by Trevor Bekolay
    You’ve loaded the Ubuntu Live CD to salvage files from a failing system, but where do you store the recovered files? We’ll show you how to store them on external drives, drives on the same PC, a Windows home network, and other locations. We’ve shown you how to recover data like a forensics expert, but you can’t store recovered files back on your failed hard drive! There are lots of ways to transfer the files you access from an Ubuntu Live CD to a place that a stable Windows machine can access them. We’ll go through several methods, starting each section from the Ubuntu desktop – if you don’t yet have an Ubuntu Live CD, follow our guide to creating a bootable USB flash drive, and then our instructions for booting into Ubuntu. If your BIOS doesn’t let you boot using a USB flash drive, don’t worry, we’ve got you covered! Use a Healthy Hard Drive If your computer has more than one hard drive, or your hard drive is healthy and you’re in Ubuntu for non-recovery reasons, then accessing your hard drive is easy as pie, even if the hard drive is formatted for Windows. To access a hard drive, it must first be mounted. To mount a healthy hard drive, you just have to select it from the Places menu at the top-left of the screen. You will have to identify your hard drive by its size. Clicking on the appropriate hard drive mounts it, and opens it in a file browser. You can now move files to this hard drive by drag-and-drop or copy-and-paste, both of which are done the same way they’re done in Windows. Once a hard drive, or other external storage device, is mounted, it will show up in the /media directory. To see a list of currently mounted storage devices, navigate to /media by clicking on File System in a File Browser window, and then double-clicking on the media folder. Right now, our media folder contains links to the hard drive, which Ubuntu has assigned a terribly uninformative label, and the PLoP Boot Manager CD that is currently in the CD-ROM drive. Connect a USB Hard Drive or Flash Drive An external USB hard drive gives you the advantage of portability, and is still large enough to store an entire hard disk dump, if need be. Flash drives are also very quick and easy to connect, though they are limited in how much they can store. When you plug a USB hard drive or flash drive in, Ubuntu should automatically detect it and mount it. It may even open it in a File Browser automatically. Since it’s been mounted, you will also see it show up on the desktop, and in the /media folder. Once it’s been mounted, you can access it and store files on it like you would any other folder in Ubuntu. If, for whatever reason, it doesn’t mount automatically, click on Places in the top-left of your screen and select your USB device. If it does not show up in the Places list, then you may need to format your USB drive. To properly remove the USB drive when you’re done moving files, right click on the desktop icon or the folder in /media and select Safely Remove Drive. If you’re not given that option, then Eject or Unmount will effectively do the same thing. Connect to a Windows PC on your Local Network If you have another PC or a laptop connected through the same router (wired or wireless) then you can transfer files over the network relatively quickly. To do this, we will share one or more folders from the machine booted up with the Ubuntu Live CD over the network, letting our Windows PC grab the files contained in that folder. As an example, we’re going to share a folder on the desktop called ToShare. 
Right-click on the folder you want to share, and click Sharing Options. A Folder Sharing window will pop up. Check the box labeled Share this folder. A window will pop up about the sharing service. Click the Install service button. Some files will be downloaded, and then installed. When they’re done installing, you’ll be appropriately notified. You will be prompted to restart your session. Don’t worry, this won’t actually log you out, so go ahead and press the Restart session button. The Folder Sharing window returns, with Share this folder now checked. Edit the Share name if you’d like, and add checkmarks in the two checkboxes below the text fields. Click Create Share. Nautilus will ask your permission to add some permissions to the folder you want to share. Allow it to Add the permissions automatically. The folder is now shared, as evidenced by the new arrows above the folder’s icon. At this point, you are done with the Ubuntu machine. Head to your Windows PC, and open up Windows Explorer. Click on Network in the list on the left, and you should see a machine called UBUNTU in the right pane. Note: This example is shown in Windows 7; the same steps should work for Windows XP and Vista, but we have not tested them. Double-click on UBUNTU, and you will see the folder you shared earlier! As well as any other folders you’ve shared from Ubuntu. Double click on the folder you want to access, and from there, you can move the files from the machine booted with Ubuntu to your Windows PC. Upload to an Online Service There are many services online that will allow you to upload files, either temporarily or permanently. As long as you aren’t transferring an entire hard drive, these services should allow you to transfer your important files from the Ubuntu environment to any other machine with Internet access. We recommend compressing the files that you want to move, both to save a little bit of bandwidth, and to save time clicking on files, as uploading a single file will be much less work than a ton of little files. To compress one or more files or folders, select them, and then right-click on one of the members of the group. Click Compress…. Give the compressed file a suitable name, and then select a compression format. We’re using .zip because we can open it anywhere, and the compression rate is acceptable. Click Create and the compressed file will show up in the location selected in the Compress window. Dropbox If you have a Dropbox account, then you can easily upload files from the Ubuntu environment to Dropbox. There is no explicit limit on the size of file that can be uploaded to Dropbox, though a free account begins with a total limit of 2 GB of files in total. Access your account through Firefox, which can be opened by clicking on the Firefox logo to the right of the System menu at the top of the screen. Once into your account, press the Upload button on top of the main file list. Because Flash is not installed in the Live CD environment, you will have to switch to the basic uploader. Click Browse…find your compressed file, and then click Upload file. Depending on the size of the file, this could take some time. However, once the file has been uploaded, it should show up on any computer connected through Dropbox in a matter of minutes. Google Docs Google Docs allows the upload of any type of file – making it an ideal place to upload files that we want to access from another computer. While your total allocation of space varies (mine is around 7.5 GB), there is a per-file maximum of 1 GB. 
Log into Google Docs, and click on the Upload button at the top left of the page. Click Select files to upload and select your compressed file. For safety’s sake, uncheck the checkbox concerning converting files to Google Docs format, and then click Start upload. Go Online – Through FTP If you have access to an FTP server – perhaps through your web hosting company, or you’ve set up an FTP server on a different machine – you can easily access the FTP server in Ubuntu and transfer files. Just make sure you don’t go over your quota if you have one. You will need to know the address of the FTP server, as well as the login information. Click on Places > Connect to Server… Choose the FTP (with login) Service type, and fill in your information. Adding a bookmark is optional, but recommended. You will be asked for your password. You can choose to remember it until you logout, or indefinitely. You can now browse your FTP server just like any other folder. Drop files into the FTP server and you can retrieve them from any computer with an Internet connection and an FTP client. Conclusion While at first the Ubuntu Live CD environment may seem claustrophobic, it has a wealth of options for connecting to peripheral devices, local computers, and machines on the Internet – and this article has only scratched the surface. Whatever the storage medium, Ubuntu’s got an interface for it!
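As a footnote, everything above can also be driven from a terminal in the Live CD environment if you prefer the command line. The following is only a rough sketch and makes a few assumptions: the recovered files sit in ~/recovered, the USB drive has already been mounted by Ubuntu under /media with the label MY_USB_DRIVE, and the target machine accepts SSH logins (the ssh client is normally present on the Live CD).

# Bundle the recovered files straight onto the mounted USB drive
# (writing directly to the drive avoids filling the Live CD's RAM-backed filesystem)
tar czf "/media/MY_USB_DRIVE/recovered-files.tar.gz" -C ~ recovered

# Or stream the archive to another machine over SSH without storing it locally
tar czf - -C ~ recovered | ssh user@othermachine "cat > recovered-files.tar.gz"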

    Read the article

  • SOA Community Newsletter: new newsletter!

    - by mseika
    SOA PARTNER COMMUNITY NEWSLETTERAUGUST 2012 Dear SOA partner community member Have you submitted your feedback on SOA Partner Community Survey 2012? This is the last chance to participate in the survey. We recommend you to complete the survey and help us to improve our SOA Community. Thanks to all attendees and trainers for their participation in the excellent Fusion Middleware Summer Camps held in Lisbon and Munich. I would also like to thank you for the great feedback and the nice reports provided by AMIS Technology Blog & Middleware by Link Consulting. Most of our courses have been overbooked, if you did not get a chance or missed it, we offer a wide range of online training and the course material. Key take-away from the advanced BPM course is to become an expert in ADF. Here is the course from Grant Ronald Learn Advanced ADF online available. The Link Consulting Team became experts in SOA Governance with EAMS and Oracle Enterprise Repository! We always encourage our community members to share their best practices and are very keen to publish it. Please let us know if you want to share your best practices through this medium.We encourage you to make use of the Specialization benefits - this month we are giving an opportunity to Promote Your SOA & BPM Events. Jürgen KressOracle SOA & BPM Partner Adoption EMEA NEW CONTENT Presentations & Training material OFM Summer CampsPromote Your SOA & BPM Events Advanced ADF Online, For Free By Grant BPM 11g Customer Stories & Solution Catalog & Process Accelerators Delivering SOA Governance with EAMS by Link Consulting Team WebLogic Server Provisioning and Patching News from our Partners & CommunityUpdated material by Oracle Connect and Network SOA Blogs SOA on Facebook SOA on LinkedIn SOA on Twitter Mix SOA Forum SOA Workspace PRESENTATIONS & TRAINING MATERIAL OFM SUMMER CAMPS Thanks to all attendees who invested their time and utilized the opportunity to attend the Summer Camps! Due to high demand of our most of the trainings, we had a long waiting list with more numbers of partners who are keen to attend it. We would like to give our special thanks to all trainers, who delivered excellent workshops! Most of the presentations and course material have been posted on our SOA Community Workspaceand WebLogic Community Workspace. You can access the content only if you are a registered community member. To register for the SOA Community please click here. You can register for the WebLogic Community here. To find out the first impressions of the event please visit our Facebook pages:www.facebook.com/WebLogicCommunity &www.facebook.com/soacommunity or Picasa AlbumThanks for the excellent blog posts from AMIS Technology Blog & Middleware by Link Consulting. Let us know if you published a twitter blog on@soacommunity & @wlscommunity. We will be pleased to publish it in our Newsletters. BPM Course Quotes “Its always easy, if you know, what you are doing” - Torsten Winterberg, Opitz“ The best ideas are the ideas from the best” - Filipe Sequeria, Primesoft “Best invest in the education in the last 12 months” - Richard Schaller, IPT “Practice best practice with the best instructor” - Graham Lamond Capgemini “If you have basic BPM knowledge, this is the course to really mater it” - Diogo Henriques Link Consulting “Very good trainers lot of work. 
Lot of fun as well” - Matthias Gris Workflow Factory “If you like to accelerate in Oracle come to the training to bring it all together” - Marcel van der Glind, Amis ADF Course Quotes "Excellent training, great opportunity to network!" - Frank Houweling, Amis "Lots of fun and good ideas" - Ana Santiago, GFI "Learn ADF, worth it Fusion Apps is the future" - Miguel Delgadillo, STO Consulting "The best way to learn Fusion Middleware from the #1" Alexandro Montantes, STO Consulting "Be advanced to to be the first” - Dimitar Petrov Fadata "Great opportunity to suck all the knowledge out of some very experienced product managers” - Wilfred von der Deijl, The Future Group WebLogic Course Quotes “Oracle trainings are the best” - Pedro Neto Novobas“ "Excellent training, well organized” - Pedro Antunh, Capgemini “This course dives you into Oracle WebLogic giving you a quick start on benefiting from Fusion Apps” - Leonardo Fernandes, Outsystems Additional Quotes “Thanks a lot again for organizing such a great and informative Summer Camp. Both training and networking were organized very professionally. I have gained tons of very useful Info, which will definitely help to increase quality of our future projects.” - Daniel Fasko fss-group.com I didn’t get the chance yesterday to thank you for a most enjoyable and thoroughly educational time I had in Munich over the last few days.” - Jeroen Bakker Ordina “Just to congratulate you on a great event, not only today but also in the previous days of training. As we know, a very good organization and, as a native Portuguese that knows Lisbon very good, a nice choice of places to visit. Looking forward to come again next year.” Pedro Miguel Neto, Novobase PROMOTE YOUR SOA & BPM EVENTS The Partner Event Publisher has just been made available to all SOA & BPM specialized partners in EMEA. Partners now have the opportunity to publish their events to theOracle.com/events site and spread the word on their upcoming live in-person and/or live webcast events. See the demo below and click here to read more information. ADVANCED ADF ONLINE, FOR FREE BY GRANT The second part of the advanced ADF online eCourse is Live now! This covers the advanced topics of region and region interaction as well as getting down and dirty with some of the layout features of ADF Faces, skinning and DVT components. The aim of this course is to give you a self-paced learning aid which covers the more advanced topics of ADF development. The content is developed by Product Management and our Curriculum development teams and is based on advanced training material we have been running internally for about 18 months. We will get started on the next chapter, but in the meantime, please have a look at chapters one and two. Back to top BPM 11G CUSTOMER STORIES & SOLUTION CATALOG & PROCESS ACCELERATORS Stories Everyone loves a good story on planning or implementing a BPM strategy. Everyone wants to hear how it was done before?, what worked?, what was achieved? If you have achieved success with BPM, we are very keen to hear your stories and examples of how your customers use it. We receive lots of requests from people who are thinking of using BPM to solve a specific problem or in combination with a specific technology to talk to someone who has done it before. These stories are invaluable. Drop down the details of anything you think is relevant with a bit of detail and we will follow up on it. 
As one good deed deserves another, we will do our best to give you stories if you need them to show that where you are going, others have treaded before. Send your stories to us using this e-mail link and we will share them among other like minded people. Solution Catalogue This summer, Oracle is launching a solution catalogue specifically intended for partners. If you have delivered a successful implementation in BPM and think it could be reused and applied again in a similar scenario in the same industry or in a similar environment, then we ware keen to know about it and will add it to the solution catalogue. The solution catalogue will showcase successful BPM solutions both inside and outside Oracle. Be in touch with us on this e-mail link and we will make sure to add your solution. Process AcceleratorsFinally if you have specific processes that you are expert on, you have implemented at a customer and you want to work with us on getting these productised, then we would love to know about it. The process accelerator programme is explained in the most recent SOA/BPM Community Newsletter but again feel free to contact us if you want to get involved. Good luck with BPM and let us know how we can help. Barry O'Reilly Director BPM [email protected] DELIVERING SOA GOVERNANCE WITH EAMS BY LINK CONSULTING TEAM In the last 12 years Link Consulting has been making its presence in specific areas such as Governance and Architecture, both in terms of practices and methodologies, products, know-how and technological expertise. The Enterprise Architecture Management System - Oracle Enterprise Edition (EAMS - OER Edition) is the result of this experience and combines the architecture management solution with OER in order to deliver a product specialized for SOA Governance that gathers the better of two worlds in solution that enables SOA Governance projects, initiatives and programs. Enterprise Architecture Management System Enterprise Architecture Management System (EAMS), is an automation based solution that enables the efficient management of Enterprise Architectures. The solution uses configured enterprise repositories and takes advantages of its features to provide automation capabilities to the users. EAMS provides capabilities to create/customize/analyze repository data, architectural blueprints, reports and analytic charts. Oracle Enterprise Repository Oracle Enterprise Repository (OER) is one of the major and central elements of the Oracle SOA Governance solution. Oracle Enterprise Repository provides the tools to manage and govern the metadata for any type of software asset, from business processes and services to patterns, frameworks, applications, components, and models. OER maps the relationships and inter-dependencies that connect those assets to improve impact analysis, promote and optimize their reuse, and measure their impact on the bottom line. It provides the visibility, feedback, controls, and analytics to keep your SOA on track to deliver business value. The intense focus on automation helps to overcome barriers to SOA adoption and streamline governance throughout the lifecycle. Core capabilities of the OER include: Asset Management Asset Lifecycle Management Usage Tracking Service Discovery Version Management Dependency Analysis Portfolio Management EAMS - OER Edition The solution takes the advantages and features from both products and combines them in a symbiotic tool that enhances the quality of SOA Governance Initiatives and Programs. 
EAMS is able to produce a vast number of outputs by combining its analytical engine, SOA-specific configurations and the assets in OER and other related tools, catalogs and repositories. The configurations encompass not only the extendable parametrization of the metadata but also fully configurable blueprints, PowerPoint reports, charts and queries. The SOA blueprints The solution comes with a set of predefined architectural representations that help the organization better perceive their SOA landscape. More blueprints can be easily created in order to accommodate the organizations needs in terms of detail, audience and metadata. Charts & Dashboards The solution encompasses a set of predefined charts and dashboards that promote a more agile way to control and explore the assets. Time Based Visualization All representations are time bound, and with EAMS - OER you can truly govern SOA with a complete view of the Past, Present and Future; The solution delivers Gap Analysis, a project oriented approach while taking into consideration the As-Was, As-Is an To-Be. Time based visualization differentiating factors: Extensive automation and maintenance of architectural representations Organization wide solution. Easy access and navigation to and between all architectural artifacts and representations. Flexible meta-model, customization and extensibility capabilities. Lifecycle management and enforcement of the time dimension over all the repository content. Profile based customization. Comprehensive visibility Architectural alignment Friendly and striking user interfaces For more information on EAMS visit us here. For more information on SOA visit us here. WEBLOGIC SERVER PROVISIONING AND PATCHING For access to the Oracle demo systems please visit OPN and talk to your Partner Expert.SOA Suite and BPM Suite runs on WebLogic! We are pleased to announce the availability of a WebLogic Server Management demo that showcases some of the key provisioning and patching capabilities of WebLogic Server Management Pack Enterprise Edition (EE). To learn more about these features - as well as other features of the pack - please visit the pack's saleskit page.Demo Highlights The demo showcases the following capabilities: Patching Oracle WebLogic Servers Standardizing WebLogic Server Patch Rollouts Creating a WebLogic Domain Provisioning Profile Cloning a WebLogic Domain from a Provisioning Profile Deploying a Java EE Application Scaling Out an Oracle WebLogic Cluster Demo Instructions Go to the DSS website for Oracle Partners. On the Standard Demo Launchpad page, under the “Software Lifecycle Automation” section, click on the link “EM Cloud Control 12c WLS Provisioning and Patching” (tagged as “NEW”). Specific demo launchpad page contains a link to the detailed demo script with instructions on how to show the demo.

    Read the article

  • From Bluehost to WP Engine, My WordPress Story

    - by thatjeffsmith
    This is probably the longest blog post I’ve written in a LONG time. And if you’re used to coming here for the Oracle stuff, this post is not about that. It’s about my blog, and the stuff under the hood that makes it run, AKA WordPress. If you want to skip to the juicy stuff, then use these shortcuts: My Site Slowed Down How I Moved to WP Engine How WP Engine ‘Hooked’ Me Why WP Engine? I started thatJeffSmith.com on May 28th, 2010. I had been already been blogging for several years, but a couple of really smart people I respected (Andy, Brent – thanks again!) suggested that I take ownership of my content and begin building my personal brand. I thought that was a good idea, and so I signed up for service with bluehost. Bluehost makes setting up a WordPress site very, very easy. And, they continued to be easy to work with for the past 2 years. I would even recommend them to anyone looking to host their own WordPress install/site. For $83.40, I purchased a year’s worth of service and my domain name registration – a very good value. And then last year I paid $107.40 for another year’s services. And when that year expired I paid another $190.80 for an additional two year’s service in advance. I had been up to that point, getting my money’s worth. And then, just a few weeks ago… My Site Slowed to a Crawl That spike was from an April Fool's Day Post, I think Why? Well, when I first started blogging, I had the same problem that most beginner bloggers have – not many readers. In my first year of blogging, I think the highest number of readers on a single day was about 125. I remember that day as I was very excited to break 100! Bluehost was very reliable, serving up my content with maybe a total of 3-4 outages in the past 2 years. Support was usually very prompt with answers and solutions, and I love their ‘Chat now’ technology – much nicer than message boards only or pay-to-talk phone support. In the past 6 months however, I noticed a couple of things: daily traffic was increasing – woohoo! my service was experiencing severe CPU throttling – doh! To be honest, I wasn’t aware the throttling was occuring, but I did know that the response time of my blog was starting to lag. Average load times were approaching 20-30 seconds. Not good when good sites are loading in 5 seconds or less. And just this past week, in getting ready to launch a new website for work that sucked in an RSS feed from my blog, the new page was left waiting for more than a minute. Not good! In fact my boss asked, why aren’t you blogging on Blogger? Ugh. I tried a few things to fix the problem: I paid for a premium WordPress theme – Themify’s Grido (thanks to @SQLRockstar for the heads-up) I installed a couple of WP caching plugins I read every WP optimization blog post I could get my greedy little eyes on However, at the same time I was also getting addicted to WordPress bloggers talking about all the cool things you could do with your blog. As a result I had at one point about 30 different plugins installed. WordPress runs on MySQL, and certain queries running via these plugins were starving for CPU. Plugins that would be called every page load meant that as more people clicked on my site, the more CPU I needed. I’m not stupid, so I eventually figured out that maybe less plugins was better, and was able to go down to just 20. But still, the site was running like a dog. CPU Throttling, makes MySQL wait to run a query Bluehost runs shared servers. Your site runs on the same box that several hundred (or thousand?) 
other services are running on. If you take more CPU than they think you should have, they will limit your service by making you stand in line for CPU, AKA ‘throttling.’ This is not bad. This business model allows them to serve many, many users for a very fair price. It works great until, well, until it doesn’t. I noticed in the last week that for every minute of service, I was being throttled between 60 and 300 seconds. If there were 5 MySQL processes running, then every single one of them were being held in check. The blog visitor notice this as their page requests would take a minute or more to be answered. Bluehost unfortunately doesn’t offer dedicated server hosting, so there was no real upgrade path for me follow and remain one of their customers. So what was I to do? Uninstall every plugin and hope the site sped up? Ask for people to take turns on my blog? I decided to spend my way out of the problem. I signed up for service with WP Engine and moved ThatJeffSmith.com The first 2 months are free, and after that it’s about $29/month to run my site on their system. My math tells me that’s a good bit more expensive than what Bluehost was charging me – to the tune of about 300% more a month. Oh, and I should just say that my blog is a personal blog even though I talk about work stuff here. I don’t get paid for blogging, I don’t sell ads, and I don’t expense the service fees – this is my personal passion. So is it worth it? In the first 4 days, it seems to be totally worth it. Load times have gone from 20-30 seconds to less than 5 seconds. A few folks have told me via Twitter that they notice faster page loads. I anticipate this will indirectly lead to more traffic as Google penalizes you in search results if your site is too slow, and of course some folks won’t even bother waiting more than 5-10 seconds. I noticed right away that writing posts, uploading pictures, and just using the WordPress dashboard in general was much more responsive. So writing is less of a chore now, which means I won’t have a good reason not to write How I Moved to WP Engine I signed up for the service and registered my domain. I then took a full export of my ‘old’ site by doing a FTP GET of all my files, then did a MySQL database backup, exported my WordPress Theme settings to a .zip file, and then finally used the WordPress ‘Export’ feature. I then used the WordPress ‘Import’ on the new site to load up my posts. Then I uploaded the theme .zip package from Themify. Then I FTP’d the ‘wp-content’ directory up to my new server using SFTP (WP Engine only supports secure FTP – good on them!) Using a temporary URL to see my new site, I was able to confirm that everything looked mostly OK – I’ll detail the challenges and issues of fixing the content next – but then it was time to ‘flip the switch.’ I updated the IP address that the DNS lookup tables use to route traffic to my new server. In a matter of minutes the DNS servers around the world were updated and it was time to see the new site! But It Was ‘Broken’ I had never moved a website before, and in my rush to update the DNS, I had changed the records without really finding out what I was supposed to do first. After re-reading the directions provided by WP Engine and following the guidance of their support engineer, I realized I had needed to set the CNAME (Alias) ‘www’ record to point to a different URL than the ‘www.thatjeffsmith.com’ entry I had set. Once corrected the site was up and running in less than a minute. 
Then It Was Only Mostly Broken Many of my plugins weren’t working. Apparently just ftp’ing the wp-content directory up wasn’t the proper way to re-install the plugin. I suspect file permissions or file ownership wasn’t proper. Some plug-ins were working, many had their settings wiped to the defaults, and a few just didn’t work again. I had to delete the directory of the plug-in manually via SFTP, and then use the WP Dashboard to install it from scratch. And here was my first ‘lesson’ – don’t switch the DNS records until you’ve completely tested your new site. I wasn’t able to navigate the old WP console to review my plug-in settings. Thankfully I was able to use the Wayback Machine to reverse engineer some things, and of course most plug-ins aren’t that complicated to setup to begin with. An example of one that I had to redo from scratch is the ‘Twitter @Anywhere Plus’ plugin that I use to create the form that allows folks to tweet a post they enjoyed at the end of each story. How WP Engine ‘Hooked’ Me I actually signed up with another provider first. They ranked highly in Google searches and a few Tweeps recommended them to me. But hours after signing up and I still didn’t have sever reyady, I was ready to give up on them. They offered no chat or phone support – only mail and message boards. And the message boards were rife with posts about how the service had gone downhill in the past 6 months. To their credit, they did make it easy to cancel, although I did have to do so via email as their website ‘cancel’ button was non-existent. Within minutes of activating my WP Engine account I had received my welcome message and directions on how to get started. I was able to see my staged website right away. They also did something very cool before I even got started – they looked at my existing site and told me by how much they could improve its performance. The proof is in the web pudding. I like this for a few reasons, but primarily I liked their business model. It told me they knew what they were doing, and that they were willing to put their money where their mouth was. This was further evident by their 60-day money back guarantee. And if I understand it correctly, they don’t even take your money until after that 60 day period is over. After a day, I was welcomed by the WP Engine social media team, and was given the opportunity to subscribe to their newsletter and follow their account on Twitter. I noticed their Twitter team is sure to post regular WordPress tips several times a day. It’s not just an account that’s setup for the sake of having a Twitter presence. These little things add up and give me confidence in my decision to choose them as my hosting partner. ‘Partner’ – that’s a lot nicer word than just ‘service provider,’ isn’t it? Oh, and they offered me a t-shirt. Don’t ever doubt the power of a ‘free’ t-shirt! How awesome is this e-mail, from a customer perspective? I wasn’t really expecting any of this. Exceeding expectations before I have even handed over a single dollar seems like a pretty good business plan. This is how you treat customers. Love them to death, and they reward you with loyalty. But Jeff, You Skipped a Piece Here, Why WP Engine? I found them on one of those ‘Top 10′ list posts, and pulled up their webpage. I noticed they offered a specialized service – they host WordPress installs, and that’s it. Their servers are tuned specifically for running WordPress. They had in bolded text, things like ‘INSANELY FAST. 
INFINITELY SCALABLE.’ and ‘LIGHTNING SPEED.’ And then they offered insurance against hackers and they took care of automatic backups and restores. The only drawbacks I have noticed so far relate to plugins I used that have been ‘blacklisted.’ In order to guarantee that ‘lightning’ speed, they have banned the use of the CPU-suckiest plugins. One of those is the ‘Related Posts’ plugin. So if you are a subscriber and are reading this in your email, you’ll notice there’s no links back to my blog to continue reading other related stories. Since that referral traffic is very small single-digit for my site, I decided that I’m OK with that. I’d rather have the warp-speed page loads. Again, I think that will lead to higher traffic down the road. In 50+ days I will need to decide if WP Engine is a permanent solution. I’ll be sure to update this post when that time comes and let y’all know how it turns out.
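As an aside, the manual export steps described earlier (pull the files down over FTP, dump the MySQL database, then push wp-content to the new host over SFTP) boil down to a handful of commands if you have shell access on the old host. This is only a rough sketch; the user names, host names, database name and paths are placeholders rather than my actual Bluehost or WP Engine details.

# 1. Dump the WordPress database (the credentials are in wp-config.php)
mysqldump -u dbuser -p wordpress_db > blog-backup.sql

# 2. Archive the WordPress files, wp-content included (themes, plugins, uploads)
tar czf blog-files.tar.gz -C ~/public_html .

# 3. Push wp-content up to the new host over secure FTP (WP Engine only supports SFTP);
#    inside the sftp session:  put -r wp-content
sftp user@newhost.example.com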

    Read the article

  • CodePlex Daily Summary for Wednesday, June 15, 2011

    CodePlex Daily Summary for Wednesday, June 15, 2011Popular ReleasesTerraria World Viewer: Version 1.2: Update June 15thNew User Interface Map drawing will not cause the program to freeze anymore Fixed the "Draw Symbols" (now called "Markers") checkbox not having any effectMVC Controls Toolkit: Mvc Controls Toolkit 1.1.5 RC: Added Extended Dropdown allows a prompt item to be inserted as first element. RequiredAttribute, if present, trggers if no element is chosen Client side javascript function to set/get the values of DateTimeInput, TypedTextBox, TypedEditDisplay, and to bind/unbind a "change" handler The selected page in the pager is applied the attribute selected-page="selected" that can be used in the definition of CSS rules to style the selected page items controls now interpret a null value as an empr...Umbraco CMS: Umbraco CMS 5.0 CTP 1: Umbraco 5 Community Technology Preview Umbraco 5 will be the next version of everyone's favourite, friendly ASP.NET CMS that already powers over 100,000 websites worldwide. Try out our first CTP of version 5 today! If you're new to Umbraco and would like to get a quick low-down on our popular and easy-to-learn approach to content management, check out our intro video here. What's in the v5 CTP box? This is a preview version of version 5 and includes support for the following familiar Umbr...Ribbon Browser for Microsoft Dynamics CRM 2011: Ribbon Browser (1.0.514.30): Initial releaseTerrariViewer: TerrariViewer v3.0 [Terraria Inventory Editor]: In this version, I did an overhaul of the GUI of the program. The only pop-up window you will receive now is a warning box for for when you click on the "Delete" button. Everything has been integrated into the tabs on the form. I added every item included with v1.0.4 of Terraria and added the option to set inventory/bank slots to "No Item". This WILL work with characters that have not been opened in v1.0.4patterns & practices: Project Silk: Project Silk Community Drop 11 - June 14, 2011: Changes from previous drop: Many code changes: please see the readme.mht for details. New "Client Data Management and Caching" chapter. Updated "Application Notifications" chapter. Updated "Architecture" chapter. Updated "jQuery UI Widget" chapter. Updated "Widget QuickStart" appendix and code. Guidance Chapters Ready for Review The Word documents for the chapters are included with the source code in addition to the CHM to help you provide feedback. The PDF is provided as a separat...Orchard Project: Orchard 1.2: Build: 1.2.41 Published: 6/14/2010 How to Install Orchard To install Orchard using Web PI, follow these instructions: http://www.orchardproject.net/docs/Installing-Orchard.ashx. Web PI will detect your hardware environment and install the application. Alternatively, to install the release manually, download the Orchard.Web.1.2.41.zip file. http://orchardproject.net/docs/Manually-installing-Orchard-zip-file.ashx The zip contents are pre-built and ready-to-run. 
Simply extract the contents o...PowerGUI Visual Studio Extension: PowerGUI VSX 1.3.4: Changes - Got rid of suppressed exceptions on assemblies loading at project startup - Fixed Issue #28535 "No Print Support" - Enabled IntelliSence commands wich are supported by ActiPro Syntax Editor control: ToggleBookmark, NextBookmark, PreviousBookmark, ShowMemberList - Added missing Import directives in PS Script project template - Fixed exception occurring on debug start - Fixed an issue: after creating a new PS project, a debugging session hung being run for the second timeSnippet Designer: Snippet Designer 1.4.0: Snippet Designer 1.4.0 for Visual Studio 2010 Change logSnippet Explorer ChangesReworked language filter UI to work better in the side bar. Added result count drop down which lets you choose how many results to see. Language filter and result count choices are persisted after Visual Studio is closed. Added file name to search criteria. Search is now case insensitive. Snippet Editor Changes Snippet Editor ChangesAdded menu option for the $end$ symbol which indicates where the c...SizeOnDisk: 1.0.9.0: Can handle Right-To-Left languages (issue 316) About box (issue 310) New language: Deutsch (thanks to kyoka) Fix: file and folder context menuDropBox Linker: DropBox Linker 1.1: Added different popup descriptions for actions (copy/append/update/remove) Added popup timeout control (with live preview) Added option to overwrite clipboard with the last link only Notification popup closes on user click Notification popup default timeout increased to 3 sec. Added codeplex link to about .NET Framework 4.0 Client Profile requiredWCF Community Site: WCF Express Interop Bindings 1.0: Welcome to the first release of the WCF Express Interop BindingsThis project provides a starter kit for WCF service developers wishing to connect with Java clients in WebSphere, WebLogic, Metro and Apache. It supports security, MTOM and RM features. For more information see the Landing page We welcome your feedback (Topic: Interop Bindings). Please submit any feature requests / bug fixes via the issue tracker. FeaturesVSIX Installer WCF Bindings for Oracle WebLogic, Oracle Metro, IBM WebS...Mobile Device Detection and Redirection: 1.0.4.1: Stable Release 51 Degrees.mobi Foundation is the best way to detect and redirect mobile devices and their capabilities on ASP.NET and is being used on thousands of websites worldwide. We’re highly confident in our software and we recommend all users update to this version. Changes to Version 1.0.4.1Changed the BlackberryHandler and BlackberryVersion6Handler to have equal CONFIDENCE values to ensure they both get a chance at detecting BlackBerry version 4&5 and version 6 devices. Prior to thi...Kouak - HTTP File Share Server: Kouak Beta 3 - Clean: Some critical bug solved and dependecy problems There's 3 package : - The first, contains the cli server and the graphical server. - The second, only the cli server - The third, only the graphical client. It's a beta release, so don't hesitate to emmit issue ;pRawr: Rawr 4.1.06: This is the Downloadable WPF version of Rawr!For web-based version see http://elitistjerks.com/rawr.php You can find the version notes at: http://rawr.codeplex.com/wikipage?title=VersionNotes Rawr AddonWe now have a Rawr Official Addon for in-game exporting and importing of character data hosted on Curse. 
The Addon does not perform calculations like Rawr, it simply shows your exported Rawr data in wow tooltips and lets you export your character to Rawr (including bag and bank items) like Char...AcDown????? - Anime&Comic Downloader: AcDown????? v3.0 Beta6: ??AcDown?????????????,?????????????,????、????。?????Acfun????? ????32??64? Windows XP/Vista/7 ????????????? ??:????????Windows XP???,?????????.NET Framework 2.0???(x86)?.NET Framework 2.0???(x64),?????"?????????"??? ??v3.0 Beta6 ?????(imanhua.com)????? ???? ?? ??"????","?????","?????","????"?????? "????"?????"????????"?? ??????????? ?????????????? ?????????????/???? ?? ????Windows 7???????????? ????????? ?? ????????????? ???????/??????????? ???????????? ?? ?? ?????(imanh...Pulse: Pulse Beta 2: - Added new wallpapers provider http://wallbase.cc. Supports english search, multiple keywords* - Improved font rendering in Options window - Added "Set wallpaper as logon background" option* - Fixed crashes if there is no internet connection - Fixed: Rewalls downloads empty images sometimes - Added filters* Note 1: wallbase provider supports only english search. Rewalls provider supports only russian search but Pulse automatically translates your english keyword into russian using Google Tr...WPF Application Framework (WAF): WPF Application Framework (WAF) 2.0.0.7: Version: 2.0.0.7 (Milestone 7): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements .NET Framework 4.0 (The package contains a solution file for Visual Studio 2010) The unit test projects require Visual Studio 2010 Professional Remark The sample applications are using Microsoft’s IoC container MEF. However, the WPF Application Framework (WAF) doesn’t force you to use the same IoC container in your application. You can use ...SimplePlanner: v2.0b: For 2011-2012 Sem 1 ???2011-2012 ????Visual Studio 2010 Help Downloader: 1.0.0.3: Domain name support for proxy Cleanup old packages bug Writing to EventLog with UAC enabled bug Small fixes & RefactoringNew Projects360U: 360UAd Configuration + Rotator for Windows Phone: Ad Configuration and Rotator for Windows Phone is a set of classes and controls which allow you to remotely manage advertising providers used inside your Windows Phone application. Advertising providers can be plugged in on an 'as needed' so application only ship with the providers being used.CommerceShopSystem: CommerceShopSystemContour strikes again: Collection of extensions for the Umbraco Contour form builderFarseer Physics & GLEED2D Link: This project includes c# files usable to implement the Farseer Physics engine in a level created using GLEED2D.FolderComparer: This DLL holds an extension of the DirectoryInfo class. It contains a logic that helps compare the contents of two folders. HierList Hierarchial Outline ASP.NET Server Control ( using UL or OL and LI ): The HierList ASP WebControl generates a hierarchial list using the UL, OL, and LI html tags. Images Organizator: A C# .Net program that organizes all pictures in a folder by date.KiggDemo: i study kiggLavieOrnamentos: LavieOrnamentos is a MVC project written in C# for a standard business website. I plan to use it as a base for a bigger project, a standard business site framework targeting small companies that just want to display their products and latest news.Multiple Choice Training Application: This is an ASP.Net (VB.Net) Web based training application. 
This application can be configured to ask multiple choice questions for multiple groups, score based on percentage, create completion certificates and be completely managed via a web interface,Orchard Windows Authentication: This module allows Windows domain users to be authenticated in Orchard.Party Estimator: Party Estimator is a training project based on requirements from O'Reilly's _Head First C#_, and is not intended for widespread use. SharePoint Enforcer - Ensuring large sites comply with standards: SharePoint Enforcer is a utility that aids in governance of large SharePoint sites to ensure that the sites comply with various business rules that have been created to keep the site from growing out of control.SharePoint WarmUp Tool (Claims+FBA): This tools is for warming up (waking) SharPoint sites. It addresses the issue of a 403 forbidden error when the SharePoint web app is in claims mode and FBA. It uses Windows authentication to warm up the sites and bypasses the FBA login redirection causing the 403 forbidden error.Super Mario Limitless: Super Mario Limitless is an in-production Super Mario level engine. It allows you to play your own levels and worlds, play others' levels and worlds, and even play online. With limitless features, you'll spend hours playing and creating.VB.NET ASP.NET MVC 2 - Music Store: This project is a port using the VB language with ASP.NET MVC 2 of the MusicStore application that can be found at : http://mvcmusicstore.codeplex.com/ Veni, Vedi, Velcro...: A personal phone 7 social media app that shows basic elements of design, ad model, panorama etc.XMLServiceMonitor: A Windows service (VB.Net) that allows the monitoring of failure of Servers, Services, Applications, Scheduled Tasks and SQL Jobs. The service is configurable with simple XML files and sends out email notifications of failures.

    Read the article

  • Calculating the Size (in Bytes and MB) of an Oracle Coherence Cache

    - by Ricardo Ferreira
The concept and usage of data grids have become very popular these days, since this type of technology is evolving very fast, with some cool leading products like Oracle Coherence. Once in a while, developers need a programmatic way to calculate the total size of a specific cache residing in the data grid. In this post, I will show how to accomplish this using the Oracle Coherence API. This example has been tested with versions 3.6, 3.7 and 3.7.1 of Oracle Coherence. To start the development of this example, you need to create a POJO ("Plain Old Java Object") that represents a data structure that will hold user data. This data structure also creates what I call an internal "fat", which should considerably increase the size of each instance in heap memory. Create a Java class named "Person" as shown in the listing below. package com.oracle.coherence.domain; import java.io.Serializable; import java.util.ArrayList; import java.util.HashMap; import java.util.List; import java.util.Random; @SuppressWarnings("serial") public class Person implements Serializable { private String firstName; private String lastName; private List<Object> fat; private String email; public Person() { generateFat(); } public Person(String firstName, String lastName, String email) { setFirstName(firstName); setLastName(lastName); setEmail(email); generateFat(); } private void generateFat() { fat = new ArrayList<Object>(); Random random = new Random(); for (int i = 0; i < random.nextInt(18000); i++) { HashMap<Long, Double> internalFat = new HashMap<Long, Double>(); for (int j = 0; j < random.nextInt(10000); j++) { internalFat.put(random.nextLong(), random.nextDouble()); } fat.add(internalFat); } } public String getFirstName() { return firstName; } public void setFirstName(String firstName) { this.firstName = firstName; } public String getLastName() { return lastName; } public void setLastName(String lastName) { this.lastName = lastName; } public String getEmail() { return email; } public void setEmail(String email) { this.email = email; } } Now let's create a Java program that will start a Coherence data grid and create a cache named "People" that will hold Person instances with sequential integer keys. Each person created in this program will trigger the execution of the custom constructor in the Person class, which instantiates the internal fat (the random amount of data generated to increase the size of the object) for each person. Create a Java class named "CreatePeopleCacheAndPopulateWithData" as shown in the listing below. package com.oracle.coherence.demo; import com.oracle.coherence.domain.Person; import com.tangosol.net.CacheFactory; import com.tangosol.net.NamedCache; public class CreatePeopleCacheAndPopulateWithData { public static void main(String[] args) { // Asks Coherence for a new cache named "People"... NamedCache people = CacheFactory.getCache("People"); // Creates three people that will be put into the data grid. Each person // generates an internal fat that should increase its size in terms of bytes... Person pessoa1 = new Person("Ricardo", "Ferreira", "[email protected]"); Person pessoa2 = new Person("Vitor", "Ferreira", "[email protected]"); Person pessoa3 = new Person("Vivian", "Ferreira", "[email protected]"); // Insert three people into the data grid... people.put(1, pessoa1); people.put(2, pessoa2); people.put(3, pessoa3); // Waits for 5 minutes until the user runs the Java program // that calculates the total size of the people cache...
try { System.out.println("---> Waiting for 5 minutes for the cache size calculation..."); Thread.sleep(300000); } catch (InterruptedException ie) { ie.printStackTrace(); } } } Finally, let's create a Java program that, using the Coherence API and JMX, will calculate the total size of each cache that the data grid is currently managing. The approach used in this example was retrieve every cache that the data grid are currently managing, but if you are interested on an specific cache, the same approach can be used, you should only filter witch cache will be looked for. Create a Java class named "CalculateTheSizeOfPeopleCache" as shown in the listing below. package com.oracle.coherence.demo; import java.text.DecimalFormat; import java.util.Map; import java.util.Set; import java.util.TreeMap; import javax.management.MBeanServer; import javax.management.MBeanServerFactory; import javax.management.ObjectName; import com.tangosol.net.CacheFactory; public class CalculateTheSizeOfPeopleCache { @SuppressWarnings({ "unchecked", "rawtypes" }) private void run() throws Exception { // Enable JMX support in this Coherence data grid session... System.setProperty("tangosol.coherence.management", "all"); // Create a sample cache just to access the data grid... CacheFactory.getCache(MBeanServerFactory.class.getName()); // Gets the JMX server from Coherence data grid... MBeanServer jmxServer = getJMXServer(); // Creates a internal data structure that would maintain // the statistics from each cache in the data grid... Map cacheList = new TreeMap(); Set jmxObjectList = jmxServer.queryNames(new ObjectName("Coherence:type=Cache,*"), null); for (Object jmxObject : jmxObjectList) { ObjectName jmxObjectName = (ObjectName) jmxObject; String cacheName = jmxObjectName.getKeyProperty("name"); if (cacheName.equals(MBeanServerFactory.class.getName())) { continue; } else { cacheList.put(cacheName, new Statistics(cacheName)); } } // Updates the internal data structure with statistic data // retrieved from caches inside the in-memory data grid... Set<String> cacheNames = cacheList.keySet(); for (String cacheName : cacheNames) { Set resultSet = jmxServer.queryNames( new ObjectName("Coherence:type=Cache,name=" + cacheName + ",*"), null); for (Object resultSetRef : resultSet) { ObjectName objectName = (ObjectName) resultSetRef; if (objectName.getKeyProperty("tier").equals("back")) { int unit = (Integer) jmxServer.getAttribute(objectName, "Units"); int size = (Integer) jmxServer.getAttribute(objectName, "Size"); Statistics statistics = (Statistics) cacheList.get(cacheName); statistics.incrementUnit(unit); statistics.incrementSize(size); cacheList.put(cacheName, statistics); } } } // Finally... print the objects from the internal data // structure that represents the statistics from caches... 
cacheNames = cacheList.keySet(); for (String cacheName : cacheNames) { Statistics estatisticas = (Statistics) cacheList.get(cacheName); System.out.println(estatisticas); } } public MBeanServer getJMXServer() { MBeanServer jmxServer = null; for (Object jmxServerRef : MBeanServerFactory.findMBeanServer(null)) { jmxServer = (MBeanServer) jmxServerRef; if (jmxServer.getDefaultDomain().equals(DEFAULT_DOMAIN) || DEFAULT_DOMAIN.length() == 0) { break; } jmxServer = null; } if (jmxServer == null) { jmxServer = MBeanServerFactory.createMBeanServer(DEFAULT_DOMAIN); } return jmxServer; } private class Statistics { private long unit; private long size; private String cacheName; public Statistics(String cacheName) { this.cacheName = cacheName; } public void incrementUnit(long unit) { this.unit += unit; } public void incrementSize(long size) { this.size += size; } public long getUnit() { return unit; } public long getSize() { return size; } public double getUnitInMB() { return unit / (1024.0 * 1024.0); } public double getAverageSize() { return size == 0 ? 0 : unit / size; } public String toString() { StringBuffer sb = new StringBuffer(); sb.append("\nCache Statistics of '").append(cacheName).append("':\n"); sb.append(" - Total Entries of Cache -----> " + getSize()).append("\n"); sb.append(" - Used Memory (Bytes) --------> " + getUnit()).append("\n"); sb.append(" - Used Memory (MB) -----------> " + FORMAT.format(getUnitInMB())).append("\n"); sb.append(" - Object Average Size --------> " + FORMAT.format(getAverageSize())).append("\n"); return sb.toString(); } } public static void main(String[] args) throws Exception { new CalculateTheSizeOfPeopleCache().run(); } public static final DecimalFormat FORMAT = new DecimalFormat("###.###"); public static final String DEFAULT_DOMAIN = ""; public static final String DOMAIN_NAME = "Coherence"; } I've commented the overall example so, I don't think that you should get into trouble to understand it. Basically we are dealing with JMX. The first thing to do is enable JMX support for the Coherence client (ie, an JVM that will only retrieve values from the data grid and will not integrate the cluster) application. This can be done very easily using the runtime "tangosol.coherence.management" system property. Consult the Coherence documentation for JMX to understand the possible values that could be applied. The program creates an in memory data structure that holds a custom class created called "Statistics". This class represents the information that we are interested to see, which in this case are the size in bytes and in MB of the caches. An instance of this class is created for each cache that are currently managed by the data grid. Using JMX specific methods, we retrieve the information that are relevant for calculate the total size of the caches. To test this example, you should execute first the CreatePeopleCacheAndPopulateWithData.java program and after the CreatePeopleCacheAndPopulateWithData.java program. 
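If you only care about one specific cache, as mentioned earlier, the very same JMX attributes can be queried for just that cache name instead of iterating over all of them. The snippet below is only a rough sketch of that idea; it assumes the cache is named "People", that JMX management has been enabled as shown above, and that the jmxServer reference was obtained the same way as in the listing.

// Rough sketch: sums the "Units" and "Size" attributes for a single cache ("People").
// Assumes jmxServer was obtained exactly as in the getJMXServer() method above.
long totalBytes = 0;
long totalEntries = 0;
Set<ObjectName> mbeans = jmxServer.queryNames(
        new ObjectName("Coherence:type=Cache,name=People,*"), null);
for (ObjectName mbean : mbeans) {
    // As in the full example, only the back tier holds the actual cache data...
    if ("back".equals(mbean.getKeyProperty("tier"))) {
        totalBytes += ((Integer) jmxServer.getAttribute(mbean, "Units")).longValue();
        totalEntries += ((Integer) jmxServer.getAttribute(mbean, "Size")).longValue();
    }
}
System.out.println("People cache: " + totalEntries + " entries, " + totalBytes + " bytes");

Keep in mind that this sketch follows the same assumption as the full example, namely that the "Units" attribute reports the cache size in bytes.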
The results in the console should be something like this: 2012-06-23 13:29:31.188/4.970 Oracle Coherence 3.6.0.4 <Info> (thread=Main Thread, member=n/a): Loaded operational configuration from "jar:file:/E:/Oracle/Middleware/oepe_11gR1PS4/workspace/calcular-tamanho-cache-coherence/lib/coherence.jar!/tangosol-coherence.xml" 2012-06-23 13:29:31.219/5.001 Oracle Coherence 3.6.0.4 <Info> (thread=Main Thread, member=n/a): Loaded operational overrides from "jar:file:/E:/Oracle/Middleware/oepe_11gR1PS4/workspace/calcular-tamanho-cache-coherence/lib/coherence.jar!/tangosol-coherence-override-dev.xml" 2012-06-23 13:29:31.219/5.001 Oracle Coherence 3.6.0.4 <D5> (thread=Main Thread, member=n/a): Optional configuration override "/tangosol-coherence-override.xml" is not specified 2012-06-23 13:29:31.266/5.048 Oracle Coherence 3.6.0.4 <D5> (thread=Main Thread, member=n/a): Optional configuration override "/custom-mbeans.xml" is not specified Oracle Coherence Version 3.6.0.4 Build 19111 Grid Edition: Development mode Copyright (c) 2000, 2010, Oracle and/or its affiliates. All rights reserved. 2012-06-23 13:29:33.156/6.938 Oracle Coherence GE 3.6.0.4 <Info> (thread=Main Thread, member=n/a): Loaded Reporter configuration from "jar:file:/E:/Oracle/Middleware/oepe_11gR1PS4/workspace/calcular-tamanho-cache-coherence/lib/coherence.jar!/reports/report-group.xml" 2012-06-23 13:29:33.500/7.282 Oracle Coherence GE 3.6.0.4 <Info> (thread=Main Thread, member=n/a): Loaded cache configuration from "jar:file:/E:/Oracle/Middleware/oepe_11gR1PS4/workspace/calcular-tamanho-cache-coherence/lib/coherence.jar!/coherence-cache-config.xml" 2012-06-23 13:29:35.391/9.173 Oracle Coherence GE 3.6.0.4 <D4> (thread=Main Thread, member=n/a): TCMP bound to /192.168.177.133:8090 using SystemSocketProvider 2012-06-23 13:29:37.062/10.844 Oracle Coherence GE 3.6.0.4 <Info> (thread=Cluster, member=n/a): This Member(Id=2, Timestamp=2012-06-23 13:29:36.899, Address=192.168.177.133:8090, MachineId=55685, Location=process:244, Role=Oracle, Edition=Grid Edition, Mode=Development, CpuCount=2, SocketCount=2) joined cluster "cluster:0xC4DB" with senior Member(Id=1, Timestamp=2012-06-23 13:29:14.031, Address=192.168.177.133:8088, MachineId=55685, Location=process:1128, Role=CreatePeopleCacheAndPopulateWith, Edition=Grid Edition, Mode=Development, CpuCount=2, SocketCount=2) 2012-06-23 13:29:37.172/10.954 Oracle Coherence GE 3.6.0.4 <D5> (thread=Cluster, member=n/a): Member 1 joined Service Cluster with senior member 1 2012-06-23 13:29:37.188/10.970 Oracle Coherence GE 3.6.0.4 <D5> (thread=Cluster, member=n/a): Member 1 joined Service Management with senior member 1 2012-06-23 13:29:37.188/10.970 Oracle Coherence GE 3.6.0.4 <D5> (thread=Cluster, member=n/a): Member 1 joined Service DistributedCache with senior member 1 2012-06-23 13:29:37.188/10.970 Oracle Coherence GE 3.6.0.4 <Info> (thread=Main Thread, member=n/a): Started cluster Name=cluster:0xC4DB Group{Address=224.3.6.0, Port=36000, TTL=4} MasterMemberSet ( ThisMember=Member(Id=2, Timestamp=2012-06-23 13:29:36.899, Address=192.168.177.133:8090, MachineId=55685, Location=process:244, Role=Oracle) OldestMember=Member(Id=1, Timestamp=2012-06-23 13:29:14.031, Address=192.168.177.133:8088, MachineId=55685, Location=process:1128, Role=CreatePeopleCacheAndPopulateWith) ActualMemberSet=MemberSet(Size=2, BitSetCount=2 Member(Id=1, Timestamp=2012-06-23 13:29:14.031, Address=192.168.177.133:8088, MachineId=55685, Location=process:1128, Role=CreatePeopleCacheAndPopulateWith) Member(Id=2, 
Timestamp=2012-06-23 13:29:36.899, Address=192.168.177.133:8090, MachineId=55685, Location=process:244, Role=Oracle) ) RecycleMillis=1200000 RecycleSet=MemberSet(Size=0, BitSetCount=0 ) ) TcpRing{Connections=[1]} IpMonitor{AddressListSize=0} 2012-06-23 13:29:37.891/11.673 Oracle Coherence GE 3.6.0.4 <D5> (thread=Invocation:Management, member=2): Service Management joined the cluster with senior service member 1 2012-06-23 13:29:39.203/12.985 Oracle Coherence GE 3.6.0.4 <D5> (thread=DistributedCache, member=2): Service DistributedCache joined the cluster with senior service member 1 2012-06-23 13:29:39.297/13.079 Oracle Coherence GE 3.6.0.4 <D4> (thread=DistributedCache, member=2): Asking member 1 for 128 primary partitions

Cache Statistics of 'People':
 - Total Entries of Cache -----> 3
 - Used Memory (Bytes) --------> 883920
 - Used Memory (MB) -----------> 0.843
 - Object Average Size --------> 294640

I hope this post saves you some time whenever calculating the total size of a Coherence cache becomes a requirement for your highly scalable system using data grids. See you!

    Read the article

  • CodePlex Daily Summary for Monday, March 07, 2011

    CodePlex Daily Summary for Monday, March 07, 2011Popular ReleasesDotNetAge -a lightweight Mvc jQuery CMS: DotNetAge 2: What is new in DotNetAge 2.0 ? Completely update DJME to DJME2, enhance user experience ,more beautiful and more interactively visit DJME project home to lean more about DJME http://www.dotnetage.com/sites/home/djme.html A new widget engine has came! Faster and easiler. Runtime performance enhanced. SEO enhanced. UI Designer enhanced. A new web resources explorer. Page manager enhanced. BlogML supports added that allows you import/export your blog data to/from dotnetage publishi...Master Data Services Manager: stable 1.0.3: Update 2011-03-07 : bug fixes added external configuration File : configuration.config added TreeView Display of model (still in dev) http://img96.imageshack.us/img96/5067/screenshot073l.jpg added Connection Parameters (username, domain, password, stored encrypted in configuration file) http://img402.imageshack.us/img402/5350/screenshot072qc.jpgSharePoint Content Inventory: Release 1.1: Release 1.1Menu and Context Menu for Silverlight 4.0: Silverlight Menu and Context Menu v2.4 Beta: - Moved the core of the PopupMenu class to the new PopupMenuBase class. - Renamed the MenuTriggerElement class to MenuTriggerRelationship. - Renamed the ApplicationMenus property to MenuTriggers. - Renamed the ImageLeftOpacity property to ImageOpacity. - Renamed the ImageLeftVisibility property to ImageVisibility. - Renamed the ImageLeftMinWidth property to ImageMinWidth. - Renamed the ImagePathForRightMargin property to ImageRightPath. - Renamed the ImageSourceForRightMargin property to Ima...Kooboo CMS: Kooboo CMS 3.0 Beta: Files in this downloadkooboo_CMS.zip: The kooboo application files Content_DBProvider.zip: Additional content database implementation of MSSQL,SQLCE, RavenDB and MongoDB. Default is XML based database. To use them, copy the related dlls into web root bin folder and remove old content provider dlls. Content provider has the name like "Kooboo.CMS.Content.Persistence.SQLServer.dll" View_Engines.zip: Supports of Razor, webform and NVelocity view engine. Copy the dlls into web root bin folder t...ASP.NET MVC Project Awesome, jQuery Ajax helpers (controls): 1.7.2: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager added fullscreen for the popup and popupformIronPython: 2.7 Release Candidate 2: On behalf of the IronPython team, I am pleased to announce IronPython 2.7 Release Candidate 2. The releases contains a few minor bug fixes, including a working webbrowser module. Please see the release notes for 61395 for what was fixed in previous releases.LINQ to Twitter: LINQ to Twitter Beta v2.0.20: Mono 2.8, Silverlight, OAuth, 100% Twitter API coverage, streaming, extensibility via Raw Queries, and added documentation.IIS Tuner: IIS Tuner 1.0: IIS and ASP.NET performance optimization toolMinemapper: Minemapper v0.1.6: Once again supports biomes, thanks to an updated Minecraft Biome Extractor, which added support for the new Minecraft beta v1.3 map format. Updated mcmap to support new biome format.CRM 2011 OData Query Designer: CRM 2011 OData Query Designer: The CRM 2011 OData Query Designer is a Silverlight 4 application that is packaged as a Managed CRM 2011 Solution. This tool allows you to build OData queries by selecting filter criteria, select attributes and order by attributes. 
The tool also allows you to Execute the query and view the ATOM and JSON data returned. The look and feel of this component will improve and new functionality will be added in the near future so please provide feedback on your experience. Import this solution int...Sandcastle Help File Builder: SHFB v1.9.3.0 Release: This release supports the Sandcastle June 2010 Release (v2.6.10621.1). It includes full support for generating, installing, and removing MS Help Viewer files. This new release is compiled under .NET 4.0, supports Visual Studio 2010 solutions and projects as documentation sources, and adds support for projects targeting the Silverlight Framework. This release uses the Sandcastle Guided Installation package used by Sandcastle Styles. Download and extract to a folder and then run SandcastleI...mytrip.mvc (CMS & e-Commerce): mytrip.mvc 1.0.53.0 beta 2: New SEO Optimisation WEB.mytrip.mvc 1.0.53.0 Web for install hosting System Requirements: NET 4.0, MSSQL 2008 or MySql (auto creation table to database) if .\SQLEXPRESS auto creation database (App_Data folder) SRC.mytrip.mvc 1.0.53.0 System Requirements: Visual Studio 2010 or Web Deweloper 2010 MSSQL 2008 or MySql (auto creation table to database) if .\SQLEXPRESS auto creation database (App_Data folder) Connector/Net 6.3.5, MVC3 RTM WARNING For run and debug SRC.mytrip.mvc 1.0.53.0 dow...AutoLoL: AutoLoL v1.6.4: It is now possible to run the clicker anyway when it can't detect the Masteries Window Fixed a critical bug in the open file dialog Removed the resize button Some UI changes 3D camera movement is now more intuitive (Trackball rotation) When an error occurs on the clicker it will attempt to focus AutoLoLYAF.NET (aka Yet Another Forum.NET): v1.9.5.5 RTW: YAF v1.9.5.5 RTM (Date: 3/4/2011 Rev: 4742) Official Discussion Thread here: http://forum.yetanotherforum.net/yaf_postsm47149_v1-9-5-5-RTW--Date-3-4-2011-Rev-4742.aspx Changes in v1.9.5.5 Rev. #4661 - Added "Copy" function to forum administration -- Now instead of having to manually re-enter all the access masks, etc, you can just duplicate an existing forum and modify after the fact. Rev. #4642 - New Setting to Enable/Disable Last Unread posts links Rev. #4641 - Added Arabic Language t...Snippet Designer: Snippet Designer 1.3.1: Snippet Designer 1.3.1 for Visual Studio 2010This is a bug fix release. Change logFixed bug where Snippet Designer would fail if you had the most recent Productivity Power Tools installed Fixed bug where "Export as Snippet" was failing in non-english locales Fixed bug where opening a new .snippet file would fail in non-english localesChiave File Encryption: Chiave 1.0: Final Relase for Chave 1.0 Stable: Application for file encryption and decryption using 512 Bit rijndael encyrption algorithm with simple to use UI. Its written in C# and compiled in .Net version 3.5. It incorporates features of Windows 7 like Jumplists, Taskbar progress and Aero Glass. Now with added support to Windows XP! Change Log from 0.9.2 to 1.0: ==================== Added: > Added Icon Overlay for Windows 7 Taskbar Icon. >Added Thumbnail Toolbar buttons to make the navigation easier...ASP.NET: Sprite and Image Optimization Preview 3: The ASP.NET Sprite and Image Optimization framework is designed to decrease the amount of time required to request and display a page from a web server by performing a variety of optimizations on the page’s images. This is the third preview of the feature and works with ASP.NET Web Forms 4, ASP.NET MVC 3, and ASP.NET Web Pages (Razor) projects. 
The binaries are also available via NuGet: AspNetSprites-Core AspNetSprites-WebFormsControl AspNetSprites-MvcAndRazorHelper It includes the foll...Network Monitor Open Source Parsers: Microsoft Network Monitor Parsers 3.4.2554: The Network Monitor Parsers packages contain parsers for more than 400 network protocols, including RFC based public protocols and protocols for Microsoft products defined in the Microsoft Open Specifications for Windows and SQL Server. NetworkMonitor_Parsers.msi is the base parser package which defines parsers for commonly used public protocols and protocols for Microsoft Windows. In this release, we have added 4 new protocol parsers and updated 79 existing parsers in the NetworkMonitor_Pa...Image Resizer for Windows: Image Resizer 3 Preview 1: Prepare to have your minds blown. This is the first preview of what will eventually become 39613. There are still a lot of rough edges and plenty of areas still under construction, but for your basic needs, it should be relativly stable. Note: You will need the .NET Framework 4 installed to use this version. Below is a status report of where this release is in terms of the overall goal for version 3. If you're feeling a bit technically ambitious and want to check out some of the features th...New ProjectsAppFactory: Die AppFactory Dient zur Vereinfachung der entwicklung von WPF Anwedungen. Es ist in C# entwickelt.Change the Default Playback Sound Device: ChangePlaybackDevice makes it easier for personal user to change the default playback sound device. You'll no longer have to change the default playback sound device by hand. It's developed in C#. Conectayas: Conectayas is an open source "Connect Four" alike game but transformable to "Tic-Tac-Toe" and to a lot of similar games that uses mouse. Written in DHTML (JavaScript, CSS and HTML). Very configurable. This cross-platform and cross-browser game was tested under BeOS, Linux, *BSD, Windows and others.Diamond: The all in one toolkit for WPF and Silverligth projects.Digital Disk File Format: Digital Disk is a File format that uses a simple key system, it is currently in development. It is written in vb.net, but will be expanded into other languagesdotnetMvcMalll: this is a asp.net mvc mallEasyCache .NET: EasyCache .NET is a simplified API over the ASP.NET Cache object. Its purpose is to offer a more concise syntax for adding and retrieving items from the cache.Eric Fang SharePoint workflow activities: Eric Fang SharePoint workflow activitiesExpert.NET: Expert.NET is an expert system framework for .NET applications. Written in F#, it provides constructs for defining probabilistic rulesets, as well as an inference engine. Expert.NET is ideal for encoding domain knowledge used by troubleshooting applications.GameGolem: The GameGolem is an XNA Casual Gamers portal. The purpose is to create a single ClickOnce deployed "Game Launcher" which exposes simple API for games to keep track of highscores, achivements, etc. GameGolem will become a Kongregate-like XNA-based casual games portal.Hundiyas: Hundiyas is an open source "Battleship" alike game totally written in DHTML (JavaScript, CSS and HTML) that uses mouse. 
This cross-platform and cross-browser game was tested under BeOS, Linux, *BSD, Windows and others.ISBC: Practicas ISBC 10/11maocaijun.database: databaseNCLI: A simple API for command line argument parsing, written in C#.nEMO: nEMO is a pure C# framework for Evolutionary Multiobjective Optimization.N-tier architecture sample: A sample on how to practically design a system following an n-tier (multitier) architecture in line with the patterns and practices presented by Microsofts Application Architectural Guide 2.0. Focus is on a service application and it´s client applications of various types.PostsByMonth Widget: This is a simple widget for the Graffiti CMS application that allows you to get a monthly list of new posts to the site. It's configurable to allow for the # of posts to display as well as the the format of the month/year header, the title and the individual line entries. This is written in .Net 3.5 with Vb.Net.Puzzle Pal: Smartphone assistant for all your puzzling events.RavenDB Notification: Notification plugin for RavenDB. With this plugin you are able to subscribe to insert and delete notifications from the RavenDB server. Very helpfull if you need to process new documents on the remote clients and you do not like to query DB for new changes.Teamwork by Intrigue Deviation: A feature-rich team collaboration and project management effort built around Scrum methodology with MVC/2.test_flow: test flowTicari Uygulama Paketi: Ticari Uygulama Paketi (TUP), Microsoft Ofis 2010 ürünleri için gelistirilmis eklenti yazilimidir.XpsViewer: XpsVieweryaphan: yaphan cms.

    Read the article

  • CodePlex Daily Summary for Saturday, March 12, 2011

    CodePlex Daily Summary for Saturday, March 12, 2011Popular ReleasesXML Explorer: XML Explorer 4.0.2: Changes in 4.0: This release is built on the Microsoft .NET Framework 4 Client Profile. Changed XSD validation to use the schema specified by the XML documents. Added a VS style Error List, double-clicking an error takes you to the offending node. XPathNavigator schema validation finally gives SourceObject (was fixed in .NET 4). Added Namespaces window and better support for XPath expressions in documents with a default namespace. Added ExpandAll and CollapseAll toolbar buttons (in a...Mobile Device Detection and Redirection: 1.0.0.0: Stable Release 51 Degrees.mobi Foundation has been in beta for some time now and has been used on thousands of websites worldwide. We’re now highly confident in the product and have designated this release as stable. We recommend all users update to this version. New Capabilities MappingsTo improve compatibility with other libraries some new .NET capabilities are now populated with wurfl data: “maximumRenderedPageSize” populated with “max_deck_size” “rendersBreaksAfterWmlAnchor” populated ...ASP.NET MVC Project Awesome, jQuery Ajax helpers (controls): 1.7.3: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager added interactive search for the lookupWPF Inspector: WPF Inspector 0.9.7: New Features in Version 0.9.7 - Support for .NET 3.5 and 4.0 - Multi-inspection of the same process - Property-Filtering for multiple keywords e.g. "Height Width" - Smart Element Selection - Select Controls by clicking CTRL, - Select Template-Parts by clicking CTRL+SHIFT - Possibility to hide the element adorner (over the context menu on the visual tree) - Many bugfixes??????????: All-In-One Code Framework ??? 2011-03-10: http://download.codeplex.com/Project/Download/FileDownload.aspx?ProjectName=1codechs&DownloadId=216140 ??,????。??????????All-In-One Code Framework ???,??20?Sample!!????,?????。http://i3.codeplex.com/Project/Download/FileDownload.aspx?ProjectName=1code&DownloadId=128165 ASP.NET ??: CSASPNETBingMaps VBASPNETRemoteUploadAndDownload CS/VBASPNETSerializeJsonString CSASPNETIPtoLocation CSASPNETExcelLikeGridView ....... Winform??: FTPDownload FTPUpload MultiThreadedWebDownloader...Rawr: Rawr 4.1.0: Rawr is now web-based. The link to use Rawr4 is: http://elitistjerks.com/rawr.phpThis is the Cataclysm Release. More details can be found at the following link http://rawr.codeplex.com/Thread/View.aspx?ThreadId=237262 As of the 4.0.16 release, you can now also begin using the new Downloadable WPF version of Rawr!This is a Release of the WPF version, most of the general issues have been resolved. If you have a problem, please follow the Posting Guidelines and put it into the Issue Tracker. Whe...PHP Manager for IIS: PHP Manager 1.1.2 for IIS 7: This is a localization release of PHP Manager for IIS 7. It contains all the functionality available in 56962 plus a few bug fixes (see change list for more details). 
Most importantly this release is translated into five languages: German - the translation is provided by Christian Graefe Dutch - the translation is provided by Harrie Verveer Turkish - the translation is provided by Yusuf Oztürk Japanese - the translation is provided by Kenichi Wakasa Russian - the translation is provid...TweetSharp: TweetSharp v2.0.0: Documentation for this release may be found at http://tweetsharp.codeplex.com/wikipage?title=UserGuide&referringTitle=Documentation. Beta ChangesAdded user streams support Serialization is not attempted for Twitter 5xx errors Fixes based on feedback Third Party Library VersionsHammock v1.2.0: http://hammock.codeplex.com Json.NET 4.0 Release 1: http://json.codeplex.comMicrosoft All-In-One Code Framework - a centralized code sample library: Visual Studio 2008 Code Samples 2011-03-09: Code samples for Visual Studio 2008Office Web.UI: Version 2.4: After having lost all modifications done for 2.3. I finally did it again... Have a look at http://www.officewebui.com/change-log Also, the documentation continues to grow... http://www.officewebui.com/category/kb ThanksmyCollections: Version 1.3: New in version 1.3 : Added Editor management for Books Added Amazon API for Books Us, Fr, De Added Amazon Us, Fr, De for Movies Added The MovieDB for Fr and De Added Author for Books Added Editor and Platform for Games Added Amazon Us, De for Games Added Studio for XXX Added Background for XXX Bug fixing with Softonic API Bug fixing with IMDB UI improvement Removed GraceNote Added Amazon Us,Fr, De for Series Added TVDB Fr and De for Series Added Tracks for Musi...Facebook Graph Toolkit: Facebook Graph Toolkit 1.1: Version 1.1 (8 Mar 2011)new Dialog class for redirecting users to Facebook dialogs new Async publishing methods new Check for Extended Permissions option fixed bug: inappropiate condition of redirecting to login in Api class fixed bug: IframeRedirect method not workingpatterns & practices : Composite Services: Composite Services Guidance - CTP2: Overview The Composite Services guidance (codename Reykjavik) provides best practices and capabilities for applying industry-known SOA design patterns when building robust, connected, service-oriented composite enterprise applications. These capabilities are implemented as a set of reusable components for analytic tracing, service virtualization, metadata centralization and versioning, and policy centralization as well as exception management, included in this release. Changes in this CTP ...Python Tools for Visual Studio: 1.0 Beta 1: Beta 1You can't install IronPython Tools for Visual Studio side-by-side with Python Tools for Visual Studio. A race condition sometimes causes local MPI debugging to miss breakpoints. When MPI jobs on a cluster fail they don’t get cleaned up correctly, which can cause debugging to stall because the associated MPI job is stuck in the queue. The "Threads" view has a race condition which can cause it not to display properly at times. VS2010 shortcuts that are pinned to the taskbar are so...DotNetAge -a lightweight Mvc jQuery CMS: DotNetAge 2: What is new in DotNetAge 2.0 ? Completely update DJME to DJME2, enhance user experience ,more beautiful and more interactively visit DJME project home to lean more about DJME http://www.dotnetage.com/sites/home/djme.html A new widget engine has came! Faster and easiler. Runtime performance enhanced. SEO enhanced. UI Designer enhanced. A new web resources explorer. Page manager enhanced. 
BlogML supports added that allows you import/export your blog data to/from dotnetage publishi...Kooboo CMS: Kooboo CMS 3.0 Beta: Files in this downloadkooboo_CMS.zip: The kooboo application files Content_DBProvider.zip: Additional content database implementation of MSSQL,SQLCE, RavenDB and MongoDB. Default is XML based database. To use them, copy the related dlls into web root bin folder and remove old content provider dlls. Content provider has the name like "Kooboo.CMS.Content.Persistence.SQLServer.dll" View_Engines.zip: Supports of Razor, webform and NVelocity view engine. Copy the dlls into web root bin folder t...IronPython: 2.7 Release Candidate 2: On behalf of the IronPython team, I am pleased to announce IronPython 2.7 Release Candidate 2. The releases contains a few minor bug fixes, including a working webbrowser module. Please see the release notes for 61395 for what was fixed in previous releases.LINQ to Twitter: LINQ to Twitter Beta v2.0.20: Mono 2.8, Silverlight, OAuth, 100% Twitter API coverage, streaming, extensibility via Raw Queries, and added documentation.Minemapper: Minemapper v0.1.6: Once again supports biomes, thanks to an updated Minecraft Biome Extractor, which added support for the new Minecraft beta v1.3 map format. Updated mcmap to support new biome format.Sandcastle Help File Builder: SHFB v1.9.3.0 Release: This release supports the Sandcastle June 2010 Release (v2.6.10621.1). It includes full support for generating, installing, and removing MS Help Viewer files. This new release is compiled under .NET 4.0, supports Visual Studio 2010 solutions and projects as documentation sources, and adds support for projects targeting the Silverlight Framework. This release uses the Sandcastle Guided Installation package used by Sandcastle Styles. Download and extract to a folder and then run SandcastleI...New Projects8428-3127-6884-4328: No summary providedAngua R.P.G. Engine: The Angua R.P.G. Engine is a C# open-source system for playing turn-based Role-Playing Games over the Internet or TCP/IP based networks. AppServices - SOA for greenhorns: AppServices is a simple service container. It can inject services to any public or private property or field with declared ServiceAttribute. It supports Windows Forms, WPF and ASP.NET. Silverlight is not tested yet, but it should do also. AppServices simplifies service referenceAxHibernate: AxHibernate aims to produce a POCO-mapped style domain-library for Dynamics AX. I.e., an AX business-connector-ignorant ORM domain library for Dynamics Ax.Business localizer, translation: Translation and localization software used in your business. Help to localize resource file on top of .NET Microsoft stack.cloudsync: cloudsyncDaily Deals .Net - Groupon Clone: Daily Deals .Net is a "daily deals" application based on the .net technology. Utilizing MVC 3 to provide a Groupon-like experience. This project needs your support. Please sign up today to help bring a viable daily deals project to the .Net open source world. 
ENGUILABE.INFO - Descubre un nuevo mundo: codigo reutilizable y aprendibleeuler 40: euler 40euler 42: euler 42gibon: gibbon blog+portfolio+communitiHoley Ship: Battleship project.Improved CAPTCHA Control ASP.NET 2.0 C#: Improved C# port of Jeff Atwood's custom CAPTCHA control (image verification against robots).IslandUnit: IslandUnit helps you isolate the dependencies of your system with an fluent interface that makes easier to produce mocks and stubs with existing frameworks (Moq, NMock, NBuilder, AutoPoco) and put the isolated dependencies in IoC containers, leaving your system highly testable.KidsChores: Helps keep tracks of to do list for kids, allowing them to earn points, allowances, etc.Local Copy SharePoint Items: The "Local Copy SharePoint Items" is a PowerShell (v1) written script to grab all documents out of libraries/lists from a given SharePoint 2007 site. Get yourself exited to run this script as folliowing: .\LCSPI.ps1 -url http://moss2007 -bkploc c:\destination mkfashio: mkfashion MyTest: MyTestNaja, a Cobra IDE: Naja (a.k.a. Cobra IDE) is an Integrated Development Environment for the Cobra programming language (www.cobra-language.com). NDasm: Visual Studio add-in that displays IL code for any managed method in your .Net solution. Helpful for studying .Net platform. For example now you can easily see what happens when you are declaring delegate type or how your cool lambda expression actually looks like. olympicgameslondon: The aim of the Olympicgameslondon project is to provide a platform where sports professionals and athletes can broadcast themselves, individuals/supporters, Londoners, visitors etc can post views, share information and find interesting news about the London 2012 Olympic games.Populate SharePoint Form Fields from QueryString via Javascript: This project makes it possible to pre-fill SharePoint Edit form fields using JavaScript and field values passed via querystring - without server-side deployment.Project Server: Xin chào t?t c? các thành viên c?a team th?c t?p t?t nghi?p c?a công ty Toàn C?u Th?nh. Ðây là server du?c dùng d? giúp d? m?i ngu?i ch?nh s?a project c?a m?t m?t cách d? dàng nh?t trong quá trình làm vi?c nhóm. Support: ngtrongtri@yahoo.com.vnPure Midi: Pure Midi - Handling midi communication in .NET with full sequencing support and arranger like musical styles playing.Relative Time: Render a TimeSpan object as something that's consumable by humans like "about 9 minutes ago" or "yesterday".RPG Character Creation: RPG Character Creation makes it easier to create and maintain characters using the Avalon 2.0 ruleset. It is developed in C#.SearchBox for WPF: Customizable search box for WPF with auto-complete capabilities.Sebro: Sebro is a freelance-like platform for the SEO related market. The main feature of the platform is an automated tracking and statistics gathering of work and strongly formalized criteria of its completion or failure.ShoppingApp2: Project will cover basics of creating ASP.NET application. Aim of project is creating application which will help users, via web, to manage their home budget. Socket Redirection for Terminal Services: Socket Redirection for Terminal Services allows to connect to the Internet on a machine, which does not has access to I-net, using another machine in the network, which has that access. 
Sort Calculator: it is a c# sort calculator that generates randomly array numbers and implements several sorting algorithms.SpSource 2010: Port of SPSource to work with VS 2010 Tools for SharePoint 2010SQL Script Executer: SQL Executer makes it easy to deploy, execute multiple scripts on SQL Server as you could do in VS 2008. In Visual Studio 2008's Database Project, one had an option to select multiple sql files and Run them on choosen DB. Visual Studio 2010, doesn't come with this functionality. Test By Wire: Test by wire is a unit test framework, which handles automatic setup and orchestration of test-target and dependencies. In addition Test By Wire features an automatic mocking feature, that is interfaces with pure BDD style syntax. Time Manager System: The TMS (Time Manager System) allows you to plan, organize and schedule your daily activities. The TMS is based on the Pomodoro Technique that improves your productivity.Windows Azure Service Instances Auto Scaling: Windows Azure Service Instances Auto Scaling is a way for dynamically scaling-up and scaling-down the instances number of a running hosted service. In this version this management is done based on a time schedule by an Azure Worker Role.

    Read the article

  • Fraud Detection with the SQL Server Suite Part 1

    - by Dejan Sarka
While working on different fraud detection projects, I developed my own approach to the solution for this problem. In my PASS Summit 2013 session I am introducing this approach. I also wrote a whitepaper on the same topic, which was generously reviewed by my friend Matija Lah. In order to spread this knowledge faster, I am starting a series of blog posts which will, in the end, make up the whole whitepaper.

Abstract

With the massive usage of credit cards and web applications for banking and payment processing, the number of fraudulent transactions is growing rapidly and on a global scale. Several fraud detection algorithms are available within a variety of different products. In this paper, we focus on using the Microsoft SQL Server suite for this purpose. In addition, we will explain our original approach to solving the problem by introducing a continuous learning procedure. Our preferred type of service is mentoring; it allows us to perform the work and consulting together with transferring the knowledge onto the customer, thus making it possible for a customer to continue to learn independently. This paper is based on practical experience with different projects covering online banking and credit card usage.

Introduction

A fraud is a criminal or deceptive activity with the intention of achieving financial or some other gain. Fraud can appear in multiple business areas. You can find a detailed overview of the business domains where fraud can take place in Sahin Y., & Duman E. (2011), Detecting Credit Card Fraud by Decision Trees and Support Vector Machines, Proceedings of the International MultiConference of Engineers and Computer Scientists 2011 Vol 1. Hong Kong: IMECS.

Dealing with frauds includes fraud prevention and fraud detection. Fraud prevention is a proactive mechanism, which tries to disable frauds by using previous knowledge. Fraud detection is a reactive mechanism with the goal of detecting suspicious behavior when a fraudster surpasses the fraud prevention mechanism. A fraud detection mechanism checks every transaction and assigns it a weight, in terms of probability between 0 and 1, that represents a score for evaluating whether the transaction is fraudulent or not. A fraud detection mechanism cannot detect frauds with a probability of 100%; therefore, manual transaction checking must also be available. With fraud detection, this manual part can focus on the most suspicious transactions. This way, an unchanged number of supervisors can detect significantly more frauds than could be achieved with traditional methods of selecting which transactions to check, for example with random sampling.

There are two principal data mining techniques available, both in general data mining and in specific fraud detection techniques: supervised (or directed) and unsupervised (or undirected). Supervised techniques or data mining models use previous knowledge. Typically, existing transactions are marked with a flag denoting whether a particular transaction is fraudulent or not. Customers at some point in time do report frauds, and the transactional system should be capable of accepting such a flag. Supervised data mining algorithms try to explain the value of this flag by using different input variables. Once the patterns and rules that lead to frauds are learned through the model training process, they can be used for prediction of the fraud flag on new incoming transactions.
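To make the scoring idea above a bit more tangible, here is a small illustrative sketch, with made-up class and member names, of how transactions scored by a model could be routed to manual checking so that supervisors focus only on the most suspicious ones.

// Illustrative sketch only: route transactions whose fraud score exceeds a
// threshold to manual review, ordered from the most to the least suspicious.
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

class ScoredTransaction {
    final String id;
    final double fraudScore; // probability between 0 and 1 assigned by the model

    ScoredTransaction(String id, double fraudScore) {
        this.id = id;
        this.fraudScore = fraudScore;
    }
}

class ManualReviewRouter {
    // Returns the transactions that should be checked manually.
    static List<ScoredTransaction> selectForReview(List<ScoredTransaction> scored,
                                                   double threshold) {
        return scored.stream()
                .filter(t -> t.fraudScore >= threshold)
                .sorted(Comparator.comparingDouble((ScoredTransaction t) -> t.fraudScore).reversed())
                .collect(Collectors.toList());
    }
}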
Unsupervised techniques analyze data without prior knowledge, without the fraud flag; they try to find transactions which do not resemble other transactions, i.e. outliers. In both cases, there should be more frauds in the data set selected for checking by using the data mining knowledge than in a data set selected with simpler methods; this is known as the lift of a model. Typically, we compare the lift with random sampling. The supervised methods typically give a much better lift than the unsupervised ones. However, we must use the unsupervised ones when we do not have any previous knowledge. Furthermore, unsupervised methods are useful for controlling whether the supervised models are still efficient.

Accuracy of the predictions drops over time. Patterns of credit card usage, for example, change over time. In addition, fraudsters continuously learn as well. Therefore, it is important to check the efficiency of the predictive models with the undirected ones. When the difference between the lift of the supervised models and the lift of the unsupervised models drops, it is time to refine the supervised models. However, the unsupervised models can become obsolete as well. It is also important to measure the overall efficiency of both supervised and unsupervised models over time. We can compare the number of predicted frauds with the total number of frauds, which includes both predicted and reported occurrences. For measuring behavior across time, specific analytical databases called data warehouses (DW) and on-line analytical processing (OLAP) systems can be employed. By controlling the supervised models with unsupervised ones and by using an OLAP system or DW reports to control both, a continuous learning infrastructure can be established.

There are many difficulties in developing a fraud detection system. As has already been mentioned, fraudsters continuously learn, and the patterns change. The exchange of experiences and ideas can be very limited due to privacy concerns. In addition, both data sets and results might be censored, as the companies generally do not want to publicly expose actual fraudulent behaviors. Therefore it can be quite difficult, if not impossible, to cross-evaluate the models using data from different companies and different business areas. This fact stresses the importance of continuous learning even more. Finally, the number of frauds in the total number of transactions is small; typically much less than 1% of transactions are fraudulent. Some predictive data mining algorithms do not give good results when the target state is represented with a very low frequency. Data preparation techniques like oversampling and undersampling can help overcome the shortcomings of many algorithms.

The SQL Server suite includes all of the software required to create, deploy and maintain a fraud detection infrastructure. The Database Engine is the relational database management system (RDBMS), which supports all activity needed for data preparation and for data warehouses. SQL Server Analysis Services (SSAS) supports OLAP and data mining (in version 2012, you need to install SSAS in multidimensional and data mining mode; this was the only mode in previous versions of SSAS, while SSAS 2012 also supports the tabular mode, which does not include data mining). Additional products from the suite can be useful as well. SQL Server Integration Services (SSIS) is a tool for developing extract-transform-load (ETL) applications.
SSIS is typically used for loading a DW, and in addition, it can use SSAS data mining models for building intelligent data flows. SQL Server Reporting Services (SSRS) is useful for presenting the results in a variety of reports. Data Quality Services (DQS) mitigates the occasional data cleansing process by maintaining a knowledge base. Master Data Services is an application that helps companies maintain a central, authoritative source of their master data, i.e. the most important data to any organization.

For an overview of the SQL Server business intelligence (BI) part of the suite, which includes the Database Engine, SSAS and SSRS, please refer to Veerman E., Lachev T., & Sarka D. (2009). MCTS Self-Paced Training Kit (Exam 70-448): Microsoft® SQL Server® 2008 Business Intelligence Development and Maintenance. MS Press. For an overview of the enterprise information management (EIM) part, which includes SSIS, DQS and MDS, please refer to Sarka D., Lah M., & Jerkic G. (2012). Training Kit (Exam 70-463): Implementing a Data Warehouse with Microsoft® SQL Server® 2012. O'Reilly. For details about SSAS data mining, please refer to MacLennan J., Tang Z., & Crivat B. (2009). Data Mining with Microsoft SQL Server 2008. Wiley.

The SQL Server Data Mining Add-ins for Office, a free download for Office versions 2007, 2010 and 2013, bring the power of data mining to Excel, enabling advanced analytics in Excel. Together with PowerPivot for Excel, which is also freely downloadable for Excel 2010 and is already included in Excel 2013, they bring OLAP functionalities directly into Excel, making it possible for an advanced analyst to build a complete learning infrastructure using a familiar tool. This way, many more people, including employees in subsidiaries, can contribute to the learning process by examining local transactions and quickly identifying new patterns.
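Since the lift of a model plays such a central role in this approach, here is a quick illustrative calculation (the numbers are made up) showing how the lift compares the fraud rate in the transactions selected by a model with the overall fraud rate, i.e. with what plain random sampling would give.

// Illustrative sketch only: lift = fraud rate in the reviewed subset / overall fraud rate.
public class LiftExample {
    public static void main(String[] args) {
        long totalTransactions = 1_000_000;
        long totalFrauds = 5_000;          // 0.5% of all transactions are fraudulent
        long reviewed = 10_000;            // transactions selected by the model for checking
        long fraudsFoundInReviewed = 500;  // 5% of the reviewed transactions are fraudulent

        double overallFraudRate = (double) totalFrauds / totalTransactions;   // 0.005
        double reviewedFraudRate = (double) fraudsFoundInReviewed / reviewed; // 0.05
        double lift = reviewedFraudRate / overallFraudRate;                   // 10.0

        System.out.printf("Overall fraud rate: %.4f, reviewed fraud rate: %.4f, lift: %.1f%n",
                overallFraudRate, reviewedFraudRate, lift);
    }
}

A lift of 10 simply means that a supervisor checking the model-selected transactions finds ten times more frauds than random sampling of the same size would reveal.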

    Read the article

  • CodePlex Daily Summary for Saturday, November 19, 2011

    CodePlex Daily Summary for Saturday, November 19, 2011Popular ReleasesWPF Converters: WPF Converters V1.2.0.0: support for enumerations, value types, and reference types in the expression converter's equality operators the expression converter now handles DependencyProperty.UnsetValue as argument values correctly (#4062) StyleCop conformance (more or less)Json.NET: Json.NET 4.0 Release 4: Change - JsonTextReader.Culture is now CultureInfo.InvariantCulture by default Change - KeyValurPairConverter no longer cares about the order of the key and value properties Change - Time zone conversions now use new TimeZoneInfo instead of TimeZone Fix - Fixed boolean values sometimes being capitalized when converting to XML Fix - Fixed error when deserializing ConcurrentDictionary Fix - Fixed serializing some Uris returning the incorrect value Fix - Fixed occasional error when...Media Companion: MC 3.423b Weekly: Ensure .NET 4.0 Full Framework is installed. (Available from http://www.microsoft.com/download/en/details.aspx?id=17718) Ensure the NFO ID fix is applied when transitioning from versions prior to 3.416b. (Details here) Replaced 'Rebuild' with 'Refresh' throughout entire code. Rebuild will now be known as Refresh. mc_com.exe has been fully updated TV Show Resolutions... Resolved issue #206 - having to hit save twice when updating runtime manually Shrunk cache size and lowered loading times f...Windows Azure Toolkit for Social Games: 1.1.1: Version 1.1.1: Updated to use Windows Azure SDK and Tools Version 1.6 Version 1.1.1: Performance improvements Separated Social Gaming Toolkit from Tankster Sample Added Tic-Tac-Toe and Four in a row game Simplified game API Simplified JavaScript Libraries Improved documentationASP.net Awesome Samples (Web-Forms): 1.0 samples: Full Demo VS2008 Very Simple Demo VS2010 (demos for the ASP.net Awesome jQuery Ajax Controls)SharpMap - Geospatial Application Framework for the CLR: SharpMap-0.9-AnyCPU-Trunk-2011.11.17: This is a build of SharpMap from the 0.9 development trunk as per 2011-11-17 For most applications the AnyCPU release is the recommended, but in case you need an x86 build that is included to. For some dataproviders (GDAL/OGR, SqLite, PostGis) you need to also referense the SharpMap.Extensions assembly For SqlServer Spatial you need to reference the SharpMap.SqlServerSpatial assemblySQL Monitor - tracking sql server activities: SQLMon 4.1 alpha 5: 1. added basic schema support 2. added server instance name and process id 3. fixed problem with object search index out of range 4. improved version comparison with previous/next difference navigation 5. remeber main window spliter and object explorer spliter positionAJAX Control Toolkit: November 2011 Release: AJAX Control Toolkit Release Notes - November 2011 Release Version 51116November 2011 release of the AJAX Control Toolkit. AJAX Control Toolkit .NET 4 - Binary – AJAX Control Toolkit for .NET 4 and sample site (Recommended). AJAX Control Toolkit .NET 3.5 - Binary – AJAX Control Toolkit for .NET 3.5 and sample site (Recommended). Notes: - The current version of the AJAX Control Toolkit is not compatible with ASP.NET 2.0. The latest version that is compatible with ASP.NET 2.0 can be found h...MVC Controls Toolkit: Mvc Controls Toolkit 1.5.5: Added: Now the DateRanteAttribute accepts complex expressions containing "Now" and "Today" as static minimum and maximum. Menu, MenuFor helpers capable of handling a "currently selected element". 
The developer can choose between using a standard nested menu based on a standard SimpleMenuItem class or specifying an item template based on a custom class. Added also helpers to build the tree structure containing all data items the menu takes infos from. Improved the pager. Now the developer ...SharpCompress - a fully native C# library for RAR, 7Zip, Zip, Tar, GZip, BZip2: SharpCompress 0.7: Reworked API to be more consistent. See Supported formats table. Added some more helper methods - e.g. OpenEntryStream (RarArchive/RarReader does not support this) Fixed up testsCODE Framework: 4.0.11115.0: Added support for partial views in the WPF framework, as well as a new helper feature that allows hooking commands/actions to all WPF events.Silverlight Toolkit: Windows Phone Toolkit - Nov 2011 (7.1 SDK): This release is coming soon! What's new ListPicker once again works in a ScrollViewer LongListSelector bug fixes around OutOfRange exceptions, wrong ordering of items, grouping issues, and scrolling events. ItemTuple is now refactored to be the public type LongListSelectorItem to provide users better access to the values in selection changed handlers. PerformanceProgressBar binding fix for IsIndeterminate (item 9767 and others) There is no longer a GestureListener dependency with the C...DotNetNuke® Community Edition: 06.01.01: Major Highlights Fixed problem with the core skin object rendering CSS above the other framework inserted files, which caused problems when using core style skin objects Fixed issue with iFrames getting removed when content is saved Fixed issue with the HTML module removing styling and scripts from the content Fixed issue with inserting the link to jquery after the header of the page Security Fixesnone Updated Modules/Providers ModulesHTML version 6.1.0 ProvidersnoneDotNetNuke Performance Settings: 01.00.00: First release of DotNetNuke SQL update queries to set the DNN installation for optimimal performance. Please review and rate this release... (stars are welcome)SCCM Client Actions Tool: SCCM Client Actions Tool v0.8: SCCM Client Actions Tool v0.8 is currently the latest version. It comes with following changes since last version: Added "Wake On LAN" action. WOL.EXE is now included. Added new action "Get all active advertisements" to list all machine based advertisements on remote computers. Added new action "Get all active user advertisements" to list all user based advertisements for logged on users on remote computers. Added config.ini setting "enablePingTest" to control whether ping test is ru...C.B.R. : Comic Book Reader: CBR 0.3: New featuresAdd magnifier size and scale New file info view in the backstage Add dynamic properties on book and settings Sorting and grouping in the explorer with new design Rework on conversion : Images, PDF, Cbr/rar, Cbz/zip, Xps to the destination formats Images, Cbz and XPS ImprovmentsSuppress MainViewModel and ExplorerViewModel dependencies Add view notifications and Messages from MVVM Light for ViewModel=>View notifications Make thread better on open catalog, no more ihm freeze, less t...Desktop Google Reader: 1.4.2: This release remove the like and the broadcast buttons as Google Reader stopped supporting them (no, we don't like this decission...) 
Additionally and to have at least a small plus: the login window now automaitcally logs you in if you stored username and passwort (no more extra click needed) Finally added WebKit .NET to the about window and removed Awesomium MD5-Hash: 5fccf25a2fb4fecc1dc77ebabc8d3897 SHA-Hash: d44ff788b123bd33596ad1a75f3b9fa74a862fdbFluent Validation for .NET: 3.2: Changes since 3.1: Fixed issue #7084 (NotEmptyValidator does not work with EntityCollection<T>) Fixed issue #7087 (AbstractValidator.Custom ignores RuleSets and always runs) Removed support for WP7 for now as it doesn't support co/contravariance without crashing.Rawr: Rawr 4.2.7: This is the Downloadable WPF version of Rawr!For web-based version see http://elitistjerks.com/rawr.php You can find the version notes at: http://rawr.codeplex.com/wikipage?title=VersionNotes Rawr AddonWe now have a Rawr Official Addon for in-game exporting and importing of character data hosted on Curse. The Addon does not perform calculations like Rawr, it simply shows your exported Rawr data in wow tooltips and lets you export your character to Rawr (including bag and bank items) like Char...VidCoder: 1.2.2: Updated Handbrake core to svn 4344. Fixed the 6-channel discrete mixdown option not appearing for AAC encoders. Added handling for possible exceptions when copying to the clipboard, added retries and message when it fails. Fixed issue with audio bitrate UI not appearing sometimes when switching audio encoders. Added extra checks to protect against reported crashes. Added code to upgrade encoding profiles on old queued items.New ProjectsAnimateX: silverlight animate basic library C#code build storyboard object basic object build animate engine for silverlight 4.0Api diendandaihoc.vn: api news diendandaihoc.vnAviaCode Interview Project: This is an extension to the Apunta Notas project. It was created for interview purposes.BitTorrentSharp: BitTorrent Sharp is an open source bit torrent protocol and server/client implementation.BizTalk Archiving - SQL and File: BizTalk Message Archiving - it's a pipeline component that can be used for archiving incoming/outgoing message from any adapters. It provides an option to save the message to either file (local, shared, network) or in SQL Server. bpatch: Simple byte patch utilityCourseManager: Course Manager is an Application for course,instructor and students control.dise operating system: the name is get from Divine State. Our goal is to create a real-time, efficient, stable operating system.Electrum: Simple Windows Phone 7 toolkitendrocode: Lorem ipsum dolor sit amet, consectetuer adipiscing elit, sed diam nonummy nibh euismod tincidunt ut laoreet dolore magna aliquam erat volutpat. Ut wisi enim ad minim veniam, quis nostrud exerci tation ullamcorper suscipit lobortis nisl ut aliquip ex ea commodo consequat. Duis autem vel eum iriure dolor in hendrerit in vulputate velit esse molestie consequat, vel illum dolore eu feugiat nulla facilisis at vero eros et accumsan et iusto odio dignissim qui blandit praesent luptatum zzril deleni...FinanceTool: FinanceTool makes a visual representation of your spending during a period of time. The Winforms application works on export files from the Dutch ING bank and the SNS bank. 
It's developed in C# based on .NET 4.0.Kinect Spots: Kinect Spots is a small little app that displays a stylized bubble image based on the input from the Microsoft Kinect's camera.MEF practises: Project Nebula: ASP.NET MVC 3 with Full MEF architectureNesoi 2D game engine: This is a 2D game engine witten for XNA to to use a component based architecture.nullllllllllllll: ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~PolytechMango: Mobile application for DI courses Polytech'Tours developed in Windows Phone 7.5 platform Authors : ARKHIS - AZIRAR - FOFANA - JEBARI PowerPartsforSharePoint.CrossSiteViewer: CrossSite Viewer WebPart allows you to view SharePoint list or Document library from other sites without coding. Simply drag and drop the web part onto the page, select the source site collection , web , view and you are done. www.PowerPartsForSharePoint.ComQuix Utilities for SharePoint: Over the past decade of working with SharePoint, I've had to build many quick utilities for one purpose or another. It thus came to pass that it made sense to unify all these utilities together into a single project that I could share with my fellow geeks. The Quix Utilities for SharePoint tool set is a collection of utilities that perform a wide variety of tasks in SharePoint and SharePoint servers.ReactiveMVVM: ReactiveMVVM is MVVM patter, it impovert with Microsoft Reactive Extensions. Can used in silverlight, WPF and WP7. ReactiveMVVM makes it easier for you to develop multithreading program in Silverlight, WPF and WP7 project. To be good work with it, you need know Rx framework. Regatta: Regatta is a Window Phone application for sailboat racing. Its mission is to make it easier for people to participate in sailboat racing. The underlying technology is C# and WP7. Anyone with an interest in sailing and/or Windows Phone technology is welcome to contribute.SF for Windows Phone: This is an C# app for Windows Phone 7 that utilizes API's from www.sf.se to display new movies, closest bio etc.SimManning: SimManning is a C#.NET library containing a discrete-event simulation engine dedicated to manning / staffing, especially for domains involving a succession of phases. An example of basic domain is provided, but the idea is for users to implement their own domain. This originates from a research group of DTU, the Technical University of Denmark.Simple Authentication Toolkit: Simple Authentication ToolkitSparkline Generator: Generate symbols sparklineSSIS Package Configuration Editor: This utility identifies package configuration paths that are not valid and enables you to correct the paths without having to open the package in Business Intelligence Development Studio (BIDS).StreamingSoundtracks.com: My first Windows Phone 7.1 application. 1. Now Playing 2. View Queue 3. View Chat Message 4. Send Chat MessageTravellingEntrepreneur: A simple application for finding Green Government Opportunities for Small Businesses. The application based on your current location retrieves all the government programs for the state of location and then allows you to share your favorite program on Facebook. (US only, WP7).wp7msu: For Windows Phone Programming in MSU.

    Read the article

  • CodePlex Daily Summary for Wednesday, April 11, 2012

    Popular Releases
    - Command-Line Database Builder 1.0.2012.0411: The utility now supports arbitrary key:value pairs on the command line for performing replacements in the .pp.sql files. Removed the use of '-' to prefix key:value arguments. AspNetAssemblyPath is no longer a known key:value pair but can still be used, because the tool now supports arbitrary key:value pairs for replacements. This was provided previously to support setting up ASP.NET Membership and Roles in a database. A .pp.sql file has been added to the Examples archive that demonstrates this usage.
    - Supporting Guidance and Whitepapers, v1 - Team Foundation Service Whitepapers: Welcome to the BETA release of the Team Foundation Service Whitepapers preview. As this is a BETA release and the quality bar for the final release has not been achieved, we value your candid feedback and recommend that you do not use or deploy these BETA artifacts in a production environment. Quality-bar details: documentation has been reviewed by Visual Studio ALM Rangers, documentation has been through an independent technical review, and all critical bugs have been resolved. Known issue...
    - Scrum Task Board Card Creator, TaskCardCreator 3.2.0.0: What's new: a new report template, Microsoft Visual Studio Scrum 1.0 Detailed Report. Supported templates: Microsoft Visual Studio Scrum 1.0, MSF for Agile Software Development v5.0.
    - Microsoft .NET Gadgeteer, .NET Gadgeteer Core 2.42.550 (BETA): Release notes, version 2.42.550, 11 April 2012. BETA VERSION WARNING: This is a beta version! Please note: API changes may be made before the next version (2.42.600); the designer will not show modules/mainboards for NETMF 4.2 until you get upgraded libraries from the module/mainboard vendors; install NETMF 4.2 (see link below) to use the new features of this release. That warning aside, this version should continue to sup...
    - DISM GUI 3.1.1: Fixes: fixed a bug in the Delete Driver function; the Index field is not auto populated with the number 1.
    - LINQ to Twitter Beta v2.0.24: Supports .NET 3.5, .NET 4.0, Silverlight 4.0, Windows Phone 7.1, and the Client Profile. 100% Twitter API coverage. Also available via NuGet.
    - Kendo UI ASP.NET Sample Applications (2012-04-11): Sample applications demonstrating the use of Kendo UI in ASP.NET applications.
    - Json.NET 4.5 Release 2: New features: added support for the SerializableAttribute and serializing a type's internal fields; added MaxDepth to JsonReader/JsonSerializer/JsonSerializerSettings; added support for ignoring properties with the NonSerializableAttribute. Fixes: fixed deserializing a null string throwing a NullReferenceException; fixed JsonTextReader reading from a slow stream; fixed CultureInfo not being overridden on JsonSerializerProxy; fixed full trust ... (a minimal sketch of the new MaxDepth setting follows this summary).
    - SCCM Client Actions Tool v1.12: The latest version, with the following changes since the last version: improved WMI date conversion to be aware of timezone differences and DST; fixed the new-version check. The tool is downloadable as a ZIP file that contains four files: ClientActionsTool.hta – the tool itself; Cmdkey.exe – a command-line tool for managing cached credentials, needed for the alternate-credentials feature when running the HTA on Windows XP. Cmdkey.exe is natively availab...
    - Dual Browsing, Dual Browser: Please note the following: the address bar is temporarily set up to only accept http:// .com addresses. Just type in the name of the website excluding http://, www., and .com (e.g. for www.youtube.com just type youtube, then click OK). The page splitter can be grabbed by holding down your left mouse button and moving left or right. By right-clicking on the page background, you can choose to refresh, go back a page and so on. Demo video: http://youtu.be/L7NTFVM3JUY
    - Multiwfn 2.3.3: Multiwfn 2.3.3
    - Liberty v3.2.0.1, released 9th April 2012: Change log: Fixed (Reach): fixed a bug where the object editor did not work on non-English operating systems.
    - Path Copy Copy 10.1: This release addresses the following work items: 11357, 11358, 11359. This release is a recommended upgrade, especially for users who didn't install the 10.0.1 version.
    - ExtAspNet v3.1.3: ExtAspNet - ?? ExtJS ??? ASP.NET 2.0 ???,????? AJAX ?????????? ExtAspNet ????? ExtJS ??? ASP.NET 2.0 ???,????? AJAX ??????????。 ExtAspNet ??????? JavaScript,?? CSS,?? UpdatePanel,?? ViewState,?? WebServices ???????。 ??????: IE 7.0, Firefox 3.6, Chrome 3.0, Opera 10.5, Safari 3.0+ ????: Apache License 2.0 (Apache) ??: http://extasp.net/ ??: http://bbs.extasp.net/ ??: http://extaspnet.codeplex.com/ ??: http://sanshi.cnblogs.com/ ????: +2012-04-08 v3.1.3 -??Language="zh_TW"?JS???BUG(??)。 +?D...
    - Coding4Fun Tools, Coding4Fun.Phone.Toolkit v1.5.5: New controls: ChatBubble, ChatBubbleTextBox, OpacityToggleButton. New stuff: TimeSpan languages added (RU, SK, CS); the physics math from TimeSpanPicker is now exposed; Image Stretch now on buttons. Bug fixes: layout fix so RoundToggleButton and RoundButton are exactly the same; fix for ColorPicker when set via code-behind; ToastPrompt bug fix with OnNavigatedTo; Toast now adjusts its layout if the SIP is up; fixed some issues with Expression Blend support.
    - Harness - Internet Explorer Automation, Harness 2.0.3: Supports operations on frameset, frame and iframe. Added commands: SwitchFrame, GetUrl, GoBack, GoForward, Refresh, SetTimeout, GetTimeout. Renamed commands: GetActiveWindow to GetActiveBrowser, SetActiveWindow to SetActiveBrowser, FindWindowAll to FindBrowser, NewWindow to NewBrowser, GetMajorVersion to GetVersion.
    - Better Explorer 2.0.0.861 Alpha: Fixed the new-folder button not working well in some situations; removed some unnecessary code, such as subclassing that is no longer needed; added an option to make Better Explorer the default (at least for WIN+E operations); added an option to let file operation replacements (like TeraCopy) work with Better Explorer; added some basic usability to the "Share" button; other fixes.
    - LightFarsiDictionary - ??????? ??? ?????/???????: LightFarsiDictionary - v1
    - WPF Application Framework (WAF) 2.5.0.3 (Milestone 3): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements: .NET Framework 4.0 (the package contains a solution file for Visual Studio 2010); the unit test projects require Visual Studio 2010 Professional. Changelog legend: [B] breaking change; [O] member marked as obsolete. [O] WAF: marked the StringBuilderExtensions class as obsolete because the AppendInNewLine method can be replaced with string.Jo...
    - ClosedXML - The easy way to OpenXML, ClosedXML 0.65.2: Aside from many bug fixes we now have conditional formatting; the conditional formatting was sponsored by http://www.bewing.nl (big thanks). New in v0.65.1: fixed an issue when loading conditional formatting with default values for icon sets. New in v0.65.2: fixed an issue loading conditional formatting; improved insert performance.

    New Projects
    - 0x10c Tools: Tools for the 0x10c CPU: assembler, emulator and (maybe in the future) a small compiler. Just for fun and exercise.
    - AzureWiki: AzureWiki is a wiki developed using the Windows Azure platform, similar to dotnetwiki.
    - Command-Line Database Builder: A command-line tool for interacting with a DBMS command-line interface (e.g., sqlcmd.exe) to execute a sequential list of SQL scripts against the DBMS. The tool allows for expression replacement in the SQL scripts during execution.
    - copydata: The CopyData command-line utility enables you to easily transfer sets of data from an Oracle or SQL Server data source directly to a target SQL Server database. It is developed in C#.
    - DinoDoc: The little friendly batch-upload tool designed for SharePoint Server and Windows SharePoint Services, enabling you to easily upload multiple files and folders with a single click! For more information about DinoDoc and about SharePoint development: http://spdino.wordpress.com
    - Discovery House: This is a project demonstrating a green home.
    - DocShare: DocShare illustrates the CQRS pattern on Windows Azure and also uses MVC4 Web API. DocShare uses two web roles, one for queries (reads) and one for commands (writes). Each has a UI and a Web API service.
    - EnderTecLauncher: EnderTecLauncher
    - EntityFilter: This library provides a way to store filtering metadata and reassemble it into dynamic lambda expressions. It allows groups of filters to be created. Two implementations of IFilterRepository are in development: Database and XML. It's developed in C# for Entity Framework 4.1 and above.
    - Epi Info™ - Web Analysis & Visualization: Epi Info™ is a public-domain suite of software tools designed for the global community of public health practitioners and researchers. It provides easy data entry form and database construction, a customized data entry experience, and data analyses with epidemiologic statistics. Epi Info™ Web Analytics & Visualization is an open source project of the popular Epi Info™ suite of tools. The web product can be deployed as an intranet application and will provide analytical and visualization ...
    - fOrganiz: This application allows you to automatically organize your pictures by date into specific subdirectories.
    - forwork: forwork
    - Generic Language - Mobile & Telephony Technologies: Genlang Mobile and Telephony Technologies, a complete application development platform for all platforms: Windows Mobile, Windows Desktop, Web, Apple, Android, BlackBerry.
    - gindex: Graphs have become increasingly important in modelling complicated structures and schemaless data such as proteins, chemical compounds, and XML documents. Given a graph query, it is desirable to retrieve graphs quickly from a large database via graph-based indices.
    - Hijri Date SkinObject: Hijri date skin object for DotNetNuke; copy it to admin/skins and use it in your skin file.
    - HorseRaces: Exercise inspired by an example found in the book "Designing for Scalability with Microsoft Windows DNA" by Sten Sundblad and Per Sundblad.
    - HotelMS: HotelManageSystem
    - html5lmth: test
    - jRulee: The jRulee JavaScript toolkit library
    - KrishaTool: olo
    - LegSec: LegSec is a small command-line application for collating licence information based on that provided in NuGet packages.
    - Modwind Domain Info: The program determines the country of origin for top-level domains and the purpose for international ones.
    - MySCM Outlook Addin: This is another tool for SCM/TFS teams. Use this add-in to create, update and refresh TFS work items from your Outlook emails. Not a substitution, but this little tool can help you track your various work in TFS while educating and establishing the processes and policies.
    - neptouni: This software can be used to convert Nepali TTF text to Unicode characters.
    - Northwind SSDT: An SSDT project for the Northwind database. This will enable you to deploy Northwind wherever you like. Note that, to allow for hosting in a SQL Azure database that is used to host objects for other applications, all the Northwind objects have been moved into a schema called [Northwind].
    - Optional: Optional is a library to create options and commands from command-line arguments. It uses convention over configuration to get out of your way. Attributes can be used to set properties which differ from the convention.
    - pbdevnpro1: pbdevnpro1, no1
    - Projet LIF7 Snake: Projet LIF7 Snake
    - PurpleStoat: A modular, extensible Silverlight application shell using Prism, Unity and the Enterprise Library, written in C#. It includes WCF services which provide AuthZ and logging services to the shell; these are also available to the modules.
    - Sharepoint 2010 Weather WebPart using Azure Data Market Met Office Feed: SharePoint 2010 WebPart that displays a 5-day weather forecast for a given location. The weather data is retrieved from the Met Office feed hosted on the Windows Azure Data Market. This is a free data feed that provides weather data for the UK only.
    - Silverlight Layouts: Silverlight Layouts is a project for controls that behave as content placeholders with pre-defined GUI layouts for some common scenarios: frozen headers, frozen columns, circle layouts, etc.
    - Snom Phone .NET Library: A .NET automation library for the snom IP phones. Provides a simple class library to interact with your snom phone: press any key on the phone, dial numbers, answer or hang up a call, mute and un-mute, hold and un-hold a call, navigate through a routing phone system using dial tones, get events on incoming or outgoing calls as well as other events, and more...
    - Substrate Windows 8 XAML Framework: Framework for writing Windows 8 applications in XAML.
    - Tiger Converters: Tiger is a small language based on expressions, so it's perfect for writing the body of a WPF/SL converter.
    - Time manager by bozheville: Time manager by bozheville
    - Umbraco 501 on Windows Azure (with Dynamic Deploy): This project is configured to run Umbraco 5.0.1 on Windows Azure via the Dynamic Deploy platform. For more information on Dynamic Deploy visit http://www.dynamicdeploy.com. Dynamic Deploy is a cloud deployment platform from which you can deploy applications directly to cloud platforms (like Windows Azure).
    - UnitPrice: This is unit price
    - Webmedia: this is my webmedia project
    - Windows 8 Metro RSS Reader: An RSS reader Metro app for Windows 8 written in C# and XAML, based on the sample Grid template.
    - Windows Phone UPnP: The basics of a UPnP network stack for Windows Phone, originally based on a blog post. Written in C#; also requires the Async CTP. Includes device discovery via SSDP and method invocation.
    - WinRT XAML Toolkit: A set of controls, extensions and helper classes for Windows Runtime XAML applications.
    - WmiGuru: WmiGuru is a lightweight F# library for WMI operations such as getting instances, creating instances, and querying associated instances.
    - ????: ???? ??.net mvc3??。??jquery+html5????。
    - ?????: openwebsite
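    As an illustration of the MaxDepth setting mentioned in the Json.NET 4.5 Release 2 entry above: the C# sketch below is not taken from the release notes; the Node type and the sample JSON string are invented for the example, and it assumes only that the Newtonsoft.Json package is referenced. The idea is that MaxDepth on JsonSerializerSettings makes the reader stop when the input is nested more deeply than the configured limit.

        using System;
        using Newtonsoft.Json;

        // Hypothetical self-referencing type, used only for this illustration.
        class Node
        {
            public int Value { get; set; }
            public Node Child { get; set; }
        }

        static class MaxDepthDemo
        {
            static void Main()
            {
                // Limit how deeply nested the incoming JSON may be.
                var settings = new JsonSerializerSettings { MaxDepth = 2 };

                // Three levels of nested objects: deeper than the configured limit.
                string json = "{\"Value\":1,\"Child\":{\"Value\":2,\"Child\":{\"Value\":3,\"Child\":null}}}";

                try
                {
                    var node = JsonConvert.DeserializeObject<Node>(json, settings);
                    Console.WriteLine(node.Value);
                }
                catch (JsonException ex)
                {
                    // Json.NET throws while reading once the depth limit is exceeded.
                    Console.WriteLine(ex.Message);
                }
            }
        }

    A small depth limit like this is a cheap guard against maliciously deep JSON payloads; for trusted input the default is usually fine.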


  • CodePlex Daily Summary for Sunday, June 05, 2011

    Popular Releases
    - SizeOnDisk 1.0.8.1: Toolbar: show the total of the selected folder and hard drive usage of the entire disk. Fix: command binding does not work. Execution state. Toolbar state by selection. Progress task bar.
    - EnhSim 2.4.6 BETA: This release supports WoW patch 4.1 at level 85. To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed; this can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84. To use the GUI you must have the .NET 4.0 Framework installed; this can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992. Added in the proper...
    - patterns & practices: Project Silk, Community Drop 10 - June 3, 2011: Changes from the previous drop: many code changes (please see the readme.mht for details); new "Application Notifications" chapter; updated "Server-Side Implementation" chapter. Guidance chapters ready for review: the Word documents for the chapters are included with the source code in addition to the CHM to help you provide feedback; the PDF is provided as a separate download for your convenience. Installation overview: to install and run the reference implementation, you must perform the fol...
    - Claims Based Identity & Access Control Guide, Release Candidate: Highlights of this release: this is the release candidate drop of the new "Claims Identity Guide" edition. In this release you will find all code samples, including all ACS v2: ACS as a Federation Provider (showing authentication with LiveID, Google, etc.), ACS as an FP with multiple business partners, ACS and REST endpoints, and using a WP7 client with REST endpoints. All ACS-specific chapters. Two new chapters on SharePoint (SSO and Federation). All revised v1 chapters. We are now ...
    - Media Companion MC 3.404b Weekly: Extract the entire archive to a folder which has user access rights, e.g. desktop, documents, etc. Refer to the documentation on this site for the Installation & Setup Guide. Important! Due to an issue where the date added and the full genre information were not being read into the movie cache, it is highly recommended that you perform a Rebuild Movies when first running this latest version. This will read in the information from the nfo's and repopulate the cache used by MC during operation. Fi...
    - Terraria Map Generator, TerrariaMapTool 1.0.0.4 Beta: 1) Fixed the generated map.html file so that the file:/// is included in the base path. 2) Added the ability to use parallelization during generation; this will cause the program to use as many threads as there are physical cores. 3) Fixed some background overdraw.
    - DotRas v1.2 (Version 1.2.4168): This release includes compiled (and signed) versions of the binaries, PDBs, CHM help documentation, along with both C# and VB.NET examples. Please don't forget to rate the release! If you find a bug, please open a work item and give as much description as possible. Stack traces, which operating system(s) you're targeting, and build type are also very helpful for bug hunting. If you find something you believe to be a bug but are not sure, create a new discussion on the discussions board. Thank...
    - Caliburn Micro: WPF, Silverlight and WP7 made easy, Caliburn.Micro v1.1 RTW: Download contents: debug and release assemblies, samples, Changes.txt, License.txt. Release highlights for WP7: a new tombstoning API based on ideas from Fluent NHibernate; a new Launcher/Chooser API; strongly typed navigation; SimpleContainer included; the full phone lifecycle is made easy to work with, i.e. we figure out whether your app is actually resurrecting or just continuing for you. For WPF: support for the Client Profile; better support for WinForms integration. All platforms: a power...
    - VidCoder 0.9.1: Added color coding to the Log window: errors are highlighted in red, HandBrake logs are in black and VidCoder logs are in dark blue. Moved the enqueue button to the right with the other control buttons. Added logic to report failures when errors are logged during the encode or when the encode finishes prematurely. Added a Copy button to the Log window. Adjusted the audio track selection box to always show the full track name. Changed the encode job progress bar to also be colored yellow when the enco...
    - AutoLoL v2.0.1: Fixed a small bug in Auto Login; fixed the updater.
    - EPPlus - Create advanced Excel 2007 spreadsheets on the server, EPPlus 2.9.0.1: This version has been updated to .NET Framework 3.5. New features: data validation; PivotTables (basic functionality...) with support for worksheet data sources, column, row, page and data fields, and date and numeric grouping; built-in styles; ...and more. And some minor new features: Ranges: Text property (get the formatted value); AutofitColumns method to set the column width from the content of the range; LoadFromCollection metho...
    - jQuery ASP.Net MVC Controls, Version 1.4.0.0: Version 1.4.0.0 contains the following additions: upgraded to MVC 3.0; upgraded to jQuery 1.6.1 (though the project supports all jQuery versions from 1.4.x onwards); upgraded to jqGrid 3.8; better Razor view-engine support; better pager support, including support for custom pagers; added jqGrid toolbar button support; search module refactored, with full support for multiple filters and ordering; and code cleanup, bug fixes and better controller configuration support.
    - Nearforums - ASP.NET MVC forum engine, Nearforums v6.0: Version 6.0 of Nearforums, the ASP.NET MVC forum engine, containing new features: authentication using the Membership Provider for SQL Server and MySQL; spam prevention (flood control); moderation (flag messages); content management (create pages such as about us/contact/texts through web administration); allow Nearforums to run as an IIS sub-application; migrated Facebook Connect to OAuth 2.0. Visit the project roadmap for more details.
    - NetOffice - The easiest way to use Office in .NET, NetOffice Release 0.8b: Changes: fix critical issue 15922 (AccessViolationException) once and for all; the update is strongly recommended. Known issues: some add-in ribbon examples have a COM Register function with a missing codebase entry (win32 registry); the problem only affects the C# examples and is fixed in the latest source code. NetOffice\ReleaseTags\NetOffice Release 0.8.rar includes runtime binaries and source code for .NET Framework v2.0, v3.0, v3.5 and v4.0, and tutorials in C# and VB.Net...
    - Serviio for Windows Home Server, Beta Release 0.5.2.0: Ready for widespread beta. Synchronized the build number to the Serviio version to avoid confusion.
    - AcDown????? - Anime&Comic Downloader, AcDown????? v3.0 Beta4: ??AcDown?????????????,??????????????,????、????。?????Acfun????? ????32??64? Windows XP/Vista/7 ????????????? ??:????????Windows XP???,?????????.NET Framework 2.0???(x86)?.NET Framework 2.0???(x64),?????"?????????"??? ??v3.0 Beta4 2011-5-31?? ???Bilibili.us????? ???? ?? ???"????" ???Bilibili.us??? ??????? ?? ??????? ?? ???????? ?? ?? ???Bilibili.us?????(??????????????????) ??????(6.cn)?????(????) ?? ?????Acfun?????????? ?????????????? ???QQ???????? ????????????Discussion...
    - CodeCopy Auto Code Converter, Code Copy v0.1: Full add-in, setup project source code and setup file.
    - Wordpress Theme 'Windows Metro' v1.00.0017: This is a preview version from the development phase.
    - TerrariViewer v2.4.1: Added a Piggy Bank editor and fixed some minor bugs.
    - Kooboo CMS 3.02: Updated Kooboo_CMS.zip at 2011-06-02 11:44 GMT: 1. Fixed an issue when adding data rules on a page. 2. Fixed a caching issue for higher performance. Updated Kooboo_CMS.zip at 2011-06-01 10:00 GMT: 1. Fixed the published package throwing a compile error under WebSite mode. 2. Fixed ContentHelper.NewMediaFolderObject returning a TextFolder. 3. Shortened the names of the ContentHelper API: NewMediaFolderObject => MediaFolder, NewTextFolderObject => TextFolder, NewSchemaObject => Schema. Also update the C...

    New Projects
    - buddyAgent: buddyAgent is a classified portal.
    - DataReaderMapping: Uses LINQ expressions to make mapping a SqlDataReader to a DTO model easy.
    - Foreign eXchange Center: Foreign eXchange Center
    - GoogleAuthCLONE: Use this like the Google Authenticator you may have on your Android device to interact with Google's two-factor authentication scheme. This allows you to log into Google without having to whip out your phone. Stores account information securely!
    - Grammar and Spell Checking Plugin for Windows Live Writer: This project is a spelling and grammar checker plugin for Windows Live Writer. It uses the "After the Deadline" web service to provide advanced grammar checking abilities.
    - GUNDEMAS: Gundemas.com
    - InstantMods: Lol
    - Lig4 => Análise e Desenvolvimento de Sistemas: Beta test of App_SERVER.
    - Light Time Sheet: Silverlight time sheet.
    - Mort8088s XmlHelper: XmlHelper makes it easier for developers to create and read data in XML documents by condensing common tasks into a simple static helper class. XmlHelper is developed in C#. Feel free to suggest new methods or better ways to achieve the existing methods.
    - NeuroForecast: This project is designed to study neural network technologies using the example of predicting currency quotes.
    - Orchard Forums: A forums module.
    - Rules Engine: Rules Engine is a C# project that makes it easier for developers to define business rules on domain objects without coupling the domain object to the business rule. The rules engine supports cross-field validation and conditional validation. Rules are interface-based and easily extensible. Rules can be added using a fluent interface.
    - Super Byte: Super Byte makes it easier for everyone to use your computer. You'll no longer have to struggle. We are always looking for new developers. It's developed in C#. Need a fast and reliable "operating system"? Well, SuperByte can give that to you. We have spent many hours of work to bring this project to you. Enjoy!
    - Taygeta - The video shadering library: This library allows rendering YUV and RGB pixel formats using the Direct3D 9 runtime. You can also apply shaders and add text, bitmap and line overlays.
    - Test Driver Program: Testing driver program.
    - TowerDefense: ????,??ogre??
    - YAPE: Yet Another Pacman Emulator.

