Search Results

Search found 7281 results on 292 pages for 'quality attributes'.

Page 3 of 292

  • Texture quality issues with libGDX

    - by user1876708
    I have drawn several vector objects and characters (in Adobe Illustrator) for my game on Android. They are all scalable to any size without any quality loss (of course, it's vector ^^). I tried to simulate my gameboard directly in Illustrator just before setting my assets up in libGDX to implement them in my game. I set all the objects at the right size, so that they fit perfectly on the XHDPI device I am running my test on. As you can see it works great (for me at least ^^), and the PNG quality is good, as expected! So I exported all my PNGs at this size, set up my assets in libGDX and built my game APK. And here is a screenshot of my gameboard (don't pay attention to the differences in placement and objects, just compare the objects present on both screenshots). As you can see, I have a loss of PNG quality in the game. It can be seen clearly on the hedgehog PNG, but also (though not as obviously) on the mushroom (look at the outline) and the hole PNG. If you really pay attention, on every object you can see pixels that are not visible on my first screenshot, and I just can't figure out why this is happening. If you have any ideas, you are very welcome! Thanks. PS: You can compare the two gameboards more clearly via these two links (look at them at 100%, displayed at high resolution): Good quality link, from Illustrator; Poor quality link, from the game. Second phase of tests: we display an object (the hedgehog) on our main menu screen to see how it looks. The thing is that it looks like it is supposed to, which means high quality with no visible pixels. The hedgehog PNG is coming from an atlas: layer.addActor(hedgehog); No loss of quality with this method. So we think the problem is coming from the method we are using to display it on our gameboard: blocks[9][3] = new Block(TextureUtils.hedgehog, new Vector2(9, 3)); The block gets its size from the vector we associate with it, but we have a loss of quality with this method.

    Read the article

  • In which fields does the quality of the software product matter as much as the completion time?

    - by Nav
    Someone told me that if a software product meets the client's expectations, it is of good quality. But I've worked with interaction designers (the same kind of people who made Gmail's interface and usability so cool!), and I've loved working with them because even though they came up with hundreds of changes in requirements and emphasised many subtle details, when the software was complete I could look at the product and say WOW! At the place I currently work, the only thing that matters is completing the project on time. As long as it works and the client says it's OK, nobody bothers to improve it. I'm not talking about gold-plating, but I believe that for a programmer to enjoy his (well, maybe her too ;) ) job, they should be able to proudly say "Hey, I made that software", and that comes only when the product is of good quality. Apart from your opinions on this, I'd also like to know in which fields (e.g. aerospace, finance, etc.) I could find companies (or you could mention the company name) where the quality of a product is as important as completing the project on time.

    Read the article

  • SQL SERVER – Configuring Interactive Cleansing Suggestion Min Score for Suggestions in Data Quality Services (DQS) – Sensitivity of Suggestion

    - by pinaldave
    Earlier I talked about the kind of questions I do not like to be asked. Today we will go over a question which I do like to be asked. One of my readers worked through the various steps in my earlier blog post Step by Step Guide to Beginning Data Quality Services in SQL Server 2012 – Introduction to DQS. While reading the blog post he noticed that Data Quality Services was not providing very helpful suggestions, and he wrote an email to me about it. Let us go over his email. "Pinal, I noticed in one of your images that DQS is not providing very helpful suggestions. First of all, DQS should be able to make intelligent guesses and make the necessary correction by itself. If it cannot do that, it should at least give us intelligent suggestions, but in the image included here I see the suggestions are not there either. Why is it so? Would you please tell me how to increase the number of suggestions? I understand this may not be the preferable solution in every case – it all depends on the business case. There are cases where high sensitivity is required and cases where it is not. I would like to seek your help here. –Sriram MD" This is indeed a great question. I see that Sriram understands that every system is different and every application has a different need, so I do not have to explain that most important concept. The question is about how to change the sensitivity of suggestions for correction in DQS. This option is available under the Configuration tab in the DQS client. Once you click on Configuration you will see the following screen. Click the General Settings tab and you will see the Interactive Cleansing section. Under this section, the first option is "Min score for suggestions". As this is set to 0.7, every suggestion that matches with a probability of 0.7 or higher is displayed under the suggestion tab. You can see in the following image that there are no suggestions, as the min score for suggestions is set to 0.7 and no record qualifies with that much confidence. Now let us change the value of Min Score for Suggestions to 0.5. The lower value makes DQS offer suggestions for values it matches with a confidence over 0.5. In our case the suggestions it provides are also accurate, but this may not be true for your sample. Every sample is different, so you should manually review suggestions before approving them. This is a simple blog post to demonstrate how to change the confidence value for the suggestions which Data Quality Services provides. Use this feature with care and always tune it according to your datasets and record diversity. Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Read the article

  • Migrating test cases & defects from Quality Center to TFS 2008/2010

    - by stackoverflowuser
    I am looking for a tool that can be used to migrate (or, even better, synchronize) test cases and bugs between: TFS 2008 and Quality Center 9.2 (or later); TFS 2010 and Quality Center 9.2 (or later). I am aware of the following tools: Test Case Migrator (Excel/MHT) Tool; TFS Bug Item Synchronizer 2.2 for Quality Center. Also, Shai Raiten mentions on his blog a QC to Team System 2010 migration tool that he has been working on and says it is done, but I could not find any link for downloading the tool. http://blogs.microsoft.co.il/blogs/shair/archive/2009/12/31/quality-center-migration-to-team-system-2010-done.aspx Before jumping into coding my own tool with the TFS SDK and QC components, I need some input from the Stack Overflow community.
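    If it does come down to writing a custom tool, the TFS side is reasonably approachable through the TFS 2010 client object model. The following C# fragment is only a rough sketch of creating a test-case work item; the collection URL, project name and title are placeholders, and it assumes the Microsoft.TeamFoundation client assemblies are referenced.

      using System;
      using Microsoft.TeamFoundation.Client;
      using Microsoft.TeamFoundation.WorkItemTracking.Client;

      class QcToTfsSketch
      {
          static void Main()
          {
              // Placeholder URL and project name - replace with real values.
              var collection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
                  new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
              var store = collection.GetService<WorkItemStore>();

              Project project = store.Projects["MyProject"];
              WorkItemType testCaseType = project.WorkItemTypes["Test Case"];

              // One work item would be created per test case read from Quality Center.
              var workItem = new WorkItem(testCaseType) { Title = "Login succeeds with valid credentials" };
              workItem.Save();
          }
      }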

    Read the article

  • Ensuring quality of your software and code

    - by Filip Ekberg
    When I write code I follow some guidelines to ensure that my code meets a certain standard, and like any other developer I try to ensure that my code and software are of good quality. Let's focus on the programming and not the understanding of the domain or any other pre-programming steps. These are the steps I live by: writing unit tests; make it fail (no code); make it work (working code); analysing abstractions; extracting methods; extracting interfaces; refactoring. In addition to the above, which is part of refactoring, I also refactor the code with good tools such as ReSharper, CodeRush or others. The question: what is the next step? Commenting the code is trivial and shouldn't even have to be mentioned, but keeping comments up to date and adding XML comments where needed is something that I try to do. All of the above helps me ensure that other developers can understand my code, that the code has some sort of quality, and that it follows naming standards. It does not, however, ensure any product quality. I am looking for tools for post-development quality assurance, such as profilers, and advice on how one would use these tools to increase product quality.
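    To make the "make it fail, then make it work" steps above concrete, here is a minimal C# sketch of the red-green part of that cycle using xUnit; the PriceCalculator class and its behaviour are invented for the example.

      using Xunit;

      // Red: this test is written first and fails while ApplyDiscount does not yet exist or is wrong.
      public class PriceCalculatorTests
      {
          [Fact]
          public void ApplyDiscount_TakesTenPercentOff()
          {
              var calculator = new PriceCalculator();
              Assert.Equal(90m, calculator.ApplyDiscount(100m, 0.10m));
          }
      }

      // Green: the simplest implementation that makes the test pass; it is then
      // cleaned up in the later steps (extract methods, extract interfaces, refactor).
      public class PriceCalculator
      {
          public decimal ApplyDiscount(decimal price, decimal rate)
          {
              return price - price * rate;
          }
      }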

    Read the article

  • Enterprise Data Quality - New and Improved on Oracle Technology Network

    - by Mala Narasimharajan
    Looking for Enterprise Data Quality technical and developer resources for your projects? Wondering where the best place is to find the latest documentation, downloads, and even code samples and libraries? Check out the new and improved Oracle Technology Network pages for Oracle Enterprise Data Quality. This section also features developer forums for EDQ and Master Data Management, so that you can connect with other technical professionals who have submitted questions or posted tips and tricks, and learn from them. Here are the links to bookmark: Oracle Technology Network website * NEW *; Installation Guide for Enterprise Data Quality Address Verification; Enterprise Data Quality Forum. For more information on Oracle's software offerings for data quality and master data management visit: http://www.oracle.com/us/products/applications/master-data-management/index.html http://www.oracle.com/us/products/middleware/data-integration/enterprise-data-quality/overview/index.html

    Read the article

  • New Feature in ODI 11.1.1.6: Enterprise Data Quality Integration

    - by Julien Testut
    Oracle Data Integrator 11.1.1.6.0 introduces a new Open Tool called EnterpriseDataQuality which allows ODI users to invoke an Oracle Enterprise Data Quality Job from a Package. This post will give you an overview of this new feature. Oracle Enterprise Data Quality (OEDQ) provides organizations with an integrated suite of data quality tools that offer an end-to-end solution to measure, improve, and manage the quality of data from any domain, including customer and product data. The addition of the EnterpriseDataQuality Open Tool extends the inline Data Quality capabilities of Oracle Data Integrator with Oracle Enterprise Data Quality's powerful data profiling, cleansing, matching, and monitoring capabilities. The EnterpriseDataQuality Open Tool can invoke any OEDQ Job stored in a Project. This Open Tool connects to an OEDQ server using a JMX (Java Management Extensions) interface. Once installed, this Open Tool will be found under Plugins in the Package Toolbox area. The EnterpriseDataQuality Open Tool takes a couple of parameters as inputs, such as the Enterprise Data Quality Job and Project names, the OEDQ hostname and JMX port, etc. With the EnterpriseDataQuality Open Tool, ODI customers can now incorporate their Oracle Enterprise Data Quality processes within their Data Integration workflows. You will find instructions about how to use the Enterprise Data Quality Open Tool in the Oracle Data Integrator documentation at: Using the EnterpriseDataQuality Open Tool. You can find an overview of all the new features introduced in ODI 11.1.1.6 in the following document: ODI 11.1.1.6 New Features Overview.

    Read the article

  • Photoshop: why does a low quality JPG saved at high quality increase the file size?

    - by Alex Angelico
    I have a low quality background image of about 85 KB. If I open the file in Photoshop and save it at 100% quality, the file size increases to 640 KB. This doesn't make sense: the image is already compressed, and the quality cannot be better than the source, so saving at 100% quality should produce the same file I opened. The problem is, I have this background and want to add a logo image over it, and then save it. But if I do so, I have to save it at 10% quality to keep the file size down, and the logo looks horrible. If I save it at 100% quality, the file size is huge... How can I achieve this?

    Read the article

  • can't delete some files or change their attributes on my FTP server

    - by Revolter
    I've installed a CMS on a shared host running Apache. When I tried to delete the root directory over FTP, some folders were left behind with a "Permission denied" error, and I can't change their attributes. The best explanation I've got is that the CMS installer placed the files and assigned their ownership to the Apache server user instead of my user name (I didn't know that could be done). I just haven't used the uninstaller because I've lost my admin password, so how do I delete those folders?

    Read the article

  • Is it a bad practice to add extra attributes to html elements?

    - by burak ozdogan
    Hi, sometimes I add an attribute to some of my controls, like: <a href id="myLlink" isClimber="True">Chris Sharma</a> I know it is not valid HTML, but it helps me in some cases. Is this considered a bad practice? A friend of mine says that it is OK in an intranet environment, but that on the internet it might not be seen as friendly by search engines. If it is not a good practice, what are the best practices? Thanks

    Read the article

  • Subterranean IL: Pseudo custom attributes

    - by Simon Cooper
    Custom attributes were designed to make the .NET framework extensible; if a .NET language needs to store additional metadata on an item that isn't expressible in IL, then an attribute could be applied to the IL item to represent this metadata. For instance, the C# compiler uses DecimalConstantAttribute and DateTimeConstantAttribute to represent compile-time decimal or datetime constants, which aren't allowed in pure IL, and FixedBufferAttribute to represent fixed struct fields. How attributes are compiled Within a .NET assembly are a series of tables containing all the metadata for items within the assembly; for instance, the TypeDef table stores metadata on all the types in the assembly, and MethodDef does the same for all the methods and constructors. Custom attribute information is stored in the CustomAttribute table, which has references to the IL item the attribute is applied to, the constructor used (which implies the type of attribute applied), and a binary blob representing the arguments and name/value pairs used in the attribute application. For example, the following C# class: [Obsolete("Please use MyClass2", true)] public class MyClass { // ... } corresponds to the following IL class definition: .class public MyClass { .custom instance void [mscorlib]System.ObsoleteAttribute::.ctor(string, bool) = { string('Please use MyClass2') bool(true) } // ... } and results in the following entry in the CustomAttribute table: TypeDef(MyClass) MemberRef(ObsoleteAttribute::.ctor(string, bool)) blob -> {string('Please use MyClass2') bool(true)} However, there are some attributes that don't compile in this way. Pseudo custom attributes Just like there are some concepts in a language that can't be represented in IL, there are some concepts in IL that can't be represented in a language. This is where pseudo custom attributes come into play. The most obvious of these is SerializableAttribute. Although it looks like an attribute, it doesn't compile to a CustomAttribute table entry; it instead sets the serializable bit directly within the TypeDef entry for the type. This flag is fully expressible within IL; this C#: [Serializable] public class MySerializableClass {} compiles to this IL: .class public serializable MySerializableClass {} For those interested, a full list of pseudo custom attributes is available here. For the rest of this post, I'll be concentrating on the ones that deal with P/Invoke. P/Invoke attributes P/Invoke is built right into the CLR at quite a deep level; there are 2 metadata tables within an assembly dedicated solely to p/invoke interop, and many more that affect it. Furthermore, all the attributes used to specify p/invoke methods in C# or VB have their own keywords and syntax within IL. For example, the following C# method declaration: [DllImport("mscorsn.dll", SetLastError = true)] [return: MarshalAs(UnmanagedType.U1)] private static extern bool StrongNameSignatureVerificationEx( [MarshalAs(UnmanagedType.LPWStr)] string wszFilePath, [MarshalAs(UnmanagedType.U1)] bool fForceVerification, [MarshalAs(UnmanagedType.U1)] ref bool pfWasVerified); compiles to the following IL definition: .method private static pinvokeimpl("mscorsn.dll" lasterr winapi) bool marshal(unsigned int8) StrongNameSignatureVerificationEx( string marshal(lpwstr) wszFilePath, bool marshal(unsigned int8) fForceVerification, bool& marshal(unsigned int8) pfWasVerified) cil managed preservesig {} As you can see, all the p/invoke and marshal properties are specified directly in IL, rather than using attributes.
And, rather than creating entries in CustomAttribute, a whole bunch of metadata is emitted to represent this information. This single method declaration results in the following metadata being output to the assembly: A MethodDef entry containing basic information on the method Four ParamDef entries for the 3 method parameters and return type An entry in ModuleRef to mscorsn.dll An entry in ImplMap linking ModuleRef and MethodDef, along with the name of the function to import and the pinvoke options (lasterr winapi) Four FieldMarshal entries containing the marshal information for each parameter. Phew! Applying attributes Most of the time, when you apply an attribute to an element, an entry in the CustomAttribute table will be created to represent that application. However, some attributes represent concepts in IL that aren't expressible in the language you're coding in, and can instead result in a single bit change (SerializableAttribute and NonSerializedAttribute), or many extra metadata table entries (the p/invoke attributes) being emitted to the output assembly.
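    A quick way to see the SerializableAttribute behaviour described above from managed code: the serializable flag ends up in the type's metadata flags, not in the CustomAttribute table. A small C# sketch, reusing the class from the example:

      using System;
      using System.Reflection;

      [Serializable]
      public class MySerializableClass { }

      class Program
      {
          static void Main()
          {
              Type t = typeof(MySerializableClass);

              // Both lines read the serializable bit set directly on the TypeDef entry;
              // no CustomAttribute table entry is involved.
              Console.WriteLine(t.IsSerializable);                                   // True
              Console.WriteLine((t.Attributes & TypeAttributes.Serializable) != 0);  // True
          }
      }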

    Read the article

  • Data Quality Through Data Governance

    Data Quality Governance. Data quality is very important to every organization; bad data costs an organization time, money, and resources, which could be prevented if proper governance were put in place. Data Governance Program Criteria: support from executive management and all business units; a data stewardship program; a cross-functional team of data stewards; a data governance committee; quality structured data. It should go without saying, but any successful project in today's business world must get buy-in from executive management and all stakeholders involved with the project. If management does not fully support a project because they do not see that it is in their and the company's best interest, then they will remove or eliminate funding, resources and the time allocated to work on the project. In essence they can render a project dead until it is officially killed by the business. In addition, buy-in from stakeholders is also very important because, if they do not support a project, they can cause delays and increased spending of time, money and resources. Data stewardship programs are administered by a data steward manager whose primary focus is to support, train and manage a cross-functional team of data stewards. A cross-functional team of data stewards pulled from various departments acts to ensure that all systems work toward achieving the organization's goals. Typically, data stewards are subject matter experts who act as mediators between their respective departments and IT. Data Quality Procedures. Data governance committees are composed of data stewards, upper management, IT leadership and various subject matter experts, depending on the company. The primary goal of this committee is to define strategic goals, coordinate activities, set data standards and offer data guidelines for the business. Data Quality Policies. In 1997, Claudia Imhoff defined a data steward's responsibilities as to approve business naming standards, develop consistent data definitions, determine data aliases, develop standard calculations and derivations, document the business rules of the corporation, monitor the quality of the data in the data warehouse, define security requirements, and so forth. She further explains that data stewards are responsible for creating and enforcing policies on issues including, but not limited to: resolving data integration issues; determining data security; documenting data definitions, calculations, summarizations, etc.; maintaining and updating business rules; analyzing and improving data quality.

    Read the article

  • Dynamic attributes with Rails and Mongoid

    - by japancheese
    Hello, I'm learning MongoDB through the Mongoid Ruby gem with Rails (Rails 3 beta 3), and I'm trying to come up with a way to create dynamic attributes on a model based on fields from another model, which I thought a schema-less database would be a good choice for. So for example, I'd have the models: class Account include Mongoid::Document field :name, :type => String field :token, :type => String field :info_needed, :type => Array embeds_many :members end class Member include Mongoid::Document embedded_in :account, :inverse_of => :members end I'm looking to take the "info_needed" attribute of the Account model and create dynamic attributes on the Member model based on what's inside. If account.info_needed was ["first_name", "last_name"], I'm trying to create a form that would save first_name and last_name attributes to the Member model. However, in practice, I just keep getting "undefined method first_name=" errors on the Member model when trying to do this. I know MongoDB can handle dynamic attributes per record, but how can I get Mongoid to do this without an undefined method error?

    Read the article

  • DataAnnotation attributes buddy class strangeness - ASP.NET MVC

    - by JK
    Given this POCO class that was automatically generated by an EntityFramework T4 template (it has not and cannot be manually edited in any way): public partial class Customer { [Required] [StringLength(20, ErrorMessage = "Customer Number - Please enter no more than 20 characters.")] [DisplayName("Customer Number")] public virtual string CustomerNumber { get;set; } [Required] [StringLength(10, ErrorMessage = "ACNumber - Please enter no more than 10 characters.")] [DisplayName("ACNumber")] public virtual string ACNumber{ get;set; } } Note that "ACNumber" is a badly named database field, so the autogenerator is unable to generate the correct display name and error message, which should be "Account Number". So we manually create this buddy class to add custom attributes that could not be automatically generated: [MetadataType(typeof(CustomerAnnotations))] public partial class Customer { } public class CustomerAnnotations { [NumberCode] // This line does not work public virtual string CustomerNumber { get;set; } [StringLength(10, ErrorMessage = "Account Number - Please enter no more than 10 characters.")] [DisplayName("Account Number")] public virtual string ACNumber { get;set; } } Where [NumberCode] is a simple regex based attribute that allows only digits and hyphens: [AttributeUsage(AttributeTargets.Property)] public class NumberCodeAttribute: RegularExpressionAttribute { private const string REGX = @"^[0-9-]+$"; public NumberCodeAttribute() : base(REGX) { } } NOW, when I load the page, the DisplayName attribute works correctly - it shows the display name from the buddy class, not the generated class. The StringLength attribute does not work correctly - it shows the error message from the generated class ("ACNumber" instead of "Account Number"). BUT the [NumberCode] attribute in the buddy class does not even get applied to the CustomerNumber property: foreach (ValidationAttribute attrib in prop.Attributes.OfType<ValidationAttribute>()) { // This collection correctly contains all the [Required], [StringLength] attributes // BUT does not contain the [NumberCode] attribute ApplyValidation(generator, attrib); } Why does the prop.Attributes.OfType<ValidationAttribute>() collection not contain the [NumberCode] attribute? NumberCode inherits RegularExpressionAttribute which inherits ValidationAttribute, so it should be there. If I manually move the [NumberCode] attribute to the autogenerated class, then it is included in the prop.Attributes.OfType<ValidationAttribute>() collection. So what I don't understand is why this particular attribute does not work when in the buddy class, when other attributes in the buddy class do work. And why this attribute works in the autogenerated class, but not in the buddy. Any ideas? Also, why does DisplayName get overridden by the buddy, when StringLength does not?
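    One thing worth checking (offered only as a sketch, not a confirmed fix for the exact problem above): code that reads attributes through TypeDescriptor/PropertyDescriptor, rather than through MVC's default metadata provider, only sees buddy-class attributes once the metadata type has been registered, for example in Application_Start:

      using System.ComponentModel;
      using System.ComponentModel.DataAnnotations;

      // After this registration, TypeDescriptor-based reflection over Customer
      // also returns the attributes declared on CustomerAnnotations.
      TypeDescriptor.AddProviderTransparent(
          new AssociatedMetadataTypeTypeDescriptionProvider(
              typeof(Customer), typeof(CustomerAnnotations)),
          typeof(Customer));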

    Read the article

  • Tips on a tool to measure code quality?

    - by Cristi Diaconescu
    I'm looking for a tool that can provide code quality metrics. For instance, it could report very long functions (spaghetti code), very complex classes (which could contain do-it-all code), ... While we're on the (subjective :-) ) subject of code quality, what other code metrics would you suggest? I'm targeting C#/.NET code, but I'm sure this could extend to most programming languages.

    Read the article

  • What quality standards to consider for software development process?

    - by Ron-Damon
    Hi, I'm looking for metrics/standards/norms to evaluate a given software development process. I'm NOT looking to evaluate the SOFTWARE itself (through SQuaRE and such); I'm trying to evaluate the software development PROCESS. So, my question is whether you could give me some pointers to find such a standard, considering that the "evaluation objectives" would be documentation quality, how good the customer relationship is, how effective the process is, etc. Very much like ISO 9000, and a bit like CMMI in a sense, but much more lightweight, concrete and process-oriented, not company-oriented. Please help; I'm trying to establish the advantages of our development process as formally as I can.

    Read the article

  • How do I deal with code of bad quality contributed by a third party?

    - by lindelof
    I've recently been promoted into managing one of our most important projects. Most of the code in this project has been written by a partner of ours, not by ourselves. The code in question is of very questionable quality. Code duplication, global variables, 6-page long functions, hungarian notation, you name it. And it's in C. I want to do something about this problem, but I have very little leverage on our partner, especially since the code, for all its problems, "just works, doesn't it?". To make things worse, we're now nearing the end of this project and must ship soon. Our partner has committed a certain number of person-hours to this project and will not put in more hours. I would very much appreciate any advice or pointers you could give me on how to deal with this situation.

    Read the article

  • Delete Nodes + attributes that match Xpath except specific attributes

    - by Ryan Ternier
    I'm trying to find the best (most efficient) way of doing this. I have a medium sized XML document. Depending on specific settings, certain portions of it need to be filtered out for security reasons. I'll be doing this in XSLT as it's configurable and no code should need changing. I've looked around, but I'm not having much luck. For example: I have the following XPath: //*[@root='2.16.840.1.113883.3.51.1.1.6.1'] which gives me all nodes with a root attribute equal to a specific OID. In these nodes I want to have all attributes except for a few (e.g. foo and bar) erased, and then have another attribute added (e.g. reason). I also need multiple XPath expressions that can be run to zero in on a specific node and clear its contents out in a similar fashion, with respect to nodes with specific attributes. I'm playing around with information from: XPath expression to select all XML child nodes except a specific list? and XSLT Remove Elements and/or Attributes by Name per XSL Parameters. Will update shortly when I have access to what I've done so far. Example: XML Before Transformation <root> <childNode> <innerChild root="2.16.840.1.113883.3.51.1.1.6.1" a="b" b="c" type="innerChildness"/> <innerChildSibling/> </childNode> <animals> <cat> <name>bob</name> </cat> </animals> <tree/> <water root="2.16.840.1.113883.3.51.1.1.6.1" z="zed" l="ell" type="liquidLIke"/> </root> After <root> <childNode> <innerChild root="2.16.840.1.113883.3.51.1.1.6.1" flavor="MSK"/> <!-- filtered --> <innerChildSibling/> </childNode> <animals> <cat flavor="MSK" /> <!-- cat was filtered --> </animals> <tree/> <water root="2.16.840.1.113883.3.51.1.1.6.1" flavor="MSK"/> <!-- filtered --> </root>

    Read the article

  • Putting a dollar value on code quality

    - by Chris Nelson
    As noted in another thread, "In most businesses, code quality is defined in dollars." So my company has an opportunity to acquire a large-ish C code base. Obviously, if the code quality is good, the code base is worth more than if it's poor. That is, if we can readily read, understand, and update the code, it's worth more to us than if it's a spaghetti-coded mess. Without being able to see the code ahead of time, we'd like to set some objective measure as an acceptance criteria like "If the XXX measure is below the price will be discounted YY%." What criteria can we or should we measure and what tool can we use to measure it?

    Read the article

  • Focus on Oracle Data Profiling and Data Quality 11g - 24/Feb/11

    - by Claudia Costa
    Thursday 24th February, 11am GMT. Oracle offers an integrated suite of Data Quality software architected to discover and correct today's data quality problems and establish a platform prepared for tomorrow's yet unknown data challenges. Oracle Data Profiling provides data investigation, discovery, and profiling in support of quality, migration, integration, stewardship, and governance initiatives. It includes a broad range of features that expand upon basic profiling, including automated monitoring, business-rule validation, and trend analysis. Oracle Data Quality for Data Integrator provides cleansing, standardization, matching, address validation, location enrichment, and linking functions for global customer data and operational business data. It ensures that data adheres to established standards that are adaptable to fit each organization's specific needs. Both single- and double-byte data are processed in local languages to provide a unique and centralized view of customers, products and services. During this briefing, Data Integration Solution Specialists will be providing a technical overview and a walkthrough. Agenda: Oracle Data Integration Strategy overview; a focus on Oracle Data Profiling and Oracle Data Quality for Data Integrator (Oracle Data Profiling, Oracle Data Quality for Data Integrator, live demo, Q&A). This FREE online LIVE eSeminar will be delivered over the Web and Conference Call. Registrations received less than 24 hours prior to start time may not receive confirmation to attend. To register click here. For any questions please contact [email protected]

    Read the article

  • [VB.Net] System.IO will copy files, but fails to update the destination's file attributes

    - by CFP
    Hello, I have a little VB.NET script that will copy a file, set its attributes to Normal, update the file time, and then set the attributes back to match those of the source file. If IO.File.Exists(Destination) Then IO.File.SetAttributes(Destination, IO.FileAttributes.Normal) IO.File.Copy(Source, Destination, True) IO.File.SetAttributes(Destination, IO.FileAttributes.Normal) IO.File.SetLastWriteTimeUtc(Destination, IO.File.GetLastWriteTimeUtc(Destination).AddHours(1)) IO.File.SetAttributes(Destination, IO.File.GetAttributes(Source)) However, I'm encountering a quite strange problem. On some configurations, IO.File.SetLastWriteTimeUtc triggers an UnauthorizedAccess error, although the IO.File.Copy instruction worked very well. I'm totally puzzled: I've checked, and the file attributes are set to 128 (i.e. Normal) successfully. The problem seems to be with SetLastWriteTimeUtc itself. But what is it? Any ideas? Thanks a lot!

    Read the article

  • Django models & Python class attributes

    - by Geo
    The tutorial on the Django website shows this code for the models: from django.db import models class Poll(models.Model): question = models.CharField(max_length=200) pub_date = models.DateTimeField('date published') class Choice(models.Model): poll = models.ForeignKey(Poll) choice = models.CharField(max_length=200) votes = models.IntegerField() Now, each of those attributes is a class attribute, right? So the same attribute should be shared by all instances of the class. A bit later, they present this code: class Poll(models.Model): # ... def __unicode__(self): return self.question class Choice(models.Model): # ... def __unicode__(self): return self.choice How did they turn from class attributes into instance attributes? Did I get class attributes wrong?

    Read the article

  • What is the best way to add attributes to auto-generated entities (using VS2010 and EF4)

    - by Dani
    ASP.NET MVC 2 has strong support for using attributes on entities (validation, extending the Html helper class, and more). If I generated my model from the database using the VS2010 EF4 Entity Data Model (the .edmx and its .cs class), and I want to add attributes to some of the entities, what would be the best practice? How should I cope with updating the model (adding more fields/tables to the database and merging them into the edmx) - will it keep my attributes or generate a new .cs file, erasing everything? (Manual changes to this file may cause unexpected behavior in your application.) (Manual changes to this file will be overwritten if the code is regenerated.)
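    A common pattern here, since the generated entities are partial classes, is to keep the attributes in a separate file that the T4 template never touches and attach them through a metadata "buddy" class, so regenerating the .edmx/.cs does not wipe them out. A minimal sketch, assuming a generated entity called Product (the entity and property names are illustrative):

      using System.ComponentModel.DataAnnotations;

      // Lives in your own file, in the same namespace as the generated entity,
      // so it survives regeneration of the model.
      [MetadataType(typeof(ProductMetadata))]
      public partial class Product
      {
          // Intentionally empty - it only exists to attach the metadata class.
      }

      public class ProductMetadata
      {
          [Required]
          [StringLength(50, ErrorMessage = "Name - please enter no more than 50 characters.")]
          public string Name { get; set; }
      }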

    Read the article
