Search Results

Search found 1160 results on 47 pages for 'bluetooth pan'.


  • Help trying/downloading ubuntu

    - by koolomwee
    I can't try Ubuntu. Every time I put in my disk and press Try, it shows the Ubuntu logo with the dots under it, then it shows a command screen, basically blinking, saying that it closed and cancelled applications belonging to Ubuntu... Then I tried to install it from the disk... same thing. Then, because I have Windows, I tried to install it from the website and that worked. I rebooted my computer and went into the Ubuntu option, just to find a blinking command screen again... How do I get Ubuntu to work? What does happen (with the command screen) is that it shows commands that are blinking too fast for me to read the whole thing, and then it stops after a minute or two of blinking. The only thing I do get to read is that Ubuntu commanded some applications to shut down. With the CD there are only about 5 error messages, and when I reboot and select Ubuntu, there are over a hundred error messages. With the live CD I have to shut down my computer to use it again. With the reboot-and-select-Ubuntu option it reboots by itself. I expect it to actually start up with no error messages. UPDATE: I've been trying to start Ubuntu for the last couple of days, and I noticed on my last try that it said "SIGNAL 15 RECEIVED"... It also said that it's stopping Bluetooth and all other programs, and that it's rebooting. Maybe that'll help a little with answering my question... thanks :) This also might help: computer brand/model: HP HPE240f, Windows 7 Home Premium Service Pack 1; graphics card: ATI Radeon HD 5570. I also wrote the same question here: https://answers.launchpad.net/wubi/+question/194537 which might provide a little more information.

    Read the article

  • SQL SERVER – How to Compare the Schema of Two Databases with Schema Compare

    - by Pinal Dave
    Earlier I wrote about An Efficiency Tool to Compare and Synchronize SQL Server Databases and it was very well received. Since that blog post I have received quite a few questions asking how, just like data, we can also compare schema and synchronize it. If you think about comparing the schema manually, it is almost impossible to do so. Table schema is just one concept; if you really want all of the schema of the database (triggers, views, stored procedures and everything else), it is simply an impossible task. If you are a developer or database administrator who works in a production environment, then you know there are many different occasions when we have to compare the schema of a database. Before deploying any changes to the production server, I personally like to make a note of every single schema change and document it, so in case of any issue I can always go back and refer to my documentation. As discussed earlier, it is absolutely impossible to do this task without the help of third-party tools. I personally use Devart's dbForge Schema Compare for this task. It is an extremely easy tool. Let us see how it works.

    First I have two different databases – a) AdventureWorks2012 and b) AdventureWorks2012-V1. There are in total three changes between these databases. Here is the list:

    - One of the tables has an additional column
    - One of the tables has a new index
    - One of the stored procedures is changed

    Now let us see how dbForge Schema Compare works in this scenario. First open dbForge Schema Compare studio and click on New Schema Comparison. It will bring you to the following screen where we have to select the databases to compare. I have selected the AdventureWorks2012 and AdventureWorks2012-V1 databases. In the next screen we can verify various options, but for this demonstration we will keep them as they are. We will not change anything on the schema mapping screen, as in our case it is not required, but generally if you are comparing across schemas you may need this. The next screen is the most important one, as on it we select which kinds of objects we want to compare. You can see the options which are available to select; the screen lets you select objects from SQL Server 2000 to SQL Server 2012. Once you click on Compare in the previous screen, it will bring you to the results screen, which essentially displays the differences between the two databases we selected earlier. As mentioned above, there are three different changes in the database and the same are listed here: two of the changes belong to tables and one change belongs to a procedure. Let us click each of them one by one to see the difference between them. In the very first option we can see that there is an additional column in one database which did not exist earlier. In the next example we can see that the AdventureWorks2012 database has an additional index. The following example is very interesting, as in this case we have changed the definition of the stored procedure and the result pane shows the same. dbForge Schema Compare very effectively identifies the changes in schema and lists them neatly for developers. Here is one more screen: this software not only compares the schema but also provides options to update or drop objects as per your choice. I think this is a brilliant option. Well, I have been using Schema Compare for quite a while and have found it very useful. Here are a few of the things which dbForge Schema Compare can do for developers and DBAs:

    - Compare and synchronize SQL Server database schemas
    - Compare schemas of a live database and a SQL Server backup
    - Generate comparison reports in Excel and HTML formats
    - Eliminate mistakes in schema change propagation across environments
    - Track production database changes and customizations
    - Automate migration of schema changes using the command line interface

    I suggest that you try out dbForge Schema Compare and let me know what you think of this product. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Utility, T SQL

    Read the article

  • Oracle Announces Oracle Insurance Policy Administration for Life and Annuity 9.4

    - by helen.pitts(at)oracle.com
    Today's global insurers require the ability to provide higher levels of service and quickly bring to market life insurance and annuity products that not only help them stand out from the competition, but also stay current with local legislation. To succeed, they require agile and flexible core systems that enable them to meet the unique localization requirements of the markets in which they operate, whether in North America, Asia Pacific or the Pan-European Region. The release of Oracle Insurance Policy Administration for Life and Annuity 9.4, announced today, helps insurers meet this need with expanded international market capabilities that enable them to reduce risk and profitably compete wherever their business takes them. It offers expanded multi-language support along with unit-linked product and fund processing capabilities that enable regional and global insurers to rapidly configure and deliver localized products – along with providing better service for end users through a single policy admin solution. Key enhancements include:

    - Kanji/Kana language support, pre-defined content, and imperial date processing for the Japanese market
    - New localization flexibility for configuring and managing international mailing addresses along with regional variations for client information
    - Enhanced capability to calculate unit-linked pricing and valuation, in addition to market-based processing and pre-configured unit-linked content
    - Expanded role-based security and masking capability to further protect sensitive customer data
    - Enhanced capability to restrict processing of specified activities based on time of day and user role, reducing exposure to market timing risks
    - Further capability to eliminate duplicate client records, helping to reduce underwriting risks and enhance servicing through a single view of the client

    "The ability to leverage a single, rules-driven policy administration system for multiple global operation centers can help insurers realize significant improvements in speed to market, customer service, compliance with regional regulations, and consolidation efforts,” noted Celent's Craig Weber, senior vice president, Insurance. “We believe such initiatives are necessary to help the industry address service and distribution imperatives."

    Helping our customers meet these mission-critical business imperatives is a key objective for Oracle Insurance. Active, ongoing dialogue with our customers is an important part of the process to help us understand how our solutions are helping, and can continue to help, them achieve success in the marketplace. I had the opportunity to meet with several of our insurance customers at the Oracle Insurance Policy Administration Client Advisory Board meeting last week in Philadelphia, Penn. (View photos on the Oracle Insurance Facebook page.) It was a great forum for Oracle Insurance and our clients. Discussion centered on the latest business and IT trends, with opportunities to learn more about the latest release of Oracle Insurance Policy Administration for Life and Annuity and other Oracle Insurance solutions such as data warehousing / business intelligence, while exchanging best practices for product innovation and servicing customers and sales channels. Helen Pitts is senior product marketing manager for Oracle Insurance's life and annuities solutions.

    Read the article

  • SQL SERVER – Partition Parallelism Support in expressor 3.6

    - by pinaldave
    I am very excited to learn that there is a new version of expressor’s data integration platform coming out in March of this year. It will be version 3.6, and I look forward to using it and telling everyone about it. Let me describe a little bit more about what will be so great in expressor 3.6:

    - Greatly enhanced user interface
    - Parallel processing
    - Bulk artifact upgrading

    The User Interface

    First let me cover the most obvious enhancements. The expressor Studio user interface (UI) has had some significant work done. Kudos to the expressor Engineering team; the entire UI is a visual masterpiece that is very responsive and intuitive. The improvements are more than just eye candy; they provide significant productivity gains when developing expressor Dataflows.

    - Operator shape icons now include a description that identifies the function of each operator, instead of having to guess at the function by the icon.
    - Operator shapes and highlighting depict the current function and status: disabled, enabled, complete, incomplete, and error. Each status displays an appropriate message in the message panel with correction suggestions.
    - Floating or docking property panels provide descriptive tool tips for each property and auto-resize when adjusting the canvas, without having to search Help or scroll around to get access to the property.
    - Progress and status indicators let you know when an operation is working.
    - A “no limit” canvas with snap-to-grid allows automatic sizing and accurate positioning when you have numerous operators in the Dataflow.
    - The inline tool bar offers quick access to pan, zoom, fit and overview functions.
    - Selecting multiple artifacts with a right-click context menu allows you to manage your workspace more efficiently.

    Partitioning and Parallel Processing

    Partitioning allows each operator to process multiple subsets of records in parallel, as opposed to processing all records that flow through that operator in a single sequential set. This capability allows the user to configure the expressor Dataflow to run in a way that most efficiently utilizes the resources of the hardware where the Dataflow is running. Partitions can exist in most individual operators. Using partitions increases the speed of an expressor data integration application, therefore improving performance and load times. With the expressor 3.6 Enterprise Edition, expressor simplifies enabling parallel processing by adding intuitive partition settings that are easy to configure.

    Bulk Artifact Upgrading

    Bulk artifact upgrading sounds a bit intimidating, but it actually is not, and it is a welcome addition to expressor Studio. In past releases, users were prompted to confirm that they wanted to upgrade their individual artifacts only when opened. This was a cumbersome and repetitive process. Now with bulk artifact upgrading, a user can easily select an artifact or group of artifacts to upgrade all at once.

    As you can see, there are many new features and upgrade options that will prove to make expressor Studio quicker and more efficient. I hope I’m not the only one who is excited about all these new upgrades, and I hope that you try expressor and share your experience with me. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology

    Read the article

  • Normalisation and 'Anima notitia copia' (Soul of the Database)

    - by Phil Factor
    (A Guest Editorial for Simple-Talk) The other day, I was staring at the sys.syslanguages table in SQL Server with slightly-raised eyebrows. I’d just been reading Chris Date’s interesting book ‘SQL and Relational Theory’. He’d made the point that you’re not necessarily doing relational database operations by using a SQL Database product. The same general point was recently made by Dino Esposito about ASP.NET MVC. The use of ASP.NET MVC doesn’t guarantee you a good application design: it merely makes it possible to test it. The way I’d describe the sentiment in both cases is ‘you can hit someone over the head with a frying-pan but you can’t call it cooking’. SQL enables you to create relational databases. However, even if it smells bad, it is no crime to do hideously un-relational things with a SQL Database just so long as it’s necessary and you can tell the difference; not only that, but also only if you’re aware of the risks and implications. Naturally, I’ve never knowingly created a database that Codd would have frowned at, but around the edges are interfaces and data feeds I’ve written that have caused hissy fits amongst the Normalisation fundamentalists. Part of the problem for those who agonise about such things is the misinterpretation of Atomicity. An atomic value is one for which, in the strange virtual universe you are creating in your database, you don’t have any interest in any of its component parts. If you aren’t interested in the electrons, neutrinos, muons, or taus, then an atom is ..er.. atomic. In the same way, if you are passed a JSON string or XML, and required to store it in a database, then all you need to do is to ask yourself, in your role as Anima notitia copia (Soul of the database), ‘have I any interest in the contents of this item of information?’. If the answer is ‘No!’, or ‘nequequam!’, then it is an atomic value, however complex it may be. After all, you would never have the urge to store the pixels of images individually, under the misguided idea that these are the atomic values, would you? I would, of course, ask the ‘Anima notitia copia’ rather than the application developers, since there may be more than one application, and the application developers may be designing the application in the absence of full domain knowledge (‘or by the seat of the pants’ as the technical term used to be). If, on the other hand, the answer is ‘sure, and we want to index the XML column’, then we may be in for some heavy XML-shredding sessions to get to store the ‘atomic’ values and ensure future harmony as the application develops. I went back to looking at the sys.syslanguages table. It has a months column with the months in a delimited list: January,February,March,April,May,June,July,August,September,October,November,December. This is an ordered list. Wicked? I seem to remember that this value, like shortmonths and days, is treated as a ‘thing’. It is merely passed off to an external C++ routine in order to format a date in a particular language, and never accessed directly within the database. As far as the database is concerned, it is an atomic value. There is more to normalisation than meets the eye.

    Read the article

  • Managing Operational Risk of Financial Services Processes – part 1/ 2

    - by Sanjeevio
    Financial institutions view compliance as a regulatory burden that incurs a high initial capital outlay and recurring costs. By its very nature, regulation takes a prescriptive, common-for-all approach to managing financial and non-financial risk. Needless to say, mere compliance with regulation will no longer lead to sustainable differentiation. Genuine competitive advantage will stem from being able to cope with the innovation demands of the present economic environment while meeting compliance goals with regulatory mandates in a faster and more cost-efficient manner. Let’s first take a look at the key factors that are limiting the pursuit of the above goal.

    Regulatory requirements are growing, driven in part by revisions to existing mandates in line with the cross-border, pan-geographic nature of financial value chains today, and more so by frequent systemic failures that have destabilized the financial markets and the global economy over the last decade. In addition to the increase in regulation, financial institutions are faced with pressures of regulatory overlap and regulatory conflict. Regulatory overlap arises primarily from two things: firstly, the blurring of boundaries between lines of business with complex organizational structures, and secondly, the varying requirements of jurisdictional directives across geographic boundaries; e.g. a securities firm with operations in the US and EU would be subject to different “Know Your Customer” (KYC) requirements under the PATRIOT Act in the US and MiFID in the EU. Another consequence and concomitant of regulatory change is regulatory conflict, which again arises primarily from two things: firstly, the diametrically opposite priorities of lines of business, and secondly, the tension that regulatory requirements create between shareholders’ interest in tighter due diligence and customers’ concerns about privacy. For instance, Customer Due Diligence (CDD) under KYC requires eliciting detailed information from customers to prevent illegal activities such as money laundering, terrorist financing or identity theft. While new customers are still more likely to comply with such stringent background checks at the time of account opening, existing customers baulk at such practices as a breach of trust and privacy.

    As mentioned earlier, regulatory compliance addresses both financial and non-financial risks. Operational risk is a non-financial risk that stems from business execution and spans people, processes, systems and information. Operational risk arising from financial processes in particular transcends other sources of such risk. Let’s look at the factors underpinning the operational risk of financial processes. The rapid pace of innovation and geographic expansion of financial institutions has resulted in the proliferation and ad-hoc evolution of back-office, mid-office and front-office processes. This has had two serious implications for the operational risk of financial processes:

    · Inconsistency of processes across lines of business, customer channels and product/service offerings. This makes it harder for the risk function to enforce a standardized risk methodology, and in turn makes breaches harder to detect.
    · The proliferation of processes, coupled with increasingly frequent change cycles, has resulted in accidental breaches and increased vulnerability to regulatory inadequacies.

    In summary, regulatory growth (including overlap and conflict) coupled with process proliferation and inconsistency is driving process compliance complexity. In my next post I will address the implications of this process complexity on financial institutions and outline the role of BPM in lowering specific aspects of the operational risk of financial processes.

    Read the article

  • The Evolution of Oracle Direct EMEA by John McGann

    - by user769227
    John is expanding his Dublin based team and is currently recruiting a Director with marketing and sales leadership experience: http://bit.ly/O8PyDF Should you wish to apply, please send your CV to [email protected] Hi, my name is John McGann and I am part of the Oracle Direct management team, based in Dublin.   Today I’m writing from the Oracle London City office, right in the heart of the financial district and up to very recently at the centre of a fantastic Olympic Games. The Olympics saw individuals and teams from across the globe competing to decide who is Citius, Altius, Fortius - “Faster, Higher, Stronger" There are lots of obvious parallels between the competitive world of the Olympics and the Business environments that many of us operate in, but there are also some interesting differences – especially in my area of responsibility within Oracle. We are of course constantly striving to be the best - the best solution on offer for our clients, bringing simplicity to their management, consumption and application of information technology, and the best provider when compared with our many niche competitors.   In Oracle and especially in Oracle Direct, a key aspect of how we achieve this is what sets us apart from the Olympians.  We have long ago eliminated geographic boundaries as a limitation to what we can achieve. We assemble the strongest individuals across multiple countries and bring them together in teams focussed on a single goal. One such team is the Oracle Direct Sales Programs team. In case you don’t know, Oracle Direct EMEA (Europe Middle East and Africa) is the inside sales division in Oracle and it is where I started my Oracle career.  I remember that my first role involved putting direct mail in envelopes.... things have moved on a bit since then – for me, for Oracle Direct and in how we interact with our customers. Today, the team of over 1000 people is located in the different Oracle Direct offices around Europe – the main ones are Malaga, Berlin, Prague and Dubai plus the headquarters in Dublin. We work in over 20 languages and are in constant contact with current and future Oracle customers, using the latest internet and telephone technologies to effectively communicate and collaborate with each other, our customers and prospects. One of my areas of responsibility within Oracle Direct is the Sales Programs team. This team of 25 people manages the planning and execution of demand generation, leading the process of finding new and incremental revenue within Oracle Direct. The Sales Programs Managers or ‘SPMs’ are embedded within each of the Oracle Direct sales teams, focussed on distinct geographies or product groups. The SPMs are virtual members of the regional sales management teams, and work closely with the sales and marketing teams to define and deliver demand generation activities. The customer contact elements of these activities are executed via the Oracle Direct Sales and Business Development/Lead Generation teams, to deliver the pipeline required to meet our revenue goals. Activities can range from pan-EMEA joint sales and marketing campaigns, to very localised niche campaigns. The campaigns might focus on particular segments of our existing customers, introducing elements of our evolving solution portfolio which customers may not be familiar with. The Sales Programs team also manages ‘Nurture’ activities to ensure that we develop potential business opportunities with contacts and organisations that do not have immediate requirements. 
Looking ahead, it is really important that we continue to evolve our ability to add value to our clients and reduce the physical limitations of our distance from them through the innovative application of technology. This enables us to enhance the customer buying experience and to enable the Inside Sales teams to manage ever more complex sales cycles from start to finish.  One of my expectations of my team is to actively drive innovation in how we leverage data to better understand our customers, and exploit emerging technologies to better communicate with them.   With the rate of innovation and acquisition within Oracle, we need to ensure that existing and potential customers are aware of all we have to offer that relates to their business goals.   We need to achieve this via a coherent communication and sales strategy to effectively target the right people using the most effective medium. This is another area where the Sales Programs team plays a key role.

    Read the article

  • SR Activity Summaries Via Direct Email? You Bet!

    - by PCat
    Courtesy of Ken Walker. I’m a “bottom line” kind of guy. My friends and co-workers will tell you that I’m a “Direct Communicator” when it comes to work or my social life. For example, if I were to come up with a fantastic new recipe for a low-fat pan fried chicken, I’d Tweet, email, or find a way to blast the recipe directly to you so that you could enjoy it immediately. My friends would see the subject, “Awesome New Fried Chicken,” and they’d click and see the recipe there before them. Others are “Indirect Communicators.” My friend Joel is like this. He would post the recipe in his blog, and then Tweet or email a link back to his blog with a subject, “Fried Chicken.” Then Joel would sit back and expect his friends to read the email, AND click the link to his blog, and then read the recipe. As a fan of the “Direct” method, I wish there was a way for me to “opt in” for immediate updates from Joel so I could see the recipe without having to click over to his blog to search for it.

    The same is true for MOS. If you’ve ever opened a Service Request through My Oracle Support (MOS), you know that most of the communication between you and the Oracle Support Engineer with respect to the issue in the SR is done via email. Which type of email would you rather receive in your email account?

    Example 1: Your SR has been updated. Click HERE to see the update.

    Example 2: Your SR has been updated. Here is the update: “Hi John, Oracle Development has completed the patch we’ve been waiting for! Here’s a direct “LINK” to the patch that should resolve your issue. Please download and install the patch via the instructions (included with the link) and let me know if it does, in fact, resolve your issue!”

    Example 2 is available to you! All you need to do is to “opt in” for the direct email updates. The default is the indirect update as seen in Example 1. To turn on “Service Request Details in Email,” simply follow these instructions (aided by the screenshot below):

    1. Log into MOS, and click on your name in the upper right corner. Select “My Account.”
    2. Make sure “My Account” is highlighted in bold on the left.
    3. Turn ON “Service Request Details in Email.”

    That’s it! You will now receive the SR updates directly in your email account without having to log into MOS, click the SR, scroll down to the updates, etc. That’s better than fried chicken! (Well, almost better....)

    Read the article

  • Thinking Local, Regional and Global

    - by Apeksha Singh-Oracle
    The FIFA World Cup tournament is the biggest single-sport competition: it’s watched by about 1 billion people around the world. Every four years each national team’s manager is challenged to pull together a group of players who ply their trade across the globe. For example, of the 23 members of Brazil’s national team, only four actually play for Brazilian teams, and the rest play in England, France, Germany, Spain, Italy and Ukraine. Each country’s national league, each team and each coach has a unique style. Getting all these “localized” players to work together successfully as one unit is no easy feat. In addition to $35 million in prize money, much is at stake – not least national pride and global bragging rights until the next World Cup in four years’ time.

    Achieving economic integration in the ASEAN region by 2015 is a bit like trying to create the next World Cup champion by 2018. The team comprises Brunei Darussalam, Cambodia, Indonesia, Lao PDR, Malaysia, Myanmar, Philippines, Singapore, Thailand and Vietnam. All have different languages, currencies, cultures and customs, rules and regulations. But if they can pull together as one unit, the opportunity is not only great for business and the economy, but it’s also a source of regional pride. BCG expects that by 2020 the number of firms headquartered in Asia with revenue exceeding $1 billion will double to more than 5,000. Their trade in the region and with the world is forecast to increase to 37% of an estimated $37 trillion of global commerce by 2020, from 30% in 2010. Banks offering transactional banking services to the emerging marketplace need to prepare to respond to customer needs across the spectrum – MSMEs, SMEs, corporates and multinational corporations. Customers want innovative, differentiated, value-added products and services that provide:

    • Pan-regional operational independence while enabling a single source of truth at a regional level
    • Regional connectivity and cash & liquidity optimization
    • A consistent experience for their customers, by offering standardized products & services across all ASEAN countries
    • Multi-channel & self-service capabilities / access to real-time information on liquidity and cash flows
    • Convergence of cash management with supply chain and trade finance

    While enabling the above to meet customer demands, a comprehensive and robust credit management solution is a must for effective regional banking operations and for managing risk. According to BCG, Asia-Pacific wholesale transaction-banking revenues are expected to triple to $139 billion by 2022 from $46 billion in 2012. To take advantage of the trend, banks will have to manage and maximize their own growth opportunities, compete on a broader scale, manage the complexity within the region and increase efficiency. They’ll also have to choose the right operating model and regional IT platform to offer:

    • Account services
    • Cash & liquidity management
    • Trade services & supply chain financing
    • Payments
    • Securities services
    • Credit and lending
    • Treasury services

    The core platform should be able to balance global needs and local nuances. Certain functions need to be performed at a regional level, while others need to be performed at a country level. Financial reporting and regulatory compliance are a case in point. The ASEAN Economic Community is in the final lap of its preparations for the ultimate challenge: becoming a formidable team in the global league. Meanwhile, transaction banks are designing their own hat trick: implementing a world-class IT platform, positioning themselves to respond to customer needs and establishing a foundation for revenue generation for years to come.

    Anand Ramachandran
    Senior Director, Global Banking Solutions Practice
    Oracle Financial Services Global Business Unit

    Read the article

  • jqtouch / google maps api v3 issue

    - by Rigo Vides
    Hi everyone, I'm having a hard time trying to make jQtouch and the Google Maps API v3 work together. I've tried almost everything. It seems that the only source of information is here. I checked every single post. I'm just starting with jQuery and CSS... so there's a lot of stuff I don't understand. First of all, I'm using the jQtouch framework to build a web app with Google Maps integration. The problem is that whenever I pan the map, strange flickering occurs. It's like jQtouch and the map are fighting over a callback. I'm using the latest revision out there, and have several bugs regarding transition/animation functionality (since I don't wrap the map and all the elements' divs within the #jqt wrapper) and CSS issues with some styling. Has anyone successfully achieved a functional setup for this scenario (Google Maps API v3 & jQtouch)? I think it is not necessary to paste you some code (but if you think it is necessary please let me know and I'll do it). If you paste me a minimal example with a map (detailing the jQtouch version, and modifications to the styles/.js files), and some transitions back and forth from the map to another div/section/page, you'll get the bounty. Thanks a lot in advance. And please let me know if this is kind of 'legal' here; I mean, there's only loose information on the official wiki, and like 13 'solutions' but nothing concrete... I'm just trying to help anyone who steps into this problem in the future.

    Read the article

  • iPhone: how do i redraw subviews while pinch zooming a uiscrollview

    - by Mike
    I am developing an iPhone app that places multiple custom UIViews as subviews in a UIScrollView. The subviews are placed on top of each other as transparent views, as each view has its own drawing routines that trace parts of the base view. The base view is a UIImageView that is typically a large image that I want the user to be able to pan and zoom in and out of. The problem I am having is that when I zoom in and out of my UIScrollView, the subviews do not redraw themselves while the user is zooming. I can reposition and scale the subviews properly once the zoom is completed, but the user experience is less than desirable. I have not been able to find a way to either hide or redraw the subviews as the zoom is taking place, to scale the subviews along with the ImageView. Any ideas? Thanks! Here is the code that I have implemented:

        - (void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(float)scale {
            for (UIView *view in subViews) {
                [view updateView:scale];
            }
        }

        - (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
            return imageView;
        }

    Read the article

  • How to update a QPixmap in a QGraphicsView with PyQt

    - by pops
    I am trying to paint on a QPixmap inside a QGraphicsView. The painting works fine, but the QGraphicsView doesn't update it. Here is some working code:

        #!/usr/bin/env python
        from PyQt4 import QtCore
        from PyQt4 import QtGui

        class Canvas(QtGui.QPixmap):
            """ Canvas for drawing"""
            def __init__(self, parent=None):
                QtGui.QPixmap.__init__(self, 64, 64)
                self.parent = parent
                self.imH = 64
                self.imW = 64
                self.fill(QtGui.QColor(0, 255, 255))
                self.color = QtGui.QColor(0, 0, 0)

            def paintEvent(self, point=False):
                if point:
                    p = QtGui.QPainter(self)
                    p.setPen(QtGui.QPen(self.color, 1, QtCore.Qt.SolidLine))
                    p.drawPoints(point)

            def clic(self, mouseX, mouseY):
                self.paintEvent(QtCore.QPoint(mouseX, mouseY))

        class GraphWidget(QtGui.QGraphicsView):
            """ Display, zoom, pan..."""
            def __init__(self):
                QtGui.QGraphicsView.__init__(self)
                self.im = Canvas(self)
                self.imH = self.im.height()
                self.imW = self.im.width()
                self.zoomN = 1
                self.scene = QtGui.QGraphicsScene(self)
                self.scene.setItemIndexMethod(QtGui.QGraphicsScene.NoIndex)
                self.scene.setSceneRect(0, 0, self.imW, self.imH)
                self.scene.addPixmap(self.im)
                self.setScene(self.scene)
                self.setTransformationAnchor(QtGui.QGraphicsView.AnchorUnderMouse)
                self.setResizeAnchor(QtGui.QGraphicsView.AnchorViewCenter)
                self.setMinimumSize(400, 400)
                self.setWindowTitle("pix")

            def mousePressEvent(self, event):
                if event.buttons() == QtCore.Qt.LeftButton:
                    pos = self.mapToScene(event.pos())
                    self.im.clic(pos.x(), pos.y())
                    #~ self.scene.update(0,0,64,64)
                    #~ self.updateScene([QtCore.QRectF(0,0,64,64)])
                    self.scene.addPixmap(self.im)
                    print('items')
                    print(self.scene.items())
                else:
                    return QtGui.QGraphicsView.mousePressEvent(self, event)

            def wheelEvent(self, event):
                if event.delta() > 0:
                    self.scaleView(2)
                elif event.delta() < 0:
                    self.scaleView(0.5)

            def scaleView(self, factor):
                n = self.zoomN * factor
                if n < 1 or n > 16:
                    return
                self.zoomN = n
                self.scale(factor, factor)

        if __name__ == '__main__':
            import sys
            app = QtGui.QApplication(sys.argv)
            widget = GraphWidget()
            widget.show()
            sys.exit(app.exec_())

    The mousePressEvent does some painting on the QPixmap. But the only solution I have found to update the display is to make a new instance (which is not a good solution). How do I just update it?
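    One way to get the pixmap to refresh, shown below as a minimal, self-contained sketch (it assumes PyQt4, like the code above, and uses its own illustrative names rather than the classes from the question): QGraphicsScene.addPixmap() returns a QGraphicsPixmapItem, so keeping that item and pushing the modified pixmap back into it with setPixmap() updates the display in place instead of stacking a new item on every click.

        import sys
        from PyQt4 import QtCore, QtGui

        app = QtGui.QApplication(sys.argv)
        scene = QtGui.QGraphicsScene()

        pixmap = QtGui.QPixmap(64, 64)
        pixmap.fill(QtGui.QColor(0, 255, 255))
        item = scene.addPixmap(pixmap)          # addPixmap() returns a QGraphicsPixmapItem

        # Draw on the pixmap, then hand the modified pixmap back to the same item.
        painter = QtGui.QPainter(pixmap)
        painter.setPen(QtGui.QPen(QtGui.QColor(0, 0, 0), 1))
        painter.drawPoint(QtCore.QPoint(10, 10))
        painter.end()
        item.setPixmap(pixmap)                  # refreshes the existing item in place

        view = QtGui.QGraphicsView(scene)
        view.show()
        sys.exit(app.exec_())

    Applied to the question's structure, the same idea would mean storing the result of self.scene.addPixmap(self.im) once in __init__ and calling setPixmap on it from mousePressEvent, rather than calling addPixmap again.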

    Read the article

  • Virtual Earth (Bing) Pin "moves" when zoom level changes

    - by Ali
    Hi guys, I created a Virtual Earth (Bing) map to show a simple pin at a particular point. Everything works right now - the pin shows up, and the title and description pop up on hover. The map is initially fully zoomed in to the pin, but the STRANGE problem is that when I zoom out it moves slightly lower on the map. So if I started with the pin pointing somewhere in Toronto, if I zoom out enough the pin ends up in the middle of Lake Ontario! If I pan the map, the pin correctly stays in its proper location. When I zoom back in, it moves slightly up until it's back to its original correct position! I've looked around for a solution for a while, but I can't understand it at all. Please help!! Thanks a lot!

    Import with JavaScript: http://ecn.dev.virtualearth.net/mapcontrol/mapcontrol.ashx?v=6.2

        $(window).ready(function(){
            GetMap();
        });

        map = new VEMap('birdEye');
        map.SetCredentials("hash key from Bing website");
        map.LoadMap(new VELatLong(43.640144, -79.392593), 1, VEMapStyle.BirdseyeHybrid, false, VEMapMode.Mode2D, true, null);

        var pin = new VEShape(VEShapeType.Pushpin, new VELatLong(43.640144, -79.392593));
        pin.SetTitle("Goes to Title of the Pushpin");
        pin.SetDescription("Goes as Description.");
        map.AddShape(pin);

    Read the article

  • WPF 3D extrude "a bitmap"

    - by Bgnt44
    Hi, I'm looking for a way to simulate a projector in WPF 3D. I have these "in" parameters: beam shape (a black and white bitmap file), beam size (e.g. 30°), beam color, beam intensity (dimmer), projector position (x,y,z), and beam position (pan(x), tilt(y) relative to the projector). First I was thinking of using a light object, but it seems that WPF can't do that. So now I think that I can make, for each projector, a polygon from my bitmap... First I need to convert the black and white bitmap to vectors; only simple shapes (bubble, line, dot, cross...). Is there any WPF way to do that? Or maybe an external program (freeware)? Then I need to build the polygon, with the shape of the converted bitmap, color, size and orientation as parameters. I don't know how I can define the length of the beam, or whether it can be infinite... To show the beam result, I'm thinking of making a room (floor, walls...) and the beam will end on those walls... I don't care about a realistic light render (dispersion...), but the scene render has to be real time and at least 15 times per second (with probably from one to 100 projectors at the same time); information about position, angle, shape and color will be sent for each render... Well, so I need samples for that; I guess all of these things could be useful for other people. If you have sample code for any of the following, it would help: convert a bitmap to vectors; extrude vectors from one point with an angle parameter until collision with a wall; set the x,y position of the beam depending on the projector position; set the alpha intensity and color of the beam. Maybe I'm totally wrong and WPF is not ready for that, so advise me about other ways (XNA, D3D), with samples of course ;-) Thank you

    Read the article

  • TFS and Forms Authentication

    - by George
    I don't know squat about TFS, other than as a user who has performed simple check-ins/outs. I just installed it locally and would like to do joint development with a friend. I was having trouble making my TFS web site on port 8080 visible (the whole scoop is here if you're interested) and I wonder if it could be related to the fact that TFS is probably using Windows Authentication to identify the user. Can TFS be set up to use forms authentication? We probably need to set up a VPN, though that's a learning curve too. To use TFS, do our machines have to belong to a domain? We're not admin types, though he is better at it than I am, and I would be interested in any feedback or advice on which path is likely to pan out the best. I already got AxoSoft OneTime working in this type of environment and it suits us well, but I am tempted by all the bells & whistles of TFS and the ability to tie tracked bug items to code changes. As far as finding a good way to share code, do sites like SourceForge allow one to keep code secure among members only?

    Read the article

  • How can I draw on JPanel using another quadrant for the coordinates?

    - by Sanoj
    I would like to draw some shapes on a JPanel by overriding paintComponent. I would like to be able to pan and zoom. Panning and zooming are easy to do with AffineTransform and the setTransform method on the Graphics2D object. After doing that I can easily draw the shapes with g2.draw(myShape). The shapes are defined in "world coordinates", so it works fine when panning, and I have to translate them to the canvas/JPanel coordinates before drawing. Now I would like to change the quadrant of the coordinates: from the 4th quadrant that JPanel and computers often use to the 1st quadrant that users are most familiar with. The X is the same, but the Y axis should increase upwards instead of downwards. It is easy to redefine origo by new Point(origo.x, -origo.y); but how can I draw the shapes in this quadrant? I would like to keep the coordinates of the shapes (defined in world coordinates) rather than have them in canvas coordinates. So I need to transform them in some way, or transform the Graphics2D object, and I would like to do it efficiently. Can I do this with AffineTransform too?

    Read the article

  • Creating form using Generic_inlineformset_factory from the Model Form

    - by Prateek
    Hello all, I want to create an edit form with the help of ModelForm. My models contain a generic relation between classes, so if anyone could suggest the view and a bit of the template for this purpose I would be very thankful, as I am new to the language. My models look like:

        class Employee(Person):
            nickname = models.CharField(_('nickname'), max_length=25, null=True, blank=True)
            blood_type = models.CharField(_('blood group'), max_length=3, null=True, blank=True, choices=BLOOD_TYPE_CHOICES)
            marital_status = models.CharField(_('marital status'), max_length=1, null=True, blank=True, choices=MARITAL_STATUS_CHOICES)
            nationality = CountryField(_('nationality'), default='IN', null=True, blank=True)
            about = models.TextField(_('about'), blank=True, null=True)
            dependent = models.ManyToManyField(Dependent, through='DependentRelationship')
            pan_card_number = models.CharField(_('PAN card number'), max_length=50, blank=True, null=True)
            policy_number = models.CharField(_('policy number'), max_length=50, null=True, blank=True)
            # code specific details
            user = models.OneToOneField(User, blank=True, null=True, verbose_name=_('user'))

        class Person(models.Model):
            """Person model"""
            title = models.CharField(_('title'), max_length=20, null=True, blank=True)
            first_name = models.CharField(_('first name'), max_length=100)
            middle_name = models.CharField(_('middle name'), max_length=100, null=True, blank=True)
            last_name = models.CharField(_('last name'), max_length=100, null=True, blank=True)
            suffix = models.CharField(_('suffix'), max_length=20, null=True, blank=True)
            slug = models.SlugField(_('slug'), max_length=50, unique=True)

        class PhoneNumber(models.Model):
            phone_number = generic.GenericRelation('PhoneNumber')
            email_address = generic.GenericRelation('EmailAddress')
            address = generic.GenericRelation('Address')
            date_of_birth = models.DateField(_('date of birth'), null=True, blank=True)
            gender = models.CharField(_('gender'), max_length=1, null=True, blank=True, choices=GENDER_CHOICES)
            content_type = models.ForeignKey(ContentType,

    If anyone could suggest a link or so, it would be a great help.
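    A minimal sketch of the kind of view being asked for, assuming Django 1.x-era contenttypes (where generic_inlineformset_factory lives in django.contrib.contenttypes.generic) and assuming PhoneNumber is the generically related model with the usual content_type/object_id pair; the module, model and template names below are illustrative, not taken from the question:

        # views.py -- hypothetical module/model names; adjust to the real app layout
        from django.contrib.contenttypes.generic import generic_inlineformset_factory
        from django.shortcuts import get_object_or_404, render_to_response
        from myapp.models import Employee, PhoneNumber

        # Build a formset class for the generically related model
        PhoneNumberFormSet = generic_inlineformset_factory(PhoneNumber, extra=1, can_delete=True)

        def edit_employee(request, employee_id):
            employee = get_object_or_404(Employee, pk=employee_id)
            formset = PhoneNumberFormSet(request.POST or None, instance=employee)
            if request.method == 'POST' and formset.is_valid():
                formset.save()
            return render_to_response('employee_edit.html', {'formset': formset})

    In the template, the formset renders like any other inline formset: output {{ formset.management_form }} and then loop over the forms in {{ formset }} inside the surrounding form tag.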

    Read the article

  • Pass Result of ASIHTTPRequest "requestFinished" Back to Originating Method

    - by Intelekshual
    I have a method (getAllTeams:) that initiates an HTTP request using the ASIHTTPRequest library.

        NSURL *httpURL = [[[NSURL alloc] initWithString:@"/api/teams" relativeToURL:webServiceURL] autorelease];
        ASIHTTPRequest *request = [[[ASIHTTPRequest alloc] initWithURL:httpURL] autorelease];
        [request setDelegate:self];
        [request startAsynchronous];

    What I'd like to be able to do is call [WebService getAllTeams] and have it return the results in an NSArray. At the moment, getAllTeams doesn't return anything because the HTTP response is evaluated in the requestFinished: method. Ideally I'd want to be able to call [WebService getAllTeams], wait for the response, and dump it into an NSArray. I don't want to create properties because this is a disposable class (meaning it doesn't store any values, just retrieves values), and multiple methods are going to be using the same requestFinished (all of them returning an array). I've read up a bit on delegates and NSNotifications, but I'm not sure if either of them is the best approach. I found a snippet about implementing callbacks by passing a selector as a parameter, but it didn't pan out (since requestFinished fires independently). Any suggestions? I'd appreciate even just being pointed in the right direction.

        NSArray *teams = [[WebService alloc] getAllTeams];

    (This currently doesn't work, because getAllTeams doesn't return anything, but requestFinished does. I want to get the result of requestFinished and pass it back to getAllTeams:.)

    Read the article

  • Bilinear interpolation - DirectX vs. GDI+

    - by holtavolt
    I have a C# app for which I've written GDI+ code that uses Bitmap/TextureBrush rendering to present 2D images, which can have various image processing functions applied. This code is a new path in an application that mimics existing DX9 code, and they share a common library to perform all vector and matrix (e.g. ViewToWorld/WorldToView) operations. My test bed consists of DX9 output images that I compare against the output of the new GDI+ code. A simple test case that renders to a viewport that matches the Bitmap dimensions (i.e. no zoom or pan) does match pixel-perfect (no binary diff) - but as soon as the image is zoomed up (magnified), I get very minor differences in 5-10% of the pixels. The magnitude of the difference is 1 (occasionally 2)/256. I suspect this is due to interpolation differences. Question: For a DX9 ortho projection (and identity world space), with a camera perpendicular and centered on a textured quad, is it reasonable to expect DirectX.Direct3D.TextureFilter.Linear to generate identical output to a GDI+ TextureBrush filled rectangle/polygon when using the System.Drawing.Drawing2D.InterpolationMode.Bilinear setting? For this (magnification) case, the DX9 code is using this (MinFilter,MipFilter set similarly): Device.SetSamplerState(0, SamplerStageStates.MagFilter, (int)TextureFilter.Linear); and the GDI+ path is using: g.InterpolationMode = InterpolationMode.Bilinear; I thought that "Bilinear Interpolation" was a fairly specific filter definition, but then I noticed that there is another option in GDI+ for "HighQualityBilinear" (which I've tried, with no difference - which makes sense given the description of "added prefiltering for shrinking") Followup Question: Is it reasonable to expect pixel-perfect output matching between DirectX and GDI+ (assuming all external coordinates passed in are equal)? If not, why not? Finally, there are a number of other APIs I could be using (Direct2D, WPF, GDI, etc.) - and this question generally applies to comparing the output of "equivalent" bilinear interpolated output images across any two of these. Thanks!
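    Since the question turns on whether "bilinear" pins down a single exact result, here is the textbook bilinear weighting written out as a small, purely illustrative Python sketch (it is not DX9's or GDI+'s actual implementation); the final rounding step is exactly the kind of place where two otherwise correct implementations can disagree by 1/256 on an 8-bit channel.

        def bilerp(c00, c10, c01, c11, fx, fy):
            # c00..c11: the four neighbouring 8-bit texel/pixel values
            # fx, fy: fractional sample position within the cell, each in [0, 1)
            top = c00 * (1.0 - fx) + c10 * fx
            bottom = c01 * (1.0 - fx) + c11 * fx
            value = top * (1.0 - fy) + bottom * fy
            # Implementations differ in intermediate precision and in whether they
            # truncate or round here; either is enough to shift a channel by one step.
            return int(value + 0.5)

        print(bilerp(10, 11, 12, 13, 0.25, 0.75))   # 12 with round-half-up; 11 if truncated

    Sampling offsets (where each implementation places texel/pixel centers) are another common source of such sub-1/256 discrepancies, which only show up once the image is magnified.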

    Read the article

  • Google Maps in Drupal: reference to gmap object from JavaScript

    - by user280817
    Is there a way to obtain JavaScript references to the Google maps that are embedded into Drupal pages by the GMap module? I want to be able to manipulate the maps in these pages. I want to pan and zoom them. But I cannot find a reference to an embedded map object. I've dissected the relevant JavaScript objects Drupal.gmap and Drupal.settings.gmap with no success--unless I've overlooked something. The Drupal GMap module doesn't seem to explicitly provide references (within its API) to the GMap objects that it embeds into pages. It just generates themed text which is interpolated into the page. The technique of passing the HTML ID of the map container to either the GMap2 object constructor or the similar Drupal.gmap.getMap() function in order to obtain a map reference doesn't appear to work: Both simply return an instance to a new map, one having the same dimensions and basic characteristics of the original map, but apparently sans all of its overlays (which could contain markers). And I have to call setCenter() on it before I can use it, which initializes the structure, so I know it has no overlays.

    Read the article

  • Approaches to replace cursor in pure AS3 / Flare project?

    - by peteorpeter
    Hi there good lookin, I've got a pure AS3 (no Flex) project that uses Flare to display and interact with a data visualization. I just implemented some panning behavior, so you can click and drag the visualization around, and now I'd like to give the user a visual indicator that this is possible, by switching the arrow cursor with a nice grabby-looking hand icon. The user can click and drag at any time except when the mouse is over a clickable node (at which time the cursor swaps to a pointer - this behavior is already in place). So... 1) Do I need to create my own custom bitmap/sprite or is there a palette of built-in cursors I can use? (I'm not using Flex.) 2) Is there a way to simply replace the default arrow with the pan cursor project-wide, or do I need to attach the swapping to mouse events on display objects? Can I use the stage object to make this behavior apply everywhere? 3) How do I perform the swap? Do I use the Cursor object directly or do I need to get involved with the CursorManager? Any guidance, pseudo-code, or words of wisdom is greatly appreciated!

    Read the article

  • OpenGL ES - how to keep some object at a fixed size?

    - by OMH
    I'm working on a little game in OpenGL ES. In the background, there is a world/map. The map is just a large texture. Zoom/pinch/pan is used to move around. And I'm using glOrthof (left, right, bottom, top, zNear, zFar) to implement the zoom/pinch. When I zoom in, the sprites on top of the map are also zoomed in. But I would like to have some sprites stay at a fixed size. I could probably calculate a scale factor, depending on the parameters to glOrthof, but there must be a more natural and straightforward way of doing that, instead of scaling the sprites down when I zoom in. If I add some text or some GUI elements on top of the map, they should definitely have a fixed size. Is there a solution to do this, or do I have to leave fixed values in glOrthof and implement zoom/pinch in another way? EDIT: To be more clear: I want sprites that follow the map when it zooms in/out, but stay at the same size. I have some elements that are like the pins on the iPhone's map application. When you zoom, the pins stay the same size, but move around on the screen to stay on the same spot on the map. That is mainly what I want a solution for. Solutions for this already came below, thanks!
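    As a rough illustration of the scale-factor idea mentioned above, here is a small Python sketch (names and numbers are illustrative, not from the post) of how a fixed on-screen size translates into world units under a 2D glOrthof projection: the visible world width (right - left) maps onto the viewport width in pixels, so a sprite's world-space extent just has to be rescaled whenever the ortho bounds change.

        def world_units_per_pixel(left, right, viewport_width_px):
            # glOrthof(left, right, ...) maps the world span [left, right] onto the viewport width.
            return (right - left) / float(viewport_width_px)

        def fixed_size_extent(desired_px, left, right, viewport_width_px):
            # World-space size that renders as desired_px pixels at the current zoom.
            return desired_px * world_units_per_pixel(left, right, viewport_width_px)

        # Example: a pin that should always appear 32 px wide on a 320 px wide viewport
        print(fixed_size_extent(32, -10.0, 10.0, 320))   # 2.0 world units at this zoom
        print(fixed_size_extent(32, -5.0, 5.0, 320))     # 1.0 world unit when zoomed in 2x

    An alternative, in line with the fixed-values idea in the post, is to draw such elements in a separate pass with a pixel-space ortho projection so they never scale with the map.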

    Read the article

  • How to do drag-n-drop and resize on an image using jQuery?

    - by Scott
    How do I resize an image using jQuery but keep its aspect ratio the same?

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN" "http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en">
        <head>
            <title>CrossSlide - A jQuery plugin to create pan and cross-fade animations</title>
            <link href="http://ajax.googleapis.com/ajax/libs/jqueryui/1.8/themes/base/jquery-ui.css" rel="stylesheet" type="text/css"/>
            <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js"></script>
            <script src="http://ajax.googleapis.com/ajax/libs/jqueryui/1.8/jquery-ui.min.js"></script>
        </head>
        <body>
            <style type="text/css">
                #resizebleImage { background: silver; }
            </style>
            <script type="text/javascript">
                $(document).ready(function(){
                    $("#resizebleImage").resizable().parent().draggable();
                });
            </script>
            <img id="resizebleImage" src="http://images.askmen.com/galleries/singer/gloria-estefan/pictures/gloria-estefan-picture-4.jpg">
        </body>
        </html>

    Read the article

  • Excessive use of Inner Join for more than 3 tables

    - by Archangel08
    Good Day, I have 4 tables in my DB (not the actual names but almost similar), which are the following: employee, education, employment_history, referrence. employee_id is the name of the foreign key from the employee table. Here's the example (not actual) data:

    **Employee**
    ID  Name        Birthday    Gender  Email
    1   John Smith  08-15-2014  Male    [email protected]
    2   Jane Doe    00-00-0000  Female  [email protected]
    3   John Doe    00-00-0000  Male    [email protected]

    **Education**
    Employee_ID  Primary          Secondary          Vocation
    1            Westside School  Westshore H.S      SouthernBay College
    2            Eastside School  Eastshore H.S      NorthernBay College
    3            Northern School  SouthernShore H.S  WesternBay College

    **Employment_History**
    Employee_ID  WorkOne          StartDate   Enddate
    1            StarBean Cafe    12-31-2012  01-01-2013
    2            Coffebucks Cafe  11-01-2012  11-02-2012
    3            Latte Cafe       01-02-2013  04-05-2013

    **Referrence**
    Employee_ID  ReferrenceOne     Address           Contact
    1            Abraham Lincoln   Lincoln Memorial  0000000000
    2            Frankie N. Stein  Thunder St.       0000000000
    3            Peter D. Pan      Neverland Ave.    0000000000

    NOTE: I've only included a few columns, though the rest are part of the query. And below is the code I've been working on for 3 consecutive days:

        $sql=mysql_query("SELECT emp.id,emp.name,emp.birthday,emp.pob,emp.gender,emp.civil,emp.email,emp.contact,emp.address,emp.paddress,emp.citizenship,
            educ.employee_id,educ.elementary,educ.egrad,educ.highschool,educ.hgrad,educ.vocational,educ.vgrad,
            ems.employee_id,ems.workOne,ems.estartDate,ems.eendDate,ems.workTwo,ems.wstartDate,ems.wendDate,ems.workThree,ems.hstartDate,ems.hendDate
            FROM employee AS emp
            INNER JOIN education AS educ ON educ.employee_id='emp.id'
            INNER JOIN employment_history AS ems ON ems.employee_id='emp.id'
            INNER JOIN referrence AS ref ON ref.employee_id='emp.id'
            WHERE emp.id='$id'");

    Is it okay to use INNER JOIN this way? Or should I modify my query to get the results that I wanted? I've also tried to use LEFT JOIN but it still doesn't return anything; I don't know where I went wrong. As far as I can tell, I've been using INNER JOIN in the correct manner (since it was placed before the WHERE clause), so I can't think of what could've possibly gone wrong. Do you guys have a suggestion? Thanks in advance.

    Read the article

  • CSS positioning div above another div when not in that order in the HTML

    - by devmode
    Given a template where the HTML cannot be modified because of other requirements, how is it possible to display a div above another div when they are not in that order in the HTML, and both divs contain data that could produce a varying height and width? HTML:

        <div id="wrapper">
            <div id="firstDiv">
                Content to be below in this situation
            </div>
            <div id="secondDiv">
                Content to be above in this situation
            </div>
        </div>
        Other elements

    Hopefully it is obvious that the desired result is:

        Content to be above in this situation
        Content to be below in this situation
        Other elements

    When the dimensions are fixed it is easy to position them where needed, but I need some ideas for when the content is variable. For the sake of this scenario, please just consider the width to be 100% on both.

    Edit: A CSS solution is the most ideal solution. Thank you for the JavaScript options mentioned. Without getting too wordy about what or why (or who)... I am specifically looking for a CSS-only solution (and it will probably have to be met with other solutions if that doesn't pan out). One more thing: there are other elements following this. A good suggestion was mentioned given the limited scenario I demonstrated, and it might be the best answer, but I am looking to also make sure elements following this aren't impacted.

    Read the article
