Unity uses its own AOT compiler, IL2CPP, which translates C# code to C++. Mono is still an option in builds, but most projects use IL2CPP nowadays and have been moving to .NET Core. Although C++ is king, C# has quickly caught up to be a close second. Unity is used to make 50% of all mobile, PC, and console games (according to Unity: https://unity.com/our-company).
The comparison to a WW2 tank in the Black Forest isn't great though, because the environments live on different timescales.
A forest can regrow and thrive after a few decades. The same is _not_ true for the deep sea[1].
'Life on the ocean floor moves at a glacial pace. Sediment accumulates at a rate of 1 millimeter every millennium. With such a slow rate of growth, areas disturbed by deep-sea mining would be unlikely to recover on a reasonable timescale.'
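To put that quoted rate in perspective, here's a quick back-of-the-envelope calculation. The disturbance depth is an illustrative assumption, not a figure from the article:

```python
# Back-of-the-envelope: how long for sediment to rebuild after a disturbance,
# at the quoted accumulation rate of 1 mm per millennium.
RATE_MM_PER_YEAR = 1 / 1000  # 1 mm every 1000 years

def recovery_years(disturbed_depth_mm: float) -> float:
    """Years for sediment accumulation to replace a disturbed layer."""
    return disturbed_depth_mm / RATE_MM_PER_YEAR

print(recovery_years(50))  # a 5 cm scrape: 50,000 years
```

Even a few centimeters of disturbed seafloor takes tens of thousands of years to rebuild at that rate, which is what "unlikely to recover on a reasonable timescale" cashes out to.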
Are they mining in sensitive areas? Isn't there an equivalent of underwater deserts with little life/biodiversity? Not to diminish this idea, but the ocean covers >70% of the earth, so I would think this could be done somewhere with relatively little impact, but maybe I am wrong.
This area with nothing in it sounds like the oil spill “outside the environment” in the very excellent comedy skit “The Front Fell Off”.
https://youtu.be/3m5qxZm_JqM
Edit: Linked to official channel. Thank you for the correction.
Not really. First, we already mine on land, which covers only 30% of the earth; I would think that has a much bigger impact. Second, an oil spill spreads far and wide, contaminating a vast area. All I am saying is that mining can in theory be contained to a much smaller area.
And no fish nurseries in deep waters means no mesopelagic fish. We could destroy it in a week, and the negative consequences for human food supplies would be basically permanent. It just isn't worth it.
+1 On this, and I'd want to encourage them to provide this feedback throughout the internship as well, not just at the end. Sure, sometimes they'll have suggestions that we've considered and decided not to do, but if it's a common question we should document.
For helping the interns, I'd talk to them about what they learned, what their next career steps are, and what subjects they're most interested in. Ideally, you already have a mentorship program set up where this has been discussed, so this may end up being very brief. But even without a mentorship program, having a good discussion about how you can help the interns, beyond just adding an extra line of experience to their resumes, is healthy for both sides.
For improving the program for future interns, I'd also ask them what they wish they had known when they first started their internship. For all of the above, make sure that it's a safe environment for honest discussion - for instance, state outright that they're not going to burn bridges by pointing out process flaws or talking about working at other companies in the future.
Yeah, I think this is what I'm looking to focus on. We're a small startup, so interns are a new thing. I'm going to focus a lot on how it felt settling in and getting up to speed. What can we do better to get people comfortable faster? What were their biggest pain points over the summer?
It's just interesting framing these to someone who may not have much experience to compare it to.
I had a blast with TIS-100 and Shenzhen I/O when they came out, but for some reason I never picked up Infinifactory until yesterday. I'm so glad I did.
Zachtronics games give me that feeling of zen that is increasingly hard to find in games nowadays - the "Oh dang, it's 2am already? I thought it was like 9pm!"
I'd recommend checking out the aforementioned titles, as well as some of the 'other games' on the website[1], like Ruckingenur-II.
Don't forget SpaceChem! It's less-obviously-programming compared to some of the others but it has very similarly-compelling puzzles. It might be my favorite from Zachtronics.
That's the difference between a phonemic and phonetic transcription [0]. The former is bracketed with slashes (//), and the latter with square brackets ([]). To use Wikipedia's example:
Phonemic transcription of English "pot" is /pɔt/, and "spot" is /spɔt/. However, a phonetic transcription is closer to [pʰɔt̚] and [spɔt̚]. Note the addition of aspiration on the initial /p/ in "pot", and marking the /t/ as unreleased in both. But you don't need to differentiate stop aspiration in English, because they are conditioned allophones. And all final stops in English are unreleased, so you don't need to mark those phonemically either.
Basically, your first sentence is entirely correct, but your second sentence is a bit off. IPA serves its purpose as a standard, but you must keep in mind that most transcriptions are not phonetic but phonemic. And a phonemic reading is not meant to be independent of the language -- there are plenty of languages for which [p] and [pʰ] are not allophones and must be distinguished even in a phonemic reading. You need to be familiar with the phonemic conventions of a language in order to actually render a phonetic reading.
And even then, a phonetic reading will be dependent on things that boil down to individual speaker differences. That is, there is not any standardized, singular mapping of a phonemic transcription to a phonetic one, in either direction.
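The conditioned-allophone point can be sketched as code. This is a toy illustration, not a real phonology: it applies just the two English rules mentioned above (word-initial voiceless stops are aspirated; final stops are unreleased) to derive a rough phonetic form from a phonemic one:

```python
# Toy sketch: derive a rough phonetic form from a phonemic one by
# applying two conditioned allophone rules of English:
#   1. a word-initial voiceless stop /p t k/ is aspirated
#   2. a word-final stop is unreleased
# Real allophony also depends on stress, clusters, and dialect.

VOICELESS_STOPS = {"p", "t", "k"}
STOPS = VOICELESS_STOPS | {"b", "d", "g"}
ASPIRATED = "\u02b0"    # combining ʰ (aspiration)
UNRELEASED = "\u031a"   # combining  ̚ (no audible release)

def phonetic(phonemic: str) -> str:
    out = list(phonemic)
    # Rule 1: aspirate a word-initial voiceless stop
    # (an /sC/ cluster like "spot" starts with /s/, so it's untouched)
    if out and out[0] in VOICELESS_STOPS:
        out[0] += ASPIRATED
    # Rule 2: mark a word-final stop as unreleased
    if out and out[-1] in STOPS:
        out[-1] += UNRELEASED
    return "".join(out)

print(phonetic("pɔt"))   # pʰɔt̚
print(phonetic("spɔt"))  # spɔt̚
```

Because the rules are fully conditioned by position, the phonemic transcription never needs to mark them - which is exactly why /pɔt/ and /spɔt/ suffice for English.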
> Phonemic transcription of English "pot" is /pɔt/, and "spot" is /spɔt/.
Your use of this particular IPA indicates that you speak a dialect that has undergone the cot-caught merger, which probably means your accent is from Scotland, the western half of the US, or the Boston area. If you didn't, then you'd transcribe those words as /pɑt/ and /spɑt/ if you spoke a dialect with the father-bother merger (most American accents), or /pɒt/ and /spɒt/ if you spoke a dialect without it (many English accents, including RP).
I'm leaning towards you being Scottish, since Scottish English tends to merge them to [ɔ], while American dialects with the merger tend to merge them to [ɑ].
This goes to show that IPA transcriptions are heavily dependent on dialect.
I did transcribe it wrong; it's phonetically [ɒ] for me -- cot-caught is in effect.
Though this does give an opportunity to point out that even phonemic transcriptions can have dialectal differences. There is no one phonemic transcription for English, as the different vowel mergers mean that minimal pairs differ between groups. This kind of variation tends to happen much more with vowels than consonants, which makes vowels much harder to transcribe even phonemically. But, for example, here's a list of mergers for non-rhotic dialects:
There is also a liiiiitle difference between RP's and SA's [i] and [ɪ] (IPA sounds are "areas", not "points"), as well as a difference in tenseness (British is generally more tense).
So yeah, IPA doesn't completely specify how things sound.
I work with IPA-annotated content and I have linguists on my team. I understand your point (or at least I think I do), but I don't entirely concur. IPA tries to be very accurate and thus covers all languages (even the click consonants used by some endangered African tribes). When you have this kind of precision employed in the context of a given natural language, you get the problem of having multiple phonetic transcriptions for the same word or syllable, all valid because, from the perspective of a speaker of that language, they all sound the same. Yet only one option is closest to the pronunciation formally supported by a given authority. All you can do is pick that one when you have to produce a transcription, and tolerate all the rest when you have to accept one. I can hardly imagine something better than what IPA already offers without losing its neutral nature, so I have yet to find valid critiques of IPA.
Although I don't see eye tracking in games as 'the next big thing' in any sense (the few I've tried were severely lacking in the fun department), I would be interested to try playing Soma with this.
Soma had some flaws to be sure, but damn did it do some things really well. I'd highly recommend it - eye tracking or no.
One great application I can think of is depth of field adjustment. Some games have depth of field simulation, which makes far-away scenery blurry. Personally I always disable it because I think it belongs to the uncanny valley: it's nice in screenshots, but it doesn't work so well in-game because the focus doesn't change with where/what you're looking at. Eye tracking, by making this feature dynamic, would, I think, enhance 3D rendering relatively cheaply.
Good use of eye tracking could also allow for foveated rendering, drawing the out-of-focus areas with lower resolution and leaving more time to render the areas of interest.
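The core of foveated rendering is just a falloff function from the gaze point. Here's a minimal sketch of the idea; the tile model, thresholds, and scale values are arbitrary illustration choices, not from any real engine:

```python
import math

# Toy sketch of foveated rendering: choose a per-tile resolution scale
# based on distance from the tracked gaze point.

def render_scale(tile_center, gaze, inner=0.1, outer=0.3):
    """Return a resolution multiplier for a screen tile.

    tile_center, gaze: (x, y) in normalized [0, 1] screen coordinates.
    Tiles within `inner` of the gaze render at full resolution; beyond
    `outer` they drop to quarter resolution; in between, the scale
    falls off linearly.
    """
    dx = tile_center[0] - gaze[0]
    dy = tile_center[1] - gaze[1]
    d = math.hypot(dx, dy)
    if d <= inner:
        return 1.0
    if d >= outer:
        return 0.25
    t = (d - inner) / (outer - inner)   # 0..1 across the falloff band
    return 1.0 - t * 0.75

gaze = (0.5, 0.5)
print(render_scale((0.5, 0.52), gaze))  # near the fovea: full resolution
print(render_scale((0.9, 0.9), gaze))   # periphery: quarter resolution
```

The same distance-from-gaze function could drive a depth-of-field blur instead of a resolution scale, which is what makes the two ideas a natural pair.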
Yeah. I hope I will be able to pick one up for roughly the price they announced here. I could see this getting bumped up to $300+ from people buying just to resell!
The biggest player that uses it is Unity, but even they have their own special fork of Mono to get it to play nice.
Full Disclosure: I work for a Microsoft/Xbox Studio.