Will these replace fansubs?

With their translator that actually got less accurate over the years, no.

Why replace something that already died?

Judging by the way YouTube fucks up auto-captions on anime streams, not any time soon.

>twitter screencap
I can do literally the same thing and still need subs. Makes you think don't it?

Remember sage is a downvote button

No.

Machine translation struggles with context, subtlety, negation and innuendo. It should have no problem translating anime whatsoever.

10/10

Isn't that the guy who couldn't get through the tutorial in Cuphead?

When you play the devil's game, you got to pay the price.

Holy shit it actually works.

It's just going to replicate their existing translator that's already available online, except now it talks. It won't be accurate at all, and it won't replace the effort of actually going and learning a new language. It's not some miraculous gadget from Star Trek like an infallible universal translator.

Wow, a twitter thread.

I think you made a typo, because you surely did not mean what you just posted.

Yes, the dude has two decades of "professionalism" behind him; he will post news about just about anything to do with technology.

If huffington post made a news post titled "donald just called hillary a nigger" and you took a screenshot about it and posted on Sup Forums would that be a huff post thread?

>spends 30 minutes putting earbuds in
>"Elitist audiophiles are ruining listening"

Yes.

It would definitely be a shit thread.

I don't understand what you are talking about, sorry.
I have never used Twitter, FYI.

somebody post the webm

why is that working

I wonder if Google Translate will get banned by the UN once it causes a war because of a shit translation.

It definitely wouldn't belong on Sup Forums at least.

Will the translation software translate in the same voice and tone as the VAs, as well as keep up with the timing?

Neural networks, unconstrained, produce some pretty terrible results.

Dunno. Maybe google devs go to pol?

There's obviously going to be some Microsoft Sam shit, and there will be a few seconds of delay like any UN speech.

That "simultaneous translation" is definitively a stretch.

...

Anything to get rid of attention whores

...

I searched that and this was the most useful thing I found in a sea of Sup Forums screencaps on reddit.
www.somaliaonline.com/community/topic/do-we-really-say-ooga-booga-shooga/

Try detect language, it works just as well.

...

What happened to this? youtube.com/watch?v=g82tUyukLck

>tfw your language is so fucked up that there will never be automated subs

It's like Sup Forums has someone inside goolag... I wonder who it could be.

Is it the grammatical cases (sijamuodot) that screw up Google? Or just too few speakers, with every Finn speaking English on the internet?

We'll never know. Ooger booger mein neger.

Remember google before people learned to abuse SEO? Fuck, you could find anything as long as you had a vague description in your head. Now even the most basic searches turn up nothing but garbage.

Nothing is stronger than the shekel.
One good thing though: if you try to stream anime, the first two pages will be nothing but paid ads and "not available outside the US".

Mr. Polle would do better than this if he was really trying.

What did google mean by this?

...

Wait, this is the guy that sucks at Cuphead.

Not in quite some time.

I can't imagine that listening to a synthesized voice reading a mechanically inaccurate translation will be a hit.

Great, we have the physical tech. Now we just need a competent automatic translator. Glad they spent the time and money loading Google Translate onto a pair of earbuds. Google really is leading the industry in technology.

You want to listen to a robot recite everything said onscreen?

You know what they say.
>Great minds discuss ideas; average minds discuss events; small minds discuss people.

>buy a video game from somalia
>boot it up
>press the ooga booga gaga nigga button to continue

>press the play button button to continue

Cultural differences.
Wow.

> watching dubs
> wanting the entire dub to be played by just Siri / Stephen Hawking

That's like having your favorite English show dubbed with nothing but this shit

youtu.be/gmCgISZCJ78

>press any key

Google has somewhat good TTS (text-to-speech); the default (female) voices can pronounce things and have proper intonations without sounding like a robot. I don't use GAPPS, but I always download Google's TTS so I can use it with my offline maps that I use for navigation; shit fucking works wonders, and it's better than the standard Pico TTS that comes with Android.
It's still TTS, though. It's no replacement for a real person.
Not to mention that Google Translate is god-awful in the translation department; they should've worked on improving the quality of the translations, and they should've polished the OCR tech they acquired from the developers of Word Lens.
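If you want to poke at those voices without an Android phone, the gTTS Python package (my pick here, not anything Google bundles with the buds; it just hits the same online Translate voices) is a two-liner:

# pip install gTTS
from gtts import gTTS

# Hand it a string and a language code; it fetches the audio from
# Google's online TTS endpoint and writes an mp3 you can play back.
tts = gTTS(text="This is what your anime would sound like.", lang="en")
tts.save("sample.mp3")

Decent voice, but it's still a voice reading a string. It doesn't fix anything about the translation itself.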

remember when google had a filter to "search discussions" for finding boards? those were the times

That goes against the idea of assimilating something already useful and taking advantage of it.

I can't wait to hear my anime in glorious DUWANG

great minds complete the tutorial without any trouble

>Dean Takahashi
Sorry for this post.

>they should've worked on improving the quality of the translations
Not really going to be possible until we're on the doorstep of human-equivalent general AI.

Fair enough, but if they gave a fuck they could make it even better. But there is probably a limit to what they could allow you to have on a free application.

I guess there is a limit to what they'd give you on a free application*

What is human?

I remember when they first shilled the switch to neural networks as some super-correct translation of Japanese and I got a giggle out of it, since people who don't know the first thing about the language ate it all up saying translators are worthless now.

Then it got released in the local language, I waited a few weeks and then told them about it. "Wait. But it's as terrible as it's always been?! Practically unusable. What happened?"

Any decent anime translation involves so much making shit up that a dumb-as-bricks neural network which simply equates phrases is never going to produce anything but DUWANG-tier transcriptions.

Well Japanese is the Dark Souls of languages

This shit is gold

A huge bundle of neurons, some senses, and a fuckton of social programming.

>somali programmers

I hate using this word this way, but neural net technology was the meme of 2015. Everything started bundling it, and people lapped it all up.

>dumb as bricks neural network that simply equates phrases
The whole point of the neural network method is that it doesn't simply equate phrases like old methods did. It's an attempt to mimic the way humans translate, but it generally fails in cases where simple substitution is inadequate because it's lacking the contextual understanding and theory of mind that human translators have.

So a process we don't and probably can't understand, taken to the complexity of several billion units, in hierarchy.

They can achieve a lot of great things, but they're still a tool that needs to be used correctly or else it simply creates a mess (and hordes of idiots who view it as a black box that magically solves all problems).

It's been stated for decades and people still aren't able to get it. These things require strong social context. Anything that isn't researching ways to either algorithmise that or at least account for it is inevitably just fighting over scraps at some shitty 10% translation rate.

Shake it

It hooks up to Google, so it uses Google translate. Which is shit.

This tech will be pretty damn good in like 5 years, but right now Japanese -> English, or a few other languages like Finnish or Hungarian, is insanely bad.

Neural networks are the future. The problem with traditional translation software was that it relied heavily on preprogrammed syntax and vocabulary. But natural language has so many variations and exceptions that the approach is extremely difficult at best, and impossible once you include all of the slang and acronyms that are constantly being invented.
The new approach is instead to process massive amounts of text from a variety of sources to organically generate translations that can be kept up to date. Of course it's still really new, and shit like the ooga booga above happens because there are insufficient sources, resulting in Sup Forums shit becoming official translations. It's similar to the problem that caused North Korea's supreme leader to be translated as Mr Squidward.
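If you want to see for yourself what the buds would actually be relaying, the unofficial googletrans package (my pick; it just scrapes the same web endpoint as translate.google.com, so it breaks whenever Google shuffles things around) lets you feed the engine some Japanese and see what kind of corpus-derived guess comes back:

# pip install googletrans
from googletrans import Translator

translator = Translator()
# Same engine the earbuds lean on: whatever rendering the model has
# absorbed from its training text, slang and vandalism included.
result = translator.translate("おはようございます", src="ja", dest="en")
print(result.text)  # usually "Good morning"; no guarantees on anything longer

Run anything with actual context or wordplay through it and you'll see why no fansub group is sweating yet.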

The problems with current translation-focused neural networks run far deeper than having insufficient source material.

>The whole point of the neural network method is that it doesn't simply equate phrases like old methods did.

I read a really lengthy article on this thing and that's exactly what it does. The original approach painstakingly translated each word, broke down the sentence structure, and then tried to reassemble something through another dictionary using that structure and the translated words.
This thing simply drills phrases into the network to map them into some internal pseudo-language, which can then pick out the corresponding phrases in the other language.

The old one would (slight exaggeration) take "Ohayou!" and translate it as "Early!". The modern one takes it, uses the network to get some pseudo-language representation, and then returns "Good morning!" because that's the most frequently encountered translation in English.
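Toy version of the two approaches, with made-up dictionaries just to make the contrast concrete (none of this is what Google actually ships):

# Old-style: push each token through a bilingual dictionary and hope
# the word order sorts itself out afterwards.
WORD_DICT = {"ohayou": "early", "gozaimasu": "(polite)"}

def word_by_word(tokens):
    return " ".join(WORD_DICT.get(t, t) for t in tokens)

# New-style, very roughly: whole phrases map through a shared internal
# representation, and the output is whatever rendering shows up most
# often in the training data.
PHRASE_TABLE = {
    ("ohayou",): "Good morning!",
    ("ohayou", "gozaimasu"): "Good morning!",
}

def phrase_level(tokens):
    return PHRASE_TABLE.get(tuple(tokens), word_by_word(tokens))

print(word_by_word(["ohayou"]))   # "early" -- the literal nonsense
print(phrase_level(["ohayou"]))   # "Good morning!" -- the frequent rendering

Neither toy has any idea what's going on in the surrounding scene, which is the part anime actually needs.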

>twitter
Fuck off.

(((Google)))
I'm good.

post the goat .webm while you're at it

>tired
>see a google bike laying around
>ride it to work

Google bikes are fun to ride.

how about context and not transliteration?

Not for a few years. You'll be fine with languages in your native language's own family.
As a Germanic speaker you should still get a good translation for Romance languages, but for anything other than that? Fuck no.

Remember when searching images actually worked?

Good thing I don't need this
t. raw-fag

...

I thought it was just me, reverse image search seems to not work at all anymore.

yawn

Give me some google glasses with built in subbing and we'll talk.

Pretty stupid saying considering it's discussing people itself.

>suggest an edit
>screencap the edit
Kurapika was a she too until the google translator bot took it down

My favorite is when I'm trying to search for something that happens to be the title of a song or part of some famous line from song lyrics and there's absolutely no way to filter out the 30+ pages of song results and get to what I want to see.

I love discussing people. Am I stupid?

>replace fansubs
A little late for that.

...

Translation to and from their internal pseudo-language doesn't operate on simple substitution, though. That's the advantage of the neural network—it can work with sentence and passage context and derive its own patterns from those when fed enough samples. You'll only see simple substitution in very basic cases like that one where there's a near-universal translation.

>generic shitty artificial voice replacing text + original voices
>for 159*infinity the price of normal subs
nope

>uploading your voice samples to NSA/Google for a fee of 160 bones

dumb goyim will purchase this

Where do you think you are, OP?

...